9 Results
Type | Time | Title | Authors
Workshop | | Reverse Distillation: Training Billion Parameter Models For CTR Prediction | Aditya Anantharaman · Aashiq Muhamed · Hemant Pugaliya · Chong Wang · Sujan Perera · Zhen Ge · Qingjun Cui · Belinda Zeng · Trishul Chilimbi
Poster | Wed 17:00 | Understanding the Distillation Process from Deep Generative Models to Tractable Probabilistic Circuits | Xuejie Liu · Anji Liu · Guy Van den Broeck · Yitao Liang
Workshop | | One-Step Diffusion Distillation via Deep Equilibrium Models | Zhengyang Geng · Ashwini Pokle · Zico Kolter
Poster | Tue 17:00 | On the Impact of Knowledge Distillation for Model Interpretability | Hyeongrok Han · Siwon Kim · Hyun-Soo Choi · Sungroh Yoon
Workshop | | Towards Safe Self-Distillation of Internet-Scale Text-to-Image Diffusion Models | Sanghyun Kim · Seohyeon Jung · Balhae Kim · Moonseok Choi · Jinwoo Shin · Juho Lee
Poster | Wed 14:00 | Distilling Internet-Scale Vision-Language Models into Embodied Agents | Theodore R Sumers · Kenneth Marino · Arun Ahuja · Rob Fergus · Ishita Dasgupta
Poster | Wed 14:00 | Less is More: Task-aware Layer-wise Distillation for Language Model Compression | Chen Liang · Simiao Zuo · Qingru Zhang · Pengcheng He · Weizhu Chen · Tuo Zhao
Workshop | | Collaborative Score Distillation for Consistent Visual Synthesis | Subin Kim · Kyungmin Lee · June Suk Choi · Jongheon Jeong · Kihyuk Sohn · Jinwoo Shin
Affinity Workshop | Mon 19:15 | An Empirical Analysis Towards Replacing Vocabulary-Rigid Embeddings by a Vocabulary-Free Mechanism | Alejandro Rodriguez Perez · Korn Sooksatra · Pablo Rivas · Ernesto Quevedo Caballero · Javier Turek · Gisela Bichler · Tomas Cerny · Laurie Giddens · Stacie Petter