Results 301 to 310 of about 205,061 (335)
Some of the following articles may not be open access.
Multi-Level Logit Distillation
Computer Vision and Pattern Recognition, 2023
Knowledge Distillation (KD) aims at distilling knowledge from a large teacher model into a lightweight student model. Mainstream KD methods can be divided into two categories: logit distillation and feature distillation.
Ying Jin, Jiaqi Wang, Dahua Lin
semanticscholar +1 more source
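The abstract above distinguishes logit distillation from feature distillation. For context, a minimal sketch of the classic logit-distillation loss (the Hinton-style KL divergence between temperature-softened teacher and student outputs) is shown below — a generic illustration only, not this paper's multi-level method; the function names are hypothetical:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def logit_distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened logits,
    scaled by T^2 as in the standard KD formulation."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)
    return (T ** 2) * kl.mean()

# Identical logits give zero loss; mismatched logits give a positive loss.
t = np.array([[2.0, 1.0, 0.1]])
print(logit_distillation_loss(t, t))                      # → 0.0
print(logit_distillation_loss(np.zeros((1, 3)), t) > 0)   # → True
```

Feature distillation, by contrast, matches intermediate activations rather than final logits, typically with an L2 loss after a learned projection.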
Membrane Distillation and Osmotic Distillation
2010.
Curcio E, Di Profio G, Drioli E
openaire +4 more sources
One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation
Neural Information Processing Systems, 2023
Knowledge distillation (KD) has proven to be a highly effective approach for enhancing model performance through a teacher-student training scheme. However, most existing distillation methods are designed under the assumption that the teacher and student ...
Zhiwei Hao +6 more
semanticscholar +1 more source
International Conference on Learning Representations, 2023
Score Distillation Sampling (SDS) has emerged as the de facto approach for text-to-content generation in non-image domains. In this paper, we reexamine the SDS process and introduce a straightforward interpretation that demystifies the necessity for ...
Oren Katzir +3 more
semanticscholar +1 more source
Improved Distribution Matching Distillation for Fast Image Synthesis
Neural Information Processing Systems
Recent approaches have shown promise in distilling diffusion models into efficient one-step generators. Among them, Distribution Matching Distillation (DMD) produces one-step generators that match their teacher in distribution, without enforcing a one-to ...
Tianwei Yin +6 more
semanticscholar +1 more source
SDXL-Lightning: Progressive Adversarial Diffusion Distillation
arXiv.org
We propose a diffusion distillation method that achieves a new state of the art in one-step/few-step 1024px text-to-image generation based on SDXL. Our method combines progressive and adversarial distillation to achieve a balance between quality and mode ...
Shanchuan Lin, Anran Wang, Xiao Yang
semanticscholar +1 more source
A Survey on Knowledge Distillation of Large Language Models
arXiv.org
In the era of Large Language Models (LLMs), Knowledge Distillation (KD) emerges as a pivotal methodology for transferring advanced capabilities from leading proprietary LLMs, such as GPT-4, to their open-source counterparts like LLaMA and Mistral ...
Xiaohan Xu +8 more
semanticscholar +1 more source
Distillation Processes and Distillates
2017
Distillation is the art and science of separating alcohol and its accompanying volatiles from its fermented base, purifying, and, ultimately, refining the spirit fraction for its intended use. The art and science of distillation is based on a combination of skills and experience, and an understanding of the processes that occur within a distillation ...
Frank Vriesekoop, Dawid Ostrowski
openaire +2 more sources
International Conference on Machine Learning
We introduce Score identity Distillation (SiD), an innovative data-free method that distills the generative capabilities of pretrained diffusion models into a single-step generator.
Mingyuan Zhou +4 more
semanticscholar +1 more source
Multicomponent batch distillation with distillate receiver
Based on the assumption of adiabatic equilibrium stages, a rigorous calculation procedure applicable to a multicomponent batch distillation with a distillate receiver under total reflux condition was developed. Provided that the operating conditions including the desired product purity of the most volatile component in the receiver are specified, the ...
Jeung Kun Kim, Dong Pyo Ju
openaire +1 more source

