Results 291 to 300 of about 219,795
Some of the following articles may not be open access.
SDXL-Lightning: Progressive Adversarial Diffusion Distillation
arXiv.org
We propose a diffusion distillation method that achieves new state-of-the-art in one-step/few-step 1024px text-to-image generation based on SDXL. Our method combines progressive and adversarial distillation to achieve a balance between quality and mode ...
Shanchuan Lin, Anran Wang, Xiao Yang
semanticscholar +1 more source
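As a rough sketch of the general adversarial-distillation recipe (not SDXL-Lightning's actual training setup; `teacher`, `student`, and `disc` are hypothetical stand-in modules), a one-step generator can be trained to match a frozen multi-step teacher while also fooling a discriminator:

```python
import torch
import torch.nn.functional as F

# Hypothetical modules: `teacher` is a frozen multi-step diffusion sampler,
# `student` a one-step generator, `disc` a discriminator. These are
# illustrative stand-ins, not the paper's architecture.
def distill_step(teacher, student, disc, opt_g, opt_d, noise, prompt_emb):
    with torch.no_grad():
        x_teacher = teacher.sample(noise, prompt_emb)  # expensive multi-step target
    x_student = student(noise, prompt_emb)             # single forward pass

    # Discriminator update: teacher samples as "real", student samples as "fake".
    d_loss = (F.softplus(-disc(x_teacher)).mean()
              + F.softplus(disc(x_student.detach())).mean())
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: reconstruct the teacher and fool the discriminator.
    g_loss = F.mse_loss(x_student, x_teacher) + F.softplus(-disc(x_student)).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```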
A Survey on Knowledge Distillation of Large Language Models
arXiv.orgIn the era of Large Language Models (LLMs), Knowledge Distillation (KD) emerges as a pivotal methodology for transferring advanced capabilities from leading proprietary LLMs, such as GPT-4, to their open-source counterparts like LLaMA and Mistral ...
Xiaohan Xu +8 more
semanticscholar +1 more source
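One recipe covered by surveys in this area is sequence-level distillation: the teacher generates responses and the student is fine-tuned on them. A minimal sketch, with small placeholder checkpoints standing in for a proprietary teacher and an open-source student:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder checkpoints; in practice the teacher is a much stronger model.
teacher_tok = AutoTokenizer.from_pretrained("gpt2-large")
teacher = AutoModelForCausalLM.from_pretrained("gpt2-large").eval()
student_tok = AutoTokenizer.from_pretrained("gpt2")
student = AutoModelForCausalLM.from_pretrained("gpt2")
opt = torch.optim.AdamW(student.parameters(), lr=1e-5)

prompts = ["Explain distillation in one sentence."]  # toy prompt set

for prompt in prompts:
    # 1. Teacher writes the target response.
    ids = teacher_tok(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        gen = teacher.generate(ids, max_new_tokens=64, do_sample=True)
    text = teacher_tok.decode(gen[0], skip_special_tokens=True)

    # 2. Student is trained on the (prompt, teacher response) pair.
    batch = student_tok(text, return_tensors="pt")
    loss = student(**batch, labels=batch.input_ids).loss
    opt.zero_grad(); loss.backward(); opt.step()
```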
Membrane Distillation and Osmotic Distillation
2010.
Curcio E, Di Profio G, Drioli E
openaire +4 more sources
Score identity Distillation: Exponentially Fast Distillation of Pretrained Diffusion Models for One-Step Generation
International Conference on Machine Learning
We introduce Score identity Distillation (SiD), an innovative data-free method that distills the generative capabilities of pretrained diffusion models into a single-step generator.
Mingyuan Zhou +4 more
semanticscholar +1 more source
Distillation Processes and Distillates
2017
Distillation is the art and science of separating alcohol and its accompanying volatiles from its fermented base, purifying, and, ultimately, refining the spirit fraction for its intended use. The art and science of distillation is based on a combination of skills and experience, and an understanding of the processes that occur within a distillation ...
Frank Vriesekoop, Dawid Ostrowski
openaire +1 more source
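The separation described here rests on the components' different volatilities. In textbook terms (a standard definition, not taken from this chapter), the relative volatility of ethanol over water drives the enrichment:

```latex
% Relative volatility of ethanol (A) over water (B):
\alpha_{AB} = \frac{y_A / x_A}{y_B / x_B}
% x: liquid-phase mole fraction, y: equilibrium vapor-phase mole fraction.
% Since alpha_AB > 1 below the ~95.6 wt% ethanol azeotrope, each
% vaporize-and-condense step enriches the distillate in ethanol.
```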
Experimental demonstration of logical magic state distillation
Nature
Realizing universal fault-tolerant quantum computation is a key goal in quantum information science [1–4]. By encoding quantum information into logical qubits using quantum error correcting codes, physical errors can be detected and corrected ...
Pedro Sales Rodriguez +72 more
semanticscholar +1 more source
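For scale, the canonical 15-to-1 protocol of Bravyi and Kitaev (a standard reference point, not a result claimed by this particular demonstration) consumes fifteen noisy magic states with error rate p and outputs one state whose error is suppressed cubically to leading order:

```latex
% 15-to-1 magic state distillation, leading-order output error:
p_{\text{out}} \approx 35\,p^{3}
% e.g. p = 10^{-2} gives p_out on the order of 3.5e-5; iterating the
% protocol for r rounds suppresses the error roughly as p^{3^r}.
```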
D4M: Dataset Distillation via Disentangled Diffusion Model
Computer Vision and Pattern Recognition
Dataset distillation offers a lightweight synthetic dataset for fast network training with promising test accuracy. To imitate the performance of the original dataset, most approaches employ bi-level optimization and the distillation space relies on the ...
Duo Su +4 more
semanticscholar +1 more source
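The bi-level optimization mentioned above is easiest to see in its gradient-matching variant (in the spirit of earlier dataset-condensation work, not D4M's diffusion-based approach); the model, shapes, and the single stand-in batch below are all toy placeholders:

```python
import torch
import torch.nn.functional as F

# Toy setup: compress a dataset into 10 learnable synthetic images, one per class.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
x_syn = torch.randn(10, 1, 28, 28, requires_grad=True)  # learnable images
y_syn = torch.arange(10)
opt_syn = torch.optim.SGD([x_syn], lr=0.1)
real_batches = [(torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,)))]

def param_grads(x, y, create_graph=False):
    # Gradients of the classification loss w.r.t. the model parameters.
    loss = F.cross_entropy(model(x), y)
    return torch.autograd.grad(loss, list(model.parameters()),
                               create_graph=create_graph)

for x_real, y_real in real_batches:
    g_real = [g.detach() for g in param_grads(x_real, y_real)]
    g_syn = param_grads(x_syn, y_syn, create_graph=True)  # keep graph to x_syn
    # Outer objective: synthetic-data gradients should match real-data gradients.
    match = sum(F.mse_loss(a, b) for a, b in zip(g_syn, g_real))
    opt_syn.zero_grad(); match.backward(); opt_syn.step()
```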
Reciprocal Teacher-Student Learning via Forward and Feedback Knowledge Distillation
IEEE Transactions on Multimedia
Knowledge distillation (KD) is a prevalent model compression technique in deep learning, aiming to leverage knowledge from a large teacher model to enhance the training of a smaller student model.
Jianping Gou +6 more
semanticscholar +1 more source
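The one-way baseline this line of work builds on is the classic softened-logit objective of Hinton et al. (the paper's reciprocal feedback pathway is not shown here):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge distillation: KL divergence between
    temperature-softened distributions, blended with ordinary
    cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage with random logits for a 10-class problem.
s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
kd_loss(s, t, y).backward()
```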
Journal of the American College of Radiology, 2020
Sharon W. Kwan, Christoph I. Lee
openaire +2 more sources

