
Domain-incremental white blood cell classification with privacy-aware continual learning

Sci Rep (open access)
Kumari P   +6 more
europepmc   +1 more source

Adversarial Diffusion Distillation

European Conference on Computer Vision, 2023
We introduce Adversarial Diffusion Distillation (ADD), a novel training approach that efficiently samples large-scale foundational image diffusion models in just 1-4 steps while maintaining high image quality.
Axel Sauer   +3 more
semanticscholar   +1 more source

Zephyr: Direct Distillation of LM Alignment

arXiv.org, 2023
We aim to produce a smaller language model that is aligned to user intent. Previous research has shown that applying distilled supervised fine-tuning (dSFT) on larger models significantly improves task accuracy; however, these models are unaligned, i.e ...
Lewis Tunstall   +13 more
semanticscholar   +1 more source

Membrane distillation

Special Distillation Processes, 2022
Studies in the field of membrane distillation are reviewed, and a critical analysis of the theoretical and experimental investigations of the process is presented.
Zhongwei Ding   +3 more
semanticscholar   +1 more source

Revisiting Reverse Distillation for Anomaly Detection

Computer Vision and Pattern Recognition, 2023
Anomaly detection is an important application in large-scale industrial manufacturing. Recent methods for this task have demonstrated excellent accuracy but come with a latency trade-off.
Tran Dinh Tien   +7 more
semanticscholar   +1 more source

Text-to-3D with Classifier Score Distillation

International Conference on Learning Representations, 2023
Text-to-3D generation has made remarkable progress recently, particularly with methods based on Score Distillation Sampling (SDS) that leverage pre-trained 2D diffusion models.
Xin Yu   +5 more
semanticscholar   +1 more source

DREAM: Efficient Dataset Distillation by Representative Matching

IEEE International Conference on Computer Vision, 2023
Dataset distillation aims to synthesize small datasets with little information loss from original large-scale ones for reducing storage and training costs.
Yanqing Liu   +5 more
semanticscholar   +1 more source
