Results 1 to 10 of about 219,795
ProlificDreamer: High-Fidelity and Diverse Text-to-3D Generation with Variational Score Distillation [PDF]
Score distillation sampling (SDS) has shown great promise in text-to-3D generation by distilling pretrained large-scale text-to-image diffusion models, but suffers from over-saturation, over-smoothing, and low-diversity problems. In this work, we propose ...
Zhengyi Wang +6 more
semanticscholar +1 more source
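The snippet names SDS but stops mid-sentence; as background, here is a minimal PyTorch sketch of the SDS gradient the paper builds on. The interfaces are assumptions for illustration (unet, text_emb, and alphas_cumprod are placeholders, not ProlificDreamer's code):

```python
import torch

def sds_grad(unet, x, text_emb, alphas_cumprod):
    """One score-distillation step (illustrative sketch, not the paper's code).

    x: rendered views with requires_grad=True, shape (B, C, H, W).
    unet(noisy_x, t, text_emb) -> predicted noise (assumed interface).
    alphas_cumprod: 1-D tensor of cumulative noise-schedule products.
    """
    B = x.shape[0]
    t = torch.randint(20, 980, (B,), device=x.device)    # random mid-range timestep
    eps = torch.randn_like(x)                            # fresh Gaussian noise
    a_t = alphas_cumprod[t].view(B, 1, 1, 1)
    noisy_x = a_t.sqrt() * x + (1 - a_t).sqrt() * eps    # forward-diffuse the render
    with torch.no_grad():
        eps_pred = unet(noisy_x, t, text_emb)            # frozen teacher prediction
    w = 1 - a_t                                          # a common weighting choice
    # SDS backpropagates w * (eps_pred - eps) straight into the 3D parameters,
    # skipping the U-Net Jacobian; VSD replaces eps with a learned score of
    # the rendered distribution.
    return w * (eps_pred - eps)

# Usage (sketch): x.backward(gradient=sds_grad(unet, x, text_emb, alphas_cumprod))
```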
One-Step Diffusion with Distribution Matching Distillation [PDF]
Diffusion models generate high-quality images but require dozens of forward passes. We introduce Distribution Matching Distillation (DMD), a procedure to transform a diffusion model into a one-step image generator with minimal impact on image quality. We ...
Tianwei Yin +6 more
semanticscholar +1 more source
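A rough sketch of the distribution-matching idea behind DMD, assuming two noise-predicting denoisers (a frozen real_unet trained on real data, and a fake_unet continually fine-tuned on generator samples); the names and simplified weighting are assumptions, not the paper's implementation:

```python
import torch

def dmd_grad(real_unet, fake_unet, x_gen, alphas_cumprod):
    """Distribution-matching gradient on generator outputs (illustrative sketch).

    real_unet: frozen teacher diffusion model (score of the real data).
    fake_unet: denoiser fine-tuned on the generator's own samples (score of
               the fake distribution). Both are assumed to take (noisy_x, t)
               and predict noise.
    """
    B = x_gen.shape[0]
    t = torch.randint(20, 980, (B,), device=x_gen.device)
    eps = torch.randn_like(x_gen)
    a_t = alphas_cumprod[t].view(B, 1, 1, 1)
    x_t = a_t.sqrt() * x_gen + (1 - a_t).sqrt() * eps    # diffuse the fake sample
    with torch.no_grad():
        eps_real = real_unet(x_t, t)
        eps_fake = fake_unet(x_t, t)
    # Gradient of the KL between the fake and real distributions; descending
    # it pulls samples toward the real score and away from the fake one.
    return eps_real - eps_fake

# Usage (sketch): x_gen = generator(z); x_gen.backward(gradient=dmd_grad(...))
```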
Knowledge Distillation: A Survey [PDF]
In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of ...
Jianping Gou, B. Yu, S. Maybank, D. Tao
semanticscholar +1 more source
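The survey's starting point is the classic response-based KD loss (Hinton et al.); a self-contained PyTorch version, with conventional temperature and mixing values chosen purely for illustration:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic response-based knowledge distillation, one family the survey covers.

    Soft targets: KL divergence between temperature-softened teacher and
    student distributions, scaled by T^2 to keep gradients comparable
    across temperatures.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)  # ordinary supervised term
    return alpha * soft + (1 - alpha) * hard
```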
Effects of Different Base Wine Volumes on Volatile Compound Profile of Distilled Merlot Wine Analyzed by Headspace Solid-Phase Microextraction Combined with Gas Chromatography-Mass Spectrometry [PDF]
The effects of different base wine volumes (150, 200, and 250 L) during the distillation process on the volatile compounds of distilled wine made from Merlot grapes from the Jiaodong Peninsula were investigated in order to optimize the distillation ...
Guo Yayun, Shao Xuedong, Zhang Zhengwen, Li Bo, Xue Wei, Shi Hongmei
doaj +1 more source
Anomaly Detection via Reverse Distillation from One-Class Embedding [PDF]
Knowledge distillation (KD) achieves promising results on the challenging problem of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in the teacher-student (T-S) model provides essential evidence for AD.
Hanqiu Deng, Xingyu Li
semanticscholar +1 more source
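A sketch of how a teacher-student discrepancy becomes an anomaly score in reverse-distillation-style models: the multi-scale cosine-distance aggregation below follows the common recipe, but the function and argument names are illustrative, not the paper's code:

```python
import torch
import torch.nn.functional as F

def anomaly_map(teacher_feats, student_feats, out_size=(256, 256)):
    """Score anomalies from T-S feature discrepancy (illustrative sketch).

    teacher_feats / student_feats: lists of feature maps (B, C, H, W) from
    matching stages; in reverse distillation the student decodes them from a
    one-class embedding. Anomalies appear where the student fails to
    reproduce the teacher, i.e. where cosine similarity drops.
    """
    amap = 0.0
    for ft, fs in zip(teacher_feats, student_feats):
        d = 1.0 - F.cosine_similarity(ft, fs, dim=1, eps=1e-6)  # (B, H, W)
        amap = amap + F.interpolate(
            d.unsqueeze(1), size=out_size, mode="bilinear", align_corners=False
        )
    return amap / len(teacher_feats)  # higher = more anomalous
```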
Dataset Distillation by Matching Training Trajectories [PDF]
Dataset distillation is the task of synthesizing a small dataset such that a model trained on the synthetic set will match the test accuracy of the model trained on the full dataset.
George Cazenavette +4 more
semanticscholar +1 more source
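The trajectory-matching objective can be stated compactly; a hedged PyTorch sketch, assuming pre-recorded expert checkpoints and a functional model_fn(params, x) forward pass (both assumptions for illustration):

```python
import torch
import torch.nn.functional as F

def trajectory_matching_loss(model_fn, syn_x, syn_y, expert_start, expert_end,
                             n_steps=10, lr=0.01):
    """Trajectory-matching objective for dataset distillation (sketch).

    expert_start / expert_end: flattened expert checkpoints recorded while
    training on the full dataset. model_fn(params, x) -> logits is an
    assumed functional forward pass. syn_x, syn_y: learnable synthetic
    images and their labels.
    """
    params = expert_start.clone().requires_grad_(True)
    for _ in range(n_steps):                       # inner SGD on synthetic data
        logits = model_fn(params, syn_x)
        loss = F.cross_entropy(logits, syn_y)
        (grad,) = torch.autograd.grad(loss, params, create_graph=True)
        params = params - lr * grad                # differentiable update
    # Match where the student lands against where the expert went,
    # normalized by how far the expert moved; backprop reaches syn_x.
    num = (params - expert_end).pow(2).sum()
    den = (expert_start - expert_end).pow(2).sum()
    return num / den
```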
Decoupled Knowledge Distillation [PDF]
State-of-the-art distillation methods are mainly based on distilling deep features from intermediate layers, while the significance of logit distillation is greatly overlooked. To provide a novel viewpoint to study logit distillation, we re-formulate the ...
Borui Zhao +4 more
semanticscholar +1 more source
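The decoupling the abstract alludes to splits classic KD into a target-class term (TCKD) and a non-target-class term (NCKD) with independent weights; a sketch consistent with that reformulation, where alpha, beta, and T are illustrative defaults, not the paper's tuned settings:

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled knowledge distillation (sketch of the reformulation)."""
    mask = F.one_hot(target, student_logits.size(1)).bool()

    # TCKD: binary KL between (target vs. rest) probabilities.
    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    pt_s = p_s[mask].clamp_min(1e-8)   # student probability of the true class
    pt_t = p_t[mask].clamp_min(1e-8)
    tckd = (pt_t * (pt_t / pt_s).log()
            + (1 - pt_t) * ((1 - pt_t).clamp_min(1e-8)
                            / (1 - pt_s).clamp_min(1e-8)).log()).mean()

    # NCKD: KL over the non-target classes, renormalized without the target
    # (masking the target logit with a very large negative value).
    neg = torch.finfo(student_logits.dtype).min
    log_q_s = F.log_softmax((student_logits / T).masked_fill(mask, neg), dim=1)
    q_t = F.softmax((teacher_logits / T).masked_fill(mask, neg), dim=1)
    nckd = F.kl_div(log_q_s, q_t, reduction="batchmean")

    # Weighting the two terms independently is the paper's key change.
    return (alpha * tckd + beta * nckd) * (T * T)
```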
Biodiesel is an attractive alternative to fossil fuels due to energy and environmental concerns. In this paper, seven different multi-SO3H functionalized ILs based on low-cost, less-substituted amines, which contained massive sites for ...
Meichen Li +5 more
doaj +1 more source
Since membrane separations, such as vapor permeation (VP) or pervaporation (PV), are not limited by vapor-liquid equilibrium, their coupling with distillation can exert a synergistic effect, overcome the thermodynamic limits of distillation, and improve ...
Shun Liu +4 more
doaj +1 more source
The development trend of Fischer–Tropsch (F–T) technology is toward high value-added products, and low-cost separation of linear α-olefins is an effective route to them.
Zongchao Liu +8 more
doaj +1 more source