Results 1 to 10 of about 205,061

The Distillation of Alcohol. [PDF]

open access: green · Journal of Industrial & Engineering Chemistry, 1912
A. B. Adams
openalex   +4 more sources

Distillation of Peppermint [PDF]

open access: green · Scientific American, 1888
Albert M. Todd
openalex   +3 more sources

ProlificDreamer: High-Fidelity and Diverse Text-to-3D Generation with Variational Score Distillation [PDF]

open access: yes · Neural Information Processing Systems, 2023
Score distillation sampling (SDS) has shown great promise in text-to-3D generation by distilling pretrained large-scale text-to-image diffusion models, but suffers from over-saturation, over-smoothing, and low-diversity problems. In this work, we propose ...
Zhengyi Wang   +6 more
semanticscholar   +1 more source
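The SDS update that VSD generalizes is compact enough to sketch. Below is a minimal, hedged illustration of one SDS step on 3D scene parameters through a differentiable render; the denoiser is a toy stub, and the noise schedule, shapes, and names (`toy_denoiser`, `sds_step`) are assumptions for illustration, not the paper's implementation.

```python
import torch

def toy_denoiser(x_t, t):
    # hypothetical stand-in for a frozen pretrained text-to-image denoiser
    return 0.1 * x_t

def sds_step(theta, render, lr=1e-2):
    """One SDS update on 3D parameters `theta` via a differentiable render."""
    x = render(theta)                         # image rendered from the scene
    t = torch.randint(1, 1000, ())            # random diffusion timestep
    alpha_bar = 1.0 - t.float() / 1000.0      # toy noise schedule
    eps = torch.randn_like(x)
    x_t = alpha_bar.sqrt() * x + (1.0 - alpha_bar).sqrt() * eps
    with torch.no_grad():
        eps_hat = toy_denoiser(x_t, t)        # frozen teacher score estimate
    # SDS gradient w(t) * (eps_hat - eps), back-propagated through the render
    x.backward(gradient=(eps_hat - eps))      # w(t) = 1 for simplicity
    with torch.no_grad():
        theta -= lr * theta.grad
        theta.grad = None
    return theta

theta = torch.randn(64, requires_grad=True)
theta = sds_step(theta, render=lambda th: th.view(1, 1, 8, 8))
```

VSD's change, roughly, is to treat the 3D parameters as samples from a distribution and fine-tune an extra score model on the rendered views, but the update skeleton above is the shared starting point.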

One-Step Diffusion with Distribution Matching Distillation [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2023
Diffusion models generate high-quality images but require dozens of forward passes. We introduce Distribution Matching Distillation (DMD), a procedure to transform a diffusion model into a one-step image generator with minimal impact on image quality. We ...
Tianwei Yin   +6 more
semanticscholar   +1 more source
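As a rough illustration of the distribution-matching idea, the sketch below nudges a one-step generator with the difference between a frozen "real" score model and a "fake" score model meant to track the generator's own outputs. Both denoisers are stubs and every name is hypothetical; DMD also trains the fake denoiser online and adds a regression loss, both omitted here.

```python
import torch

def real_denoiser(x_t, t):   # frozen pretrained diffusion teacher (stub)
    return 0.1 * x_t

def fake_denoiser(x_t, t):   # denoiser tracking the generator's outputs (stub)
    return 0.2 * x_t

def dmd_generator_step(gen, z, lr=1e-3):
    x = gen(z)                                # one-step generation
    t = torch.randint(1, 1000, ())
    a = 1.0 - t.float() / 1000.0              # toy noise schedule
    eps = torch.randn_like(x)
    x_t = a.sqrt() * x + (1.0 - a).sqrt() * eps
    with torch.no_grad():
        # KL gradient direction: fake score minus real score
        grad = fake_denoiser(x_t, t) - real_denoiser(x_t, t)
    x.backward(gradient=grad)
    with torch.no_grad():
        for p in gen.parameters():
            p -= lr * p.grad
            p.grad = None

gen = torch.nn.Linear(16, 16)                 # toy one-step generator
dmd_generator_step(gen, torch.randn(4, 16))
```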

Effects of Different Base Wine Volumes on Volatile Compound Profile of Distilled Merlot Wine Analyzed by Headspace Solid-Phase Microextraction Combined with Gas Chromatography-Mass Spectrometry [PDF]

open access: yes · Shipin Kexue, 2023
The effects of different base wine volumes (150, 200, and 250 L) during the distillation process on the volatile compounds of distilled wine made from Merlot grapes from the Jiaodong Peninsula were investigated in order to optimize the distillation ...
Yayun Guo, Xuedong Shao, Zhengwen Zhang, Bo Li, Wei Xue, Hongmei Shi
doaj   +1 more source

Knowledge Distillation: A Survey [PDF]

open access: yes · International Journal of Computer Vision, 2020
In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of ...
Jianping Gou, Baosheng Yu, Stephen J. Maybank, Dacheng Tao
semanticscholar   +1 more source
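For readers skimming the survey entry above, the canonical logit-distillation objective (Hinton et al., 2015) fits in a few lines: a temperature-softened KL term between teacher and student logits plus the usual cross-entropy on labels. The temperature and weighting below are illustrative defaults, not prescriptions from the survey.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                        # T^2 keeps the gradient scale comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

loss = kd_loss(torch.randn(8, 10), torch.randn(8, 10), torch.randint(0, 10, (8,)))
```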

Anomaly Detection via Reverse Distillation from One-Class Embedding [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2022
Knowledge distillation (KD) achieves promising results on the challenging problem of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in the teacher-student (T-S) model provides essential evidence for AD.
Hanqiu Deng, Xingyu Li
semanticscholar   +1 more source
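A minimal sketch of the scoring rule such teacher-student detectors rely on: pixels where the student's features diverge from the teacher's receive a high anomaly score, typically via a 1 − cosine-similarity map. The feature tensors below are random stand-ins; in reverse distillation the student decodes features from a one-class embedding rather than re-encoding the image.

```python
import torch
import torch.nn.functional as F

def anomaly_map(teacher_feat, student_feat):
    """Per-pixel anomaly score from feature discrepancy, inputs (B, C, H, W)."""
    return 1.0 - F.cosine_similarity(teacher_feat, student_feat, dim=1)

t_feat = torch.randn(1, 64, 16, 16)   # frozen teacher features (stub)
s_feat = torch.randn(1, 64, 16, 16)   # student features (stub)
score = anomaly_map(t_feat, s_feat).amax()   # image-level score = max pixel
```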

Dataset Distillation by Matching Training Trajectories [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2022
Dataset distillation is the task of synthesizing a small dataset such that a model trained on the synthetic set will match the test accuracy of the model trained on the full dataset.
George Cazenavette   +4 more
semanticscholar   +1 more source
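The trajectory-matching objective is compact enough to sketch: run a few training steps on the synthetic set starting from a recorded expert checkpoint, then penalize the remaining distance to a later expert checkpoint, normalized by how far the expert itself travelled. The checkpoints and the synthetic-gradient function below are toy stand-ins, not the paper's setup.

```python
import torch

def trajectory_matching_loss(theta_start, theta_target, syn_grad_fn,
                             n_steps=5, lr=0.01):
    theta = theta_start.clone()
    for _ in range(n_steps):              # short student run on synthetic data
        theta = theta - lr * syn_grad_fn(theta)
    num = (theta - theta_target).pow(2).sum()
    den = (theta_start - theta_target).pow(2).sum()
    return num / den                      # normalized distance to the expert

# toy usage: the "synthetic set" is a vector the student is pulled toward;
# th - syn is the gradient of the training loss 0.5 * ||th - syn||^2
start, target = torch.randn(100), torch.randn(100)
syn = torch.randn(100, requires_grad=True)        # learnable synthetic data
loss = trajectory_matching_loss(start, target, lambda th: th - syn)
loss.backward()                                   # gradient flows into `syn`
```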

Effective Whole-body Pose Estimation with Two-stages Distillation [PDF]

open access: yes · IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2023
Whole-body pose estimation localizes the human body, hand, face, and foot keypoints in an image. This task is challenging due to multi-scale body parts, fine-grained localization for low-resolution regions, and data scarcity. Meanwhile, applying a highly ...
Zhendong Yang   +3 more
semanticscholar   +1 more source

Decoupled Knowledge Distillation [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2022
State-of-the-art distillation methods are mainly based on distilling deep features from intermediate layers, while the significance of logit distillation is greatly overlooked. To provide a novel viewpoint to study logit distillation, we re-formulate the ...
Borui Zhao   +4 more
semanticscholar   +1 more source
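A hedged sketch of the decoupling itself: classical KD factors into a binary KL on the target-vs-rest probability masses (TCKD) and a KL over the renormalized non-target classes (NCKD), which decoupled KD then re-weights independently. The alpha/beta values below are illustrative, not the paper's tuned settings.

```python
import torch
import torch.nn.functional as F

def dkd_loss(s_logits, t_logits, labels, T=4.0, alpha=1.0, beta=8.0):
    s = F.softmax(s_logits / T, dim=-1)
    t = F.softmax(t_logits / T, dim=-1)
    mask = F.one_hot(labels, s_logits.size(-1)).bool()
    # TCKD: binary KL on the target vs. non-target probability masses
    s2 = torch.stack([s[mask], 1.0 - s[mask]], dim=-1)
    t2 = torch.stack([t[mask], 1.0 - t[mask]], dim=-1)
    tckd = (t2 * (t2.log() - s2.log())).sum(-1).mean()
    # NCKD: KL over the non-target classes, each row renormalized to sum to one
    eps = 1e-8
    s_nt = s.masked_fill(mask, 0.0)
    t_nt = t.masked_fill(mask, 0.0)
    s_nt = s_nt / s_nt.sum(-1, keepdim=True)
    t_nt = t_nt / t_nt.sum(-1, keepdim=True)
    nckd = (t_nt * ((t_nt + eps).log() - (s_nt + eps).log())).sum(-1).mean()
    return (alpha * tckd + beta * nckd) * T * T

loss = dkd_loss(torch.randn(8, 10), torch.randn(8, 10), torch.randint(0, 10, (8,)))
```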
