Results 281 to 290 of about 219,795
Reinforcement learning for LLM-based explainable TCM prescription recommendation with implicit preferences from small language models.
Wang X +6 more
europepmc +1 more source
Simulation and Control of an Aromatic Distillation Column
Ali Valadkhani, Mohammad Shahrokhi
openalex +1 more source
The shortcut design of distillation intermediate heat exchangers
George Charles Perreault
openalex +1 more source
Some of the following articles may not be open access.
Adversarial Diffusion Distillation
European Conference on Computer Vision, 2023. We introduce Adversarial Diffusion Distillation (ADD), a novel training approach that efficiently samples large-scale foundational image diffusion models in just 1-4 steps while maintaining high image quality.
Axel Sauer +3 more
semanticscholar +1 more source
Revisiting Reverse Distillation for Anomaly Detection
Computer Vision and Pattern Recognition, 2023. Anomaly detection is an important application in large-scale industrial manufacturing. Recent methods for this task have demonstrated excellent accuracy but come with a latency trade-off.
Tran Dinh Tien +7 more
semanticscholar +1 more source
Zephyr: Direct Distillation of LM Alignment
arXiv.org, 2023. We aim to produce a smaller language model that is aligned to user intent. Previous research has shown that applying distilled supervised fine-tuning (dSFT) on larger models significantly improves task accuracy; however, these models are unaligned, i.e ...
Lewis Tunstall +13 more
semanticscholar +1 more source
Special Distillation Processes, 2022
Studies in the field of membrane distillation are analysed, and a critical analysis of the theoretical and experimental investigations of membrane distillation is presented.
Zhongwei Ding +3 more
semanticscholar +1 more source
Improved Distribution Matching Distillation for Fast Image Synthesis
Neural Information Processing Systems. Recent approaches have shown promise in distilling diffusion models into efficient one-step generators. Among them, Distribution Matching Distillation (DMD) produces one-step generators that match their teacher in distribution, without enforcing a one-to ...
Tianwei Yin +6 more
semanticscholar +1 more source