Results 11 to 20 of about 219,795 (335)
Knowledge Distillation from A Stronger Teacher [PDF]
Unlike existing knowledge distillation methods that focus on baseline settings, where the teacher models and training strategies are not as strong and competitive as state-of-the-art approaches, this paper presents a method dubbed DIST to distill better ...
Tao Huang +4 more
semanticscholar +1 more source
Ionic liquids (ILs) have shown excellent performance in the separation of binary azeotropes through extractive distillation [1]. However, the role of the ionic liquid in azeotropic systems is not well understood.
Hong Li +7 more
doaj +1 more source
Effective Whole-body Pose Estimation with Two-stages Distillation [PDF]
Whole-body pose estimation localizes the human body, hand, face, and foot keypoints in an image. This task is challenging due to multi-scale body parts, fine-grained localization for low-resolution regions, and data scarcity. Meanwhile, applying a highly ...
Zhendong Yang +3 more
semanticscholar +1 more source
Surfactant adsorption onto carbonate reservoirs would cause surfactant concentrations to decrease during surfactant flooding, which would reduce surfactant efficiency in practical applications of enhanced oil recovery (EOR) processes.
Jinjian Hou +8 more
doaj +1 more source
Machine learning-based ionic liquids design and process simulation for CO2 separation from flue gas
Rational design of ionic liquids (ILs), which is highly dependent on the accuracy of the model used, has always been crucial for CO2 separation from flue gas.
Kai Wang, Huijin Xu, Chen Yang, Ting Qiu
doaj +1 more source
Application of the Chemical-Looping Concept for Azeotrope Separation
The need for the separation of azeotropic mixtures for the production of high-end chemicals and resource recovery has spurred significant research into the development of new separation methods in the chemical industry.
Xin Gao, Xueli Geng
doaj +1 more source
Masked Generative Distillation [PDF]
Knowledge distillation has been applied successfully to various tasks. Current distillation algorithms usually improve students' performance by imitating the output of the teacher.
Zhendong Yang +5 more
semanticscholar +1 more source
Performance enhancement of a single slope solar still with single basin using Fresnel lens
This paper presents an experimental study using Fresnel lens to increase the overall efficiency of the conventional Single Slope Solar Still (SSSS) with single basin.
Parimal Sharad Bhambare +2 more
doaj +1 more source
Linking the representation levels to a physical separation and purification method in chemistry: Understanding of distillation experiment [PDF]
The present investigation aims to analyze the knowledge levels and chemical representation levels of pre-service primary school teachers (PPSTs) on the experiment of distillation.
Nalan Akkuzu Güven, Melis Arzu Uyulgan
doaj +1 more source
Distillation as a Defense to Adversarial Perturbations Against Deep Neural Networks [PDF]
Deep learning algorithms have been shown to perform extremely well on many classical machine learning problems. However, recent studies have shown that deep learning, like other machine learning techniques, is vulnerable to adversarial samples: inputs ...
Nicolas Papernot +4 more
semanticscholar +1 more source