Results 41 to 50 of about 142,324 (226)
Learning from Noisy Labels with Distillation
The ability to learn from noisy labels is very useful in many visual recognition tasks, as a vast amount of data with noisy labels is relatively easy to obtain. Traditionally, label noise has been treated as statistical outliers, and approaches
Cao, Liangliang +5 more
core +1 more source
This study uncovers the unexplored role of intermolecular interactions in multiphoton absorption in coordination polymers. By analyzing [Zn2tpda(DMA)2(DMF)0.3], it shows how the electronic coupling of the chromophores and confinement in the MOF enhance two‐ and three‐photon absorption.
Simon Nicolas Deger +11 more
wiley +1 more source
Unleashing the Power of Machine Learning in Nanomedicine Formulation Development
A random forest machine learning model is able to make predictions on nanoparticle attributes of different nanomedicines (i.e. lipid nanoparticles, liposomes, or PLGA nanoparticles) based on microfluidic formulation parameters. Machine learning models are based on a database of nanoparticle formulations, and models are able to generate unique solutions
Thomas L. Moore +7 more
wiley +1 more source
NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation
Most logit-based knowledge distillation methods transfer soft labels from the teacher model to the student model via Kullback–Leibler divergence based on softmax, an exponential normalization function. However, this exponential nature of softmax tends to
Chuan Li, Xiao Teng, Yan Ding, Long Lan
doaj +1 more source
Transformer-Based Knowledge Distillation with Ghost Attention for Multimodal Edge-Based Smart Surveillance [PDF]
In the modern era, knowledge distillation has gained attention as an important technique for edge-based smart surveillance that integrates accurate yet lightweight deployable models on resource-constrained devices.
Sataar Zahrah
doaj +1 more source
Chemoselective Sequential Polymerization: An Approach Toward Mixed Plastic Waste Recycling
Inspired by biological protein metabolism, this study demonstrates the closed‐loop recycling of mixed synthetic polymers via ring‐closing depolymerization followed by a chemoselective sequential polymerization process. The approach recovers pure polymers from mixed feedstocks, even in multilayer formats, highlighting a promising strategy to overcome a
Gadi Slor +5 more
wiley +1 more source
Knowledge distillation of face recognition via attention cosine similarity review
Deep learning‐based face recognition models have demonstrated remarkable performance in benchmark tests, and knowledge distillation technology has frequently been employed to obtain high‐precision real‐time face recognition models specifically designed
Zhuo Wang, SuWen Zhao, WanYi Guo
doaj +1 more source
Distilling Diverse Knowledge for Deep Ensemble Learning
Bidirectional knowledge distillation improves network performance by sharing knowledge between networks during the training of multiple networks. Additionally, performance is further improved by using an ensemble of multiple networks during inference ...
Naoki Okamoto +3 more
doaj +1 more source
Named Entity Recognition Model Based on k-best Viterbi Decoupling Knowledge Distillation [PDF]
Knowledge distillation is a general approach to improve the performance of the named entity recognition (NER) models. However, the classical knowledge distillation loss functions are coupled, which leads to poor logit distillation.
ZHAO Honglei, TANG Huanling, ZHANG Yu, SUN Xueyuan, LU Mingyu
doaj +1 more source
SI‐bioATRP in Mesoporous Silica for Size‐Exclusion Driven Local Polymer Placement
An enzyme‐catalyzed surface‐initiated atom transfer radical polymerization (SI‐bioATRP) of an anionic monomer within mesoporous silica particles, using hemoglobin as a catalyst, allows for controlling the location of the formed polymer via size‐exclusion effects between the nanopores and the biomacromolecules, thereby opening routes to functional ...
Oleksandr Wondra +8 more
wiley +1 more source