Results 71 to 80 of about 141,062 (268)
Knowledge Distillation Meets Self-supervision [PDF]
To appear in ECCV 2020.
Guodong Xu +3 more
openaire +2 more sources
This study presents a new hole transporting material (HTM) mechanism for self‐assembled monolayers in near‐infrared organic photodetectors. The formation of zwitterions induces a strong electric field that significantly increases the work function of HTM‐coated indium tin oxide substrates. The devices exhibit low dark current and noise, along with high …
Jiyoung Shin +9 more
wiley +1 more source
Seismic interpretation is a crucial task in geophysics, requiring accurate prediction of subsurface layer thickness and seismic wave velocity. Traditional methods are computationally intensive and often hindered by noise in seismic data.
Amir Moslemi +4 more
doaj +1 more source
Named Entity Recognition Model Based on k-best Viterbi Decoupling Knowledge Distillation [PDF]
Knowledge distillation is a general approach to improve the performance of the named entity recognition (NER) models. However, the classical knowledge distillation loss functions are coupled, which leads to poor logit distillation.
ZHAO Honglei, TANG Huanling, ZHANG Yu, SUN Xueyuan, LU Mingyu
doaj +1 more source
Student-friendly knowledge distillation
In knowledge distillation, the knowledge from the teacher model is often too complex for the student model to thoroughly process. However, good teachers in real life always simplify complex material before teaching it to students. Inspired by this fact, we propose student-friendly knowledge distillation (SKD) to simplify teacher output into new ...
Mengyang Yuan, Bo Lang, Fengnan Quan
openaire +2 more sources
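For context on the logit-distillation loss the two entries above discuss, here is a minimal NumPy sketch of the classic temperature-scaled knowledge-distillation objective (KL divergence between teacher and student soft targets). This is an illustration of the standard technique, not code from any of the listed papers; the names `kd_loss` and `T` are our own.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic (coupled) logit distillation: T^2 * KL(teacher || student).

    A higher temperature T softens both distributions so the student
    also learns from the teacher's non-target ("dark knowledge") logits.
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student distribution
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (T ** 2) * kl.mean()
```

The loss is zero when student and teacher logits agree and positive otherwise; the "decoupling" proposed in the NER entry above refers to splitting this single coupled term into separate target and non-target components.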
Various materials are useful for gamma scintillation, radiation shielding, neutron‐gamma pulse shape discrimination (PSD), thermal neutron detection, or high refractive index applications. While certain materials exhibit optimal performance in individual applications, none achieves multiple functions simultaneously.
Isabelle Winardi +13 more
wiley +1 more source
Heterogeneous Knowledge Distillation Using Conceptual Learning
Recent advances in deep learning have led to the development of large, high-performing models that have been pretrained on massive datasets. However, employing these models in real-world services requires fast inference speed and low computational ...
Yerin Yu, Namgyu Kim
doaj +1 more source
In this study, we produced HfN‐based nanoparticles via femtosecond laser ablation in acetone. The nanoparticles exhibit a red‐shifted plasmonic resonance in the NIR‐I window, colloidal stability after coating with polyethyleneglycol, and excellent biocompatibility. The photothermal and X‐ray sensitization therapeutic effects were demonstrated for tumor …
Julia S. Babkova +15 more
wiley +1 more source
Distillating knowledge about SCOTCH
The design of the Scotch library for static mapping, graph partitioning, and sparse matrix ordering is highly modular, so as to allow users and potential contributors to tweak it and easily add new static mapping, graph bipartitioning, vertex separation, or graph ordering methods to match their particular needs.
openaire +4 more sources
ABSTRACT Breathable membranes that reject chemical warfare agents (CWAs) are required for next‐generation protective apparel. A dual‐function graphene oxide (GO)‐polyamine architecture is introduced that addresses the long‐standing tradeoff between vapor transmission and CWA selectivity.
Hyungjun Kim +6 more
wiley +1 more source