Results 231 to 240 of about 141,062 (268)
Knowledge Distillation with Geometry-Consistent Feature Alignment for Robust Low-Light Apple Detection. [PDF]
Shi Y, Ma Y, Geng L, Chu L, Li B, Li W.
europepmc +1 more source
What Can K–12 Education Teach College Professors?
The Bulletin of the Ecological Society of America, EarlyView.
Michael P. Marchetti
wiley +1 more source
From Research to Action: Communicating Science Effectively for Real‐World Impact
The Bulletin of the Ecological Society of America, EarlyView.
Luis Y. Santiago‐Rosario +6 more
wiley +1 more source
Some of the following articles may not be open access.
Frameless Graph Knowledge Distillation
IEEE Transactions on Neural Networks and Learning Systems
Knowledge distillation (KD) has shown great potential for transferring knowledge from a complex teacher model to a simple student model, in which the heavy learning task can be accomplished efficiently and without losing too much prediction accuracy. Recently, many attempts have been made by applying the KD mechanism to graph representation learning ...
Dai Shi +4 more
openaire +2 more sources
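The abstract above describes the standard KD setup: a student is trained to match a teacher's softened output distribution in addition to the ground-truth labels. A minimal sketch of that classic objective (the Hinton-style soft-target loss, not the specific method of the listed paper) in plain NumPy:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Weighted sum of (a) KL divergence between temperature-softened
    teacher and student distributions and (b) ordinary cross-entropy on
    the hard label. T and alpha are illustrative hyperparameters."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Soft-target term: KL(teacher || student) at temperature T.
    kl = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student))))
    # Hard-label term: cross-entropy at temperature 1.
    ce = float(-np.log(softmax(student_logits)[true_label]))
    # T**2 rescales soft-target gradients to balance the hard-label term.
    return alpha * (T ** 2) * kl + (1.0 - alpha) * ce
```

When the student already matches the teacher, the KL term vanishes, so with alpha=1.0 the loss is zero; otherwise both terms pull the student toward the teacher's soft predictions and the true label.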
Explainability-based knowledge distillation
Pattern Recognition, 2023
Tianli Sun +3 more
openaire +1 more source
Weighted Knowledge Based Knowledge Distillation
The Transactions of the Korean Institute of Electrical Engineers, 2022
Sungjae Kang, Kisung Seo
openaire +1 more source
Neighbor Self-Knowledge Distillation
Information Sciences, 2023
Peng Liang +3 more
openaire +1 more source
Integrative oncology: Addressing the global challenges of cancer prevention and treatment
CA: A Cancer Journal for Clinicians, 2022
Jun J. Mao, MSCE +2 more
exaly
Neurosymbolic Knowledge Distillation
2023 IEEE International Conference on Big Data (BigData), 2023
Himel Das Gupta, Victor S. Sheng
openaire +1 more source