
A latent class model for discrete choice analysis: contrasts with mixed logit

Transportation Research Part B: Methodological, 2003
W. Greene, D. Hensher
semanticscholar

Multi-Level Logit Distillation

Computer Vision and Pattern Recognition, 2023
Knowledge Distillation (KD) aims to transfer knowledge from a large teacher model to a lightweight student model. Mainstream KD methods fall into two categories: logit distillation and feature distillation.
Ying Jin, Jiaqi Wang, Dahua Lin
semanticscholar
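The snippet above mentions logit distillation, i.e. matching the student's output logits to the teacher's. As a minimal illustration (not the multi-level method of the paper itself, and with all names and the temperature value chosen here for the example), the classic logit-KD loss is the KL divergence between temperature-softened teacher and student distributions, scaled by T²:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def logit_distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened outputs, scaled by T^2.

    This is the basic logit-matching objective; multi-level variants
    additionally align statistics at batch and class level.
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

The loss is zero when the student's logits already match the teacher's and grows as the softened distributions diverge.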
