Results 121 to 130 of about 4,930,132

Self-Distillation Amplifies Regularization in Hilbert Space

open access: yes, 2020
Knowledge distillation, introduced in the deep learning context, is a method for transferring knowledge from one architecture to another. In particular, when the two architectures are identical, the procedure is called self-distillation. The idea is to feed the trained model's predictions back in as new target values for retraining (and possibly to iterate this loop a few times).
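A minimal sketch of that retrain-on-own-predictions loop, assuming a scikit-learn-style kernel ridge regressor as a stand-in for the paper's Hilbert-space setting; the names self_distill and n_rounds are illustrative, not from the paper.

import numpy as np
from sklearn.kernel_ridge import KernelRidge

def self_distill(X, y, n_rounds=3):
    """Repeatedly retrain an identical model on the previous round's predictions."""
    targets = y
    model = None
    for _ in range(n_rounds):
        model = KernelRidge(kernel="rbf", alpha=1.0)  # fresh, identical architecture each round
        model.fit(X, targets)
        targets = model.predict(X)  # predictions become the next round's target values
    return model

# Toy usage: noisy 1-D regression
X = np.linspace(0, 1, 50).reshape(-1, 1)
y = np.sin(6 * X).ravel() + 0.1 * np.random.randn(50)
student = self_distill(X, y, n_rounds=3)

Each round fits to progressively smoothed targets, which is the regularization-amplifying effect the paper analyzes.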
Mobahi, Hossein   +2 more
openaire   +2 more sources

Learning from All Sides: Diversified Positive Augmentation via Self-distillation in Recommendation

open access: yes, 2023
Personalized recommendation relies on user historical behaviors to provide user-interested items, and thus seriously struggles with the data sparsity issue.
Lin, Leyu   +5 more
core  

Investigation of Halogen Substitution Effects in π‐Conjugated Organic Ligands of Chiral Hybrid Perovskites on Their Chiroptical Activity

open access: yes, Advanced Functional Materials, EarlyView.
The role of novel thiophene‐based ligands with halogen substitutions in enhancing the chiroptical and optoelectronic properties of 2D chiral HOIPs has been investigated. By tailoring ligand design, enhanced CD and CPL properties are achieved, with improved CPL discrimination in photodetectors.
Boesung Kwon   +4 more
wiley   +1 more source

Timestamp-Guided Knowledge Distillation for Robust Sensor-Based Time-Series Forecasting

open access: yes, Sensors
Accurate time-series forecasting plays a vital role in sensor-driven applications such as energy monitoring, traffic flow prediction, and environmental sensing.
Jiahe Yan   +5 more
doaj   +1 more source

SDPose: Tokenized Pose Estimation via Circulation-Guide Self-Distillation [PDF]

open access: yes, Computer Vision and Pattern Recognition
Recently, transformer-based methods have achieved state-of-the-art prediction quality on human pose estimation (HPE). Nonetheless, most of these top-performing transformer-based models are too computationally expensive and storage-demanding to deploy on edge ...
Sicheng Chen   +9 more
semanticscholar   +1 more source

Federated Learning on Heterogeneous Data via Adaptive Self-Distillation

open access: yes, 2023
Federated Learning (FL) is a machine learning paradigm that enables clients to jointly train a global model by aggregating the locally trained models without sharing any local training data. In practice, there can often be substantial heterogeneity (e.g.,
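A minimal sketch of the aggregation step described above, assuming standard federated averaging (clients weighted by local dataset size); the function fedavg and the toy parameter vectors are illustrative and do not reflect this paper's adaptive self-distillation method.

import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate locally trained parameter vectors into a global model,
    weighting each client by its number of local training samples."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)       # shape: (n_clients, n_params)
    coeffs = np.array(client_sizes) / total  # proportional weights
    return coeffs @ stacked                  # weighted average; no raw data is shared

# Example: three clients with heterogeneous data sizes
global_w = fedavg(
    [np.array([1.0, 2.0]), np.array([2.0, 0.0]), np.array([0.0, 4.0])],
    client_sizes=[10, 30, 60],
)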
Chakraborty, Anirban   +4 more
core  

Coagulative Granular Hydrogels with an Enzyme Catalyzed Fibrin Network for Endogenous Tissue Regeneration

open access: yes, Advanced Healthcare Materials, EarlyView.
Coagulative granular hydrogels are composed of packed thrombin‐functionalized microgels that catalyze the conversion of fibrinogen into a secondary fibrin network, filling the interstitial voids. This bio‐inspired approach stabilizes the biomaterial to match the robustness of bulk hydrogels without compromising injectability, mimicking the initial ...
Zhipeng Deng   +16 more
wiley   +1 more source

Spherical Skin Model: Stratified Co‐Culture of Fibroblasts and Keratinocytes on Spherical Beads Toward Compound Screening

open access: yes, Advanced Healthcare Materials, EarlyView.
Models of the human skin must combine the relevant biological contents and suitable biomaterials with the correct spatial organization. Performing compound screening on such in vitro models also requires fast and reproducible methods for producing them.
Elisa Lenzi   +7 more
wiley   +1 more source

Noise robust distillation of self-supervised speech models via correlation metrics [PDF]

open access: green, 2023
Fabian Ritter-Gutierrez   +6 more
openalex   +1 more source
