Results 91 to 100 of about 14,848
Cascaded channel pruning using hierarchical self-distillation
In this paper, we propose an approach for filter-level pruning with hierarchical knowledge distillation based on the teacher, teaching-assistant, and student framework. Our method makes use of teaching assistants at intermediate pruning levels that share the same architecture and weights as the target student.
Miles, Roy, Mikolajczyk, Krystian
openaire +2 more sources
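Below is a minimal, hypothetical sketch of the cascaded teacher / teaching-assistant / student distillation described in this entry, assuming PyTorch. The `build_pruned_copy` helper, the loss weighting, and all hyperparameters are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch only (not the authors' released code): cascaded
# distillation through teaching assistants (TAs) that sit at intermediate
# pruning levels between the original teacher and the final student.
import copy
import torch
import torch.nn.functional as F
import torch.nn.utils.prune as prune

def build_pruned_copy(model, ratio):
    # Hypothetical helper: copy the model and mask `ratio` of the filters in
    # each conv layer (structured L2 pruning); a real pipeline would also
    # physically remove the pruned channels.
    pruned = copy.deepcopy(model)
    for m in pruned.modules():
        if isinstance(m, torch.nn.Conv2d):
            prune.ln_structured(m, name="weight", amount=ratio, n=2, dim=0)
    return pruned

def distill(teacher, student, loader, epochs=1, lr=1e-3, T=4.0):
    # Train `student` to match the softened logits of a frozen `teacher`.
    teacher.eval()
    opt = torch.optim.SGD(student.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            s_logits = student(x)
            kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                          F.softmax(t_logits / T, dim=1),
                          reduction="batchmean") * T * T
            loss = kd + F.cross_entropy(s_logits, y)
            opt.zero_grad(); loss.backward(); opt.step()
    return student

def cascaded_pruning(teacher, loader, prune_ratios=(0.25, 0.5, 0.75)):
    # Each stage's TA is distilled from the previous, less-pruned model and
    # then acts as the teacher for the next, more aggressively pruned stage.
    current = teacher
    for ratio in prune_ratios:
        ta = build_pruned_copy(current, ratio)
        current = distill(current, ta, loader)
    return current  # the final, most heavily pruned student
```

The key point of the cascade is that each teaching assistant shares the target student's architecture and inherits the weights of its less-pruned predecessor, so the knowledge gap bridged at each stage stays small.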
Pressure skin wounds are frequent complications after spinal cord injury (SCI), with impaired healing due to vascular and immune deficits. Elastin‐like polypeptides (ELP) fused to α‐MSH (MSH‐ELP) or MCP‐1 (MCP‐ELP) are developed and tested on these wounds. The resulting nanoparticles are non‐toxic and bioactive, and they enhance macrophage recruitment ...
Suneel Kumar +7 more
wiley +1 more source
Learning from All Sides: Diversified Positive Augmentation via Self-distillation in Recommendation
Personalized recommendation relies on user historical behaviors to provide user-interested items, and thus seriously struggles with the data sparsity issue.
Lin, Leyu +5 more
core
Self-Distillation Amplifies Regularization in Hilbert Space
Knowledge distillation, introduced in the deep learning context, is a method to transfer knowledge from one architecture to another. In particular, when the architectures are identical, this is called self-distillation. The idea is to feed the trained model's predictions back in as new target values for retraining (and iterate this loop, possibly a few times).
Mobahi, Hossein +2 more
openaire +2 more sources
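A minimal sketch of the iterated self-distillation loop described in this abstract, assuming PyTorch and a simple regression objective; `make_model` is a hypothetical constructor that returns a fresh copy of the fixed architecture.

```python
# Sketch of self-distillation: retrain the same architecture on the previous
# round's predictions instead of the original labels, and iterate.
import torch
import torch.nn.functional as F

def train(model, xs, targets, steps=500, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.mse_loss(model(xs), targets)
        opt.zero_grad(); loss.backward(); opt.step()
    return model

def self_distill(make_model, xs, ys, rounds=3):
    # Round 0 fits the true labels; each later round fits the previous
    # round's predictions (the "new target values" from the abstract).
    targets = ys
    model = None
    for _ in range(rounds):
        model = train(make_model(), xs, targets)
        with torch.no_grad():
            targets = model(xs)  # predictions become the next round's targets
    return model
```

Each round reuses the identical architecture, which is what distinguishes self-distillation from ordinary teacher-to-student distillation.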
Machine learning–guided engineering of a plectasin‐derived peptide yields DC05, a potent antimycobacterial candidate. Encapsulation into tuftsin‐functionalized mesoporous silica nanoparticles enhances intracellular delivery, stability, and activity against Mycobacterium tuberculosis while maintaining low cytotoxicity and minimal hemolysis. The combined ...
Christian S. Carnero Canales +12 more
wiley +1 more source
Organelle localization‐induced bioorthogonal polymerization enables direct synthesis of photostable poly‐AIEgens within targeted organelles for super‐resolution live‐cell imaging. Real‐time monitoring of dynamic biological processes demands fluorescent probes that can withstand prolonged light exposure without photobleaching, a critical ...
Gaeun Park +4 more
wiley +1 more source
Semantic Super-Resolution via Self-Distillation and Adversarial Learning
Semantic super-resolution (SR) is an approach that improves SR performance by leveraging semantic information about the scene. This study develops a novel semantic SR method based on the generative adversarial network (GAN) framework and self-distillation ...
Hanhoon Park
doaj +1 more source
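As an illustration only, here is one way the pixel, adversarial, and self-distillation terms hinted at in this entry could be combined in a single training step, assuming PyTorch. Using an exponential-moving-average copy of the generator as its own teacher is an assumption made for this sketch, not necessarily the scheme used in the paper.

```python
# Sketch of one GAN-based super-resolution training step with an added
# self-distillation term toward an EMA copy of the generator.
import torch
import torch.nn.functional as F

def sr_gan_step(gen, disc, ema_gen, lr_img, hr_img, g_opt, d_opt,
                lambda_adv=1e-3, lambda_sd=0.1, ema_decay=0.999):
    # --- discriminator update: real HR images vs. generated SR images ---
    sr = gen(lr_img)
    d_real = disc(hr_img)
    d_fake = disc(sr.detach())
    d_loss = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real)) +
              F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # --- generator update: pixel + adversarial + self-distillation terms ---
    with torch.no_grad():
        sr_teacher = ema_gen(lr_img)  # the generator's own EMA "teacher"
    d_fake = disc(sr)
    g_loss = (F.l1_loss(sr, hr_img) +
              lambda_adv * F.binary_cross_entropy_with_logits(
                  d_fake, torch.ones_like(d_fake)) +
              lambda_sd * F.l1_loss(sr, sr_teacher))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    # --- move the EMA teacher slowly toward the current generator ---
    with torch.no_grad():
        for p_t, p_s in zip(ema_gen.parameters(), gen.parameters()):
            p_t.mul_(ema_decay).add_(p_s, alpha=1 - ema_decay)
    return g_loss.item(), d_loss.item()
```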
Federated Learning on Heterogeneous Data via Adaptive Self-Distillation
Federated Learning (FL) is a machine learning paradigm that enables clients to jointly train a global model by aggregating the locally trained models without sharing any local training data. In practice, there can often be substantial heterogeneity (e.g., ...)
Chakraborty, Anirban +4 more
core
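A minimal sketch of federated averaging with a local self-distillation term toward the global model's predictions, assuming PyTorch; the fixed weight `lambda_sd` stands in for the adaptive weighting the paper proposes, whose exact form is not reproduced here.

```python
# Sketch of FedAvg where each client's local update is regularized toward the
# softened predictions of the received global model (self-distillation).
import copy
import torch
import torch.nn.functional as F

def client_update(global_model, loader, epochs=1, lr=1e-2, lambda_sd=0.5, T=2.0):
    local = copy.deepcopy(global_model)
    frozen_global = copy.deepcopy(global_model).eval()
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            logits = local(x)
            with torch.no_grad():
                g_logits = frozen_global(x)
            ce = F.cross_entropy(logits, y)
            kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                          F.softmax(g_logits / T, dim=1),
                          reduction="batchmean") * T * T
            loss = ce + lambda_sd * kd  # fixed weight stands in for the adaptive one
            opt.zero_grad(); loss.backward(); opt.step()
    return local.state_dict(), sum(len(x) for x, _ in loader)

def fedavg_round(global_model, client_loaders):
    # One communication round: clients train locally, the server averages
    # their weights in proportion to local dataset size.
    updates = [client_update(global_model, dl) for dl in client_loaders]
    total = sum(n for _, n in updates)
    avg = {k: sum(sd[k].float() * (n / total) for sd, n in updates)
           for k in updates[0][0]}
    global_model.load_state_dict(avg)
    return global_model
```

The distillation term keeps each client's local optimum from drifting too far from the shared global model, which is the failure mode heterogeneity causes in plain FedAvg.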
Implantable optoelectrical devices are an effective resource for the modulation and monitoring of neural activity with high spatiotemporal resolution. This review discusses current challenges faced by these devices and outlines future perspectives for the development of next‐generation neural interfaces targeting chronic, multisite, and multimodal ...
Stella Aslanoglou +4 more
wiley +1 more source
FedRAS: a dual-strategy framework for federated learning on heterogeneous devices
Federated learning enables multiple clients to collaboratively train a global model using their respective local datasets. While it offers advantages such as privacy preservation and efficient data utilization, its practical deployment is still ...
Mohan Xu, Lena Wiese
doaj +1 more source

