A Steerable and Expandable Magnetic Aspiration Catheter for Enhanced Aspiration Thrombectomy
This article presents a novel magnetic aspiration catheter (MAC) that can be both actively steered and undergo distal‐end expansion via externally applied magnetic fields. The MAC facilitates precise branch selection and enhances clot engagement by improving endovascular navigation and aspiration efficiency. Phantom experiments validate its feasibility.
Hakjoon Lee et al.
Enhancing weed detection through knowledge distillation and attention mechanism.
El Alaoui A, Mousannif H.
Droplet‐based microfluidics enables precise, high‐throughput microscale reactions but continues to face challenges in scalability, reproducibility, and data complexity. This review examines how artificial intelligence enhances droplet generation, detection, sorting, and adaptive control and discusses emerging opportunities for clinical and industrial ...
Junyan Lai et al.
Efficient and Accurate Epilepsy Seizure Prediction and Detection Based on Multi-Teacher Knowledge Distillation RGF-Model.
Cao W, Li Q, Zhang A, Wang T.
EMG‐Driven Telemetry and Inference System for Fish: Pose Reconstruction and Flow Sensing
This work introduces an electromyography (EMG)‐driven telemetry framework that reconstructs body pose and infers hydrodynamic conditions in freely swimming fish. A custom 16‐channel archival system records intramuscular EMG, enabling deep‐learning models to decode joint kinematics, classify flow regimes, and reveal channel‐efficient sensing strategies.
Rahdar Hussain Afridi et al.
A hybrid neighborhood enhanced contrastive learning and self-knowledge distillation method for scRNA-seq data clustering analysis.
Qi L et al.
KDLM: Lightweight Brain Tumor Segmentation via Knowledge Distillation
A lightweight student network is designed around multiscale, multilevel feature fusion combined with a residual channel attention mechanism, achieving efficient feature extraction and fusion with very few parameters. A dual‐teacher collaborative knowledge distillation framework is proposed to train it.
Baotian Li et al.
Efficient knowledge distillation and alignment for improved KB-VQA.
Qin X, Pei R, He C, Li F, Zhang X.
Four decades of retinal vessel segmentation research (1982–2025) are synthesized, spanning classical image processing, machine learning, and deep learning paradigms. A meta‐analysis of 428 studies establishes a unified taxonomy and highlights performance trends, generalization capabilities, and clinical relevance.
Avinash Bansal et al.
Enhancing the Predictive Power of Macrocyclic Drug Permeability by Knowledge Distillation from Analogous Pretraining Data.
Zhang Y, Pentikäinen OT.