Results 101 to 110 of about 79,498
Fast Convex Pruning of Deep Neural Networks
We develop a fast, tractable technique called Net-Trim for simplifying a trained neural network. The method is a convex post-processing module, which prunes (sparsifies) a trained network layer by layer, while preserving the internal responses. We present a comprehensive analysis of Net-Trim from both the algorithmic and sample complexity standpoints ...
Alireza Aghasi +2 more
openaire +3 more sources
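The Net-Trim snippet above describes layer-by-layer pruning via a convex program that keeps each layer's responses close to the original. A minimal sketch of that idea, with a simplification: here each layer is sparsified by solving a plain Lasso fit to the layer's linear responses using ISTA (iterative soft-thresholding), whereas Net-Trim's actual formulation constrains the post-ReLU responses. The function names (`prune_layer`, `soft_threshold`) and all parameter values are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(W, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(W) * np.maximum(np.abs(W) - t, 0.0)

def prune_layer(X, W, lam=0.5, iters=500):
    """Sparsify one layer's weights so X @ W_hat stays close to the original
    responses Y = X @ W, via ISTA on 0.5*||X W_hat - Y||_F^2 + lam*||W_hat||_1.
    (Simplified stand-in for Net-Trim's ReLU-constrained convex program.)"""
    Y = X @ W
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    W_hat = W.copy()
    for _ in range(iters):
        grad = X.T @ (X @ W_hat - Y)       # gradient of the quadratic term
        W_hat = soft_threshold(W_hat - grad / L, lam / L)
    return W_hat

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))         # layer inputs (samples x features)
W = rng.standard_normal((20, 10))          # dense trained weights
W[rng.random(W.shape) < 0.7] = 0.0         # pretend the true layer is sparse
W_hat = prune_layer(X, W)
sparsity = np.mean(W_hat == 0)
err = np.linalg.norm(X @ W_hat - X @ W) / np.linalg.norm(X @ W)
```

Because the objective is convex, the solve is tractable per layer and the L1 penalty drives small weights exactly to zero while the data term bounds the drift in the layer's responses, the trade-off the abstract refers to.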
This perspective highlights how knowledge‐guided artificial intelligence can address key challenges in manufacturing inverse design, including high‐dimensional search spaces, limited data, and process constraints. It focuses on three complementary pillars: expert‐guided problem definition, physics‐informed machine learning, and large language model ...
Hugon Lee +3 more
wiley +1 more source
A Channel Pruning Algorithm Based on Depth-Wise Separable Convolution Unit
Deep learning has made significant progress in many fields, such as image identification, speech recognition, and natural language processing, and especially in computer vision.
Ke Zhang +3 more
doaj +1 more source
Automating AI Discovery for Biomedicine Through Knowledge Graphs and Large Language Models Agents
This work proposes a novel framework that automates biomedical discovery by integrating knowledge graphs with multiagent large language models. A biologically aligned graph exploration strategy identifies hidden pathways between biomedical entities, and specialized agents use this pathway to iteratively design AI predictors and wet‐lab validation ...
Naafey Aamer +3 more
wiley +1 more source
Explaining the Origin of Negative Poisson's Ratio in Amorphous Networks With Machine Learning
This review summarizes how machine learning (ML) breaks the “vicious cycle” in designing auxetic amorphous networks. By transitioning from traditional “black‐box” optimization to an interpretable “AI‐Physics” closed‐loop paradigm, ML is shown to not only discover highly optimized structures—such as all‐convex polygon networks—but also unveil hidden ...
Shengyu Lu, Xiangying Shen
wiley +1 more source
Investigation of Data Mining Using Pruned Artificial Neural Network Tree [PDF]
A major drawback associated with the use of artificial neural networks for data mining is their lack of explanation capability. While they can achieve a high predictive accuracy rate, the knowledge captured is not transparent and cannot be verified by ...
S.M.A. Kalaiarasi +3 more
doaj
Pruning Neural Networks with Supermasks
Vincent Rolfs +2 more
openaire +1 more source
scTIGER2.0 is a deep‐learning framework that infers gene regulatory networks from single‐cell RNA sequencing data. By integrating correlation, pseudotime ordering, deep learning and bootstrap‐based significance testing, it reduces false positives and reveals directional gene interactions.
Nishi Gupta +3 more
wiley +1 more source
Calibration‐Free Electromyography Motor Intent Decoding Using Large‐Scale Supervised Pretraining
Calibration‐free electromyography motor intent decoding is enabled through large‐scale supervised pretraining across heterogeneous datasets. A Spatially Aware Feature‐learning Transformer processes variable channel counts and electrode geometries, allowing transfer across users and recording setups. On a held‐out benchmark, fine‐tuned cross‐user models ...
Alexander E. Olsson +3 more
wiley +1 more source
Multi-Class Classification of Medical Data Based on Neural Network Pruning and Information-Entropy Measures. [PDF]
Sánchez-Gutiérrez ME +1 more
europepmc +1 more source