Results 61 to 70 of about 79,498 (272)

PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

open access: yes, 2018
This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be ...
Lazebnik, Svetlana, Mallya, Arun
core   +1 more source
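The PackNet abstract above describes freeing up parameters via pruning so a later task can claim them. A minimal sketch of that idea, assuming a simple per-tensor magnitude criterion and a boolean ownership mask (the function name, API, and fraction are illustrative, not the authors' code):

```python
import numpy as np

def packnet_free_parameters(weights, prune_frac=0.5, frozen_mask=None):
    """PackNet-style step (sketch): among weights not yet claimed by
    earlier tasks, zero out the lowest-magnitude fraction so those
    slots are freed for the next task; survivors are frozen."""
    if frozen_mask is None:
        frozen_mask = np.zeros(weights.shape, dtype=bool)
    available = ~frozen_mask                       # weights this task may still use
    mags = np.abs(weights[available])
    threshold = np.quantile(mags, prune_frac)      # magnitude cut point
    keep = available & (np.abs(weights) > threshold)
    freed = available & ~keep                      # released for future tasks
    pruned = np.where(keep | frozen_mask, weights, 0.0)
    return pruned, frozen_mask | keep, freed
```

In the full method the kept weights would then be briefly retrained to recover accuracy before the next task trains only in the freed slots.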

To Prune or not to Prune: A Chaos-Causality Approach to Principled Pruning of Dense Neural Networks

open access: yes, CoRR, 2023
Reducing the size of a neural network (pruning) by removing weights without impacting its performance is an important problem for resource-constrained devices. In the past, pruning was typically accomplished by ranking or penalizing weights based on criteria like magnitude and removing low-ranked weights before retraining the remaining ones.
Rajan Sahu   +4 more
openaire   +2 more sources
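The abstract above summarizes the classic baseline: rank weights by a criterion such as magnitude, remove the low-ranked ones, then retrain the remainder. A minimal sketch of that ranking step, under the assumption of a plain unstructured magnitude criterion (retraining omitted; ties at the cutoff may prune slightly more than the target):

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.8):
    """Magnitude-based pruning (sketch): rank weights by |w|,
    zero the lowest-ranked fraction, return the pruned tensor
    and a survivor mask for the retraining phase."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                  # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    cutoff = np.partition(flat, k - 1)[k - 1]      # k-th smallest magnitude
    mask = np.abs(weights) > cutoff                # survivors to retrain
    return weights * mask, mask
```

The cited paper argues for replacing this magnitude criterion with a chaos-causality one; the sketch only illustrates the conventional baseline it critiques.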

Leaftronics: Bio‐Fractal Scaffolds From Leaf Venation for Low‐Waste Electronics

open access: yes, Advanced Materials, EarlyView
“Leaftronics” transforms naturally evolved leaf venation into quasi‐fractal scaffolds for sustainable electronics. Polymer‐infiltrated leaf skeletons can be used to fabricate ultra‐smooth, reflow‐ and thin‐film‐compatible decomposable substrates, while rendering the same lignocellulose networks conductive yields flexible transparent electrodes.
Rakesh Rajendran Nair   +3 more
wiley   +1 more source

Roulette: A Pruning Framework to Train a Sparse Neural Network From Scratch

open access: yes, IEEE Access, 2021
Due to space and inference time restrictions, finding an efficient and sparse sub-network from a dense and over-parameterized network is critical for deploying neural networks on edge devices.
Qiaoling Zhong   +3 more
doaj   +1 more source

RadiX-Net: Structured Sparse Matrices for Deep Neural Networks

open access: yes, 2019
The sizes of deep neural networks (DNNs) are rapidly outgrowing the capacity of hardware to store and train them. Research over the past few decades has explored the prospect of sparsifying DNNs before, during, and after training by pruning edges from ...
Kepner, Jeremy, Robinett, Ryan A.
core   +1 more source

Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm

open access: yes, Advances in Neural Information Processing Systems 35, 2022
NeurIPS ...
Aidan Good   +7 more
openaire   +3 more sources

Vision‐Augmented Wearable Interfaces: Bioinspired Approaches for Realistic AI‐Human‐Machine Interaction

open access: yes, Advanced Materials Technologies, EarlyView
This review presents recent progress in vision‐augmented wearable interfaces that combine artificial vision, soft wearable sensors, and exoskeletal robots. Inspired by biological visual systems, these technologies enable multimodal perception and intelligent human–machine interaction.
Jihun Lee   +4 more
wiley   +1 more source

Structure Optimization in Deep Neural Networks with Synaptic Pruning Based on Connection Appraisal [PDF]

open access: yes, Computer and Knowledge Engineering
Deep neural networks typically require predefined architectures, which can lead to overfitting, underfitting, high computational costs, and storage overhead.
Aghil Ahmadi, Reza Mahboobi Esfanjani
doaj   +1 more source

Complexity of Deep Convolutional Neural Networks in Mobile Computing

open access: yes, Complexity, 2020
Neural networks employ massive interconnections of simple computing units, called neurons, to solve problems that are highly nonlinear and cannot be hard-coded into a program.
Saad Naeem   +3 more
doaj   +1 more source

Improving Deep Echo State Network with Neuronal Similarity-Based Iterative Pruning Merging Algorithm

open access: yes, Applied Sciences, 2023
Recently, a layer-stacked ESN model named deep echo state network (DeepESN) has been established. As a model combining recurrent neural networks and deep neural networks, DeepESN is of significant importance to investigate in both areas ...
Qingyu Shen, Hanwen Zhang, Yao Mao
doaj   +1 more source
