A Study on Different Deep Learning Algorithms Used in Deep Neural Nets: MLP, SOM and DBN. [PDF]
Deep learning is a widely popular topic in machine learning; it is structured as a series of nonlinear layers that learn various levels of data representation.
Naskath J, Sivakamasundari G, Begum AAS.
europepmc +2 more sources
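As a minimal sketch of the "series of nonlinear layers" this entry describes (the layer sizes, random weights, and ReLU activation are illustrative assumptions, not details from the paper):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Pass input through a stack of nonlinear (ReLU) hidden layers,
    finishing with a linear output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)                   # nonlinear hidden layer
    return h @ weights[-1] + biases[-1]       # linear output layer

rng = np.random.default_rng(0)
# 3 -> 4 -> 2 architecture (sizes chosen arbitrarily for illustration)
weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 2))]
biases = [np.zeros(4), np.zeros(2)]
y = mlp_forward(rng.normal(size=(5, 3)), weights, biases)
print(y.shape)  # (5, 2)
```

Each hidden layer re-represents its input, which is the layered-representation idea the abstract refers to.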
A Review on Explainability in Multimodal Deep Neural Nets [PDF]
Artificial intelligence techniques powered by deep neural nets have achieved much success in several application domains, most notably in computer vision and natural language processing.
Gargi Joshi +2 more
doaj +2 more sources
Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [PDF]
The neural tangent kernel (NTK) is a powerful tool for analyzing the training dynamics of neural networks and their generalization bounds. Studies of the NTK have focused on typical neural network architectures, but remain incomplete for neural networks with ...
Yongtao Wu +4 more
openalex +3 more sources
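For intuition on the NTK this entry analyzes: the empirical NTK Gram matrix is the inner product of parameter gradients at pairs of inputs. A hedged sketch for a toy two-parameter model f(x) = w1·tanh(w0·x) (the model, weights, and inputs are my own illustrative choices, not from the paper):

```python
import numpy as np

def grad_f(x, w0, w1):
    """Gradient of f(x) = w1 * tanh(w0 * x) with respect to (w0, w1)."""
    t = np.tanh(w0 * x)
    return np.array([w1 * x * (1.0 - t**2), t])

def empirical_ntk(xs, w0=0.7, w1=1.3):
    """Empirical NTK Gram matrix K[i, j] = <grad f(x_i), grad f(x_j)>."""
    grads = np.stack([grad_f(x, w0, w1) for x in xs])  # shape (n, 2)
    return grads @ grads.T

K = empirical_ntk([-1.0, 0.5, 2.0])
print(np.allclose(K, K.T))                       # symmetric
print(np.all(np.linalg.eigvalsh(K) >= -1e-12))   # positive semidefinite
```

As a Gram matrix of gradients, K is always symmetric positive semidefinite, which is what makes it usable as a kernel for analyzing training dynamics.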
Convolutional nets for reconstructing neural circuits from brain images acquired by serial section electron microscopy [PDF]
Neural circuits can be reconstructed from brain images acquired by serial section electron microscopy. Image analysis has been performed by manual labor for half a century, and efforts at automation date back almost as far.
Lee, Kisuk +5 more
core +2 more sources
Modeling Interval Timing by Recurrent Neural Nets [PDF]
The purpose of this study was to take a new approach in showing how the central nervous system might encode time at the supra-second level using recurrent neural nets (RNNs).
Theodore Raphan +5 more
doaj +2 more sources
Predictive markers for Parkinson's disease using deep neural nets on neuromelanin sensitive MRI. [PDF]
Shinde S +6 more
europepmc +2 more sources
When Do Neural Nets Outperform Boosted Trees on Tabular Data? [PDF]
Tabular data is one of the most commonly used types of data in machine learning. Despite recent advances in neural nets (NNs) for tabular data, there is still an active discussion on whether or not NNs generally outperform gradient-boosted decision trees
Duncan C. McElfresh +8 more
semanticscholar +1 more source
The Geometry of Neural Nets' Parameter Spaces Under Reparametrization [PDF]
Model reparametrization, which follows the change-of-variable rule of calculus, is a popular way to improve the training of neural nets. But it can also be problematic since it can induce inconsistencies in, e.g., Hessian-based flatness measures ...
Agustinus Kristiadi +2 more
semanticscholar +1 more source
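The inconsistency this entry describes can be seen in one dimension: reparametrizing a loss by the change-of-variable rule changes its Hessian, so Hessian-based "flatness" is not invariant. A minimal numerical sketch (the quadratic loss and the rescaling w = 2v are my own toy assumptions):

```python
def loss_w(w):
    return (w - 1.0) ** 2

def loss_v(v):
    # Same function after reparametrizing w = 2 * v
    return loss_w(2.0 * v)

def hessian_1d(f, x, eps=1e-4):
    """Second derivative by central finite differences."""
    return (f(x + eps) - 2.0 * f(x) + f(x - eps)) / eps**2

h_w = hessian_1d(loss_w, 1.0)   # curvature at the minimum w* = 1
h_v = hessian_1d(loss_v, 0.5)   # same minimum, expressed as v* = 0.5
print(round(h_w, 3), round(h_v, 3))  # ≈ 2.0 vs ≈ 8.0: flatness is not invariant
```

The two parametrizations describe the identical function, yet the measured curvature differs by the square of the rescaling factor.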
FPGA Implementation of Neural Nets [PDF]
A field-programmable gate array (FPGA) is used to build an artificial neural network in hardware. An architecture for a digital system is devised to execute a feed-forward multilayer neural network.
B A Sujatha Kumari +2 more
doaj +1 more source
Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules [PDF]
Neural ordinary differential equations (ODEs) have attracted much attention as continuous-time counterparts of deep residual neural networks (NNs), and numerous extensions for recurrent NNs have been proposed. Since the 1980s, ODEs have also been used to
Kazuki Irie +2 more
semanticscholar +1 more source
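The residual-network view of neural ODEs mentioned in this entry can be illustrated with Euler integration, where each step h ← h + dt·f(h, t) plays the role of one residual block. A hedged sketch using a hypothetical linear vector field in place of a learned network (the field, rate constant, and step count are my own assumptions):

```python
import math

def f(h, t, a=-0.5):
    """Hypothetical vector field dh/dt = a * h (stands in for a learned net)."""
    return a * h

def odeint_euler(h0, t0, t1, steps):
    """Euler integration: each update is analogous to one residual block."""
    h, dt = h0, (t1 - t0) / steps
    for k in range(steps):
        h = h + dt * f(h, t0 + k * dt)
    return h

h1 = odeint_euler(1.0, 0.0, 2.0, steps=1000)
print(abs(h1 - math.exp(-1.0)) < 1e-3)  # close to the exact solution e^{-0.5 * 2}
```

Shrinking the step size turns the discrete residual stack into the continuous-time dynamics the abstract refers to.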