Results 331 to 340 of about 2,238,851 (367)
Predicting troponin biomarker elevation from electrocardiograms using a deep neural network. [PDF]
Hilgendorf L+14 more
europepmc +1 more source
scGO: interpretable deep neural network for cell status annotation and disease diagnosis. [PDF]
Wu Y+8 more
europepmc +1 more source
Some of the following articles may not be open access.
An on-chip photonic deep neural network for image classification
Nature, 2021
Deep neural networks, with applications from computer vision to medical diagnosis, are commonly implemented using clock-based processors, in which computation speed is mainly limited by the clock frequency and the memory access time.
F. Ashtiani+2 more
semanticscholar +1 more source
2021
Quantitative structure-activity relationship (QSAR) models are routinely applied computational tools in the drug discovery process. QSAR models are regression or classification models that predict the biological activities of molecules based on the features derived from their molecular structures.
openaire +3 more sources
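The QSAR abstract above describes regression or classification models that map molecular-structure features to biological activity. A minimal sketch of that idea, using NumPy only: the descriptor matrix and activity labels below are synthetic placeholders (standing in for real descriptors such as molecular weight or logP), and the model is a plain logistic-regression classifier trained by gradient descent, not any specific method from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "molecular descriptors" for 200 molecules, 3 features each.
X = rng.normal(size=(200, 3))
# Synthetic binary activity labels driven mostly by the first descriptor.
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)

# Logistic-regression QSAR-style model: activity probability from features.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted activity probability
    w -= 0.5 * (X.T @ (p - y) / len(y))       # gradient step on weights
    b -= 0.5 * np.mean(p - y)                  # gradient step on bias

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = float(np.mean((p > 0.5) == y))     # training accuracy
```

The same fit/predict pattern carries over to any descriptor set and any classifier; only the feature extraction is chemistry-specific.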
Deep Neural Network Embeddings for Text-Independent Speaker Verification
Interspeech, 2017
This paper investigates replacing i-vectors for text-independent speaker verification with embeddings extracted from a feed-forward deep neural network.
David Snyder+3 more
semanticscholar +1 more source
2019
In Chap. 1, our empirical analysis was based on neural networks with a single hidden layer. These networks, called shallow, are in theory universal approximators of any continuous function. Deep neural networks use instead a cascade of multiple layers of hidden neurons. Each successive layer uses the output from the previous layer as input.
Michel Denuit+2 more
openaire +2 more sources
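The abstract above describes a deep network as a cascade of hidden layers, each consuming the previous layer's output. A minimal forward-pass sketch of that cascade; the layer sizes and ReLU activation are illustrative assumptions, not taken from the cited chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(0.0, z)

# A deep network as a cascade: input, three hidden layers, output.
layer_sizes = [4, 8, 8, 8, 2]
weights = [rng.normal(scale=0.5, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)                  # each layer feeds the next
    return h @ weights[-1] + biases[-1]      # linear output layer

out = forward(rng.normal(size=(5, 4)))       # batch of 5 four-feature inputs
```

A shallow network is the special case with a single hidden layer in this cascade.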
2014
In this chapter, we introduce deep neural networks (DNNs)—multilayer perceptrons with many hidden layers. DNNs play an important role in the modern speech recognition systems, and are the focus of the rest of the book. We depict the architecture of DNNs, describe the popular activation functions and training criteria, illustrate the famous ...
Dong Yu, Li Deng
openaire +2 more sources
2019
We will implement a multi-layered neural network with different hyperparameters: hidden layer activations, hidden layer nodes, output layer activation, learning rate, mini-batch size, initialization, value of \(\beta\), values of \(\beta_1\) and \(\beta_2\), value of \(\epsilon\), and value of keep_prob.
openaire +3 more sources
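The hyperparameters listed above can be gathered into a single configuration; the specific values below are hypothetical defaults, and the update step shows only where \(\beta_1\), \(\beta_2\), \(\epsilon\), and the learning rate enter an Adam-style parameter update.

```python
import numpy as np

# Hypothetical hyperparameter settings matching the list above.
hparams = {
    "hidden_activation": "relu",
    "hidden_nodes": [64, 32],
    "output_activation": "sigmoid",
    "learning_rate": 1e-3,
    "mini_batch_size": 32,
    "initialization": "he",
    "beta": 0.9,       # momentum
    "beta1": 0.9,      # Adam first-moment decay
    "beta2": 0.999,    # Adam second-moment decay
    "epsilon": 1e-8,   # Adam numerical-stability term
    "keep_prob": 0.8,  # dropout keep probability
}

# One Adam update for a single parameter vector (first step, so the
# moment estimates start from zero and bias correction is exact).
rng = np.random.default_rng(2)
w = rng.normal(size=10)
grad = rng.normal(size=10)
m = (1 - hparams["beta1"]) * grad            # first moment
v = (1 - hparams["beta2"]) * grad**2         # second moment
m_hat = m / (1 - hparams["beta1"])           # bias-corrected first moment
v_hat = v / (1 - hparams["beta2"])           # bias-corrected second moment
w -= hparams["learning_rate"] * m_hat / (np.sqrt(v_hat) + hparams["epsilon"])
```

On the first step the bias-corrected first moment equals the raw gradient, which is why the corrections matter early in training.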