
Inference of Gene Regulatory Networks Using Boolean-Network Inference Methods

Journal of Bioinformatics and Computational Biology, 2009
The modeling of genetic networks, especially from microarray and related data, has become an important aspect of the biosciences. This review takes a fresh look at a specific family of models used for constructing genetic networks, the so-called Boolean networks.
Graham J. Hickman, T. Charlie Hodgman
openaire   +2 more sources

Learning Layer-Skippable Inference Network

IEEE Transactions on Image Processing, 2020
The process of learning good representations for machine learning tasks can be very computationally expensive. Typically, the same backbones learned on the training set are used to infer the labels of testing data. This learning and inference paradigm, however, is quite different from the typical inference scheme of human biological ...
Yu-Gang Jiang   +3 more
openaire   +2 more sources

Panther: Practical Secure Two-Party Neural Network Inference

IEEE Transactions on Information Forensics and Security
Secure two-party neural network (2P-NN) inference allows the server with a neural network model and the client with inputs to perform neural network inference without revealing their private data to each other.
Jun Feng   +4 more
semanticscholar   +1 more source

Epidemiologic network inference

Statistics and Computing, 2019
Barbillon, Pierre   +4 more
openaire   +4 more sources

Deep compressive offloading: speeding up neural network inference by trading edge computation for network latency

ACM International Conference on Embedded Networked Sensor Systems, 2020
With recent advances, neural networks have become a crucial building block in intelligent IoT systems and sensing applications. However, their excessive computational demand remains a serious impediment to deployment on low-end IoT devices. With the ...
Shuochao Yao   +6 more
semanticscholar   +1 more source

Nonmonotonic Inferences and Neural Networks

Synthese, 2004
openaire   +4 more sources

A 0.32–128 TOPS, Scalable Multi-Chip-Module-Based Deep Neural Network Inference Accelerator With Ground-Referenced Signaling in 16 nm

IEEE Journal of Solid-State Circuits, 2020
Custom accelerators improve the energy efficiency, area efficiency, and performance of deep neural network (DNN) inference. This article presents a scalable DNN accelerator consisting of 36 chips connected in a mesh network on a multi-chip-module (MCM ...
B. Zimmer   +16 more
semanticscholar   +1 more source

Inference Networks for Document Retrieval

ACM SIGIR Forum, 1989
The use of inference networks to support document retrieval is introduced. A network-based retrieval model is described and compared to conventional probabilistic and Boolean models.
H. Turtle, W. B. Croft
openaire   +1 more source

Weight-Oriented Approximation for Energy-Efficient Neural Network Inference Accelerators

IEEE Transactions on Circuits and Systems I: Regular Papers, 2020
Current research in the area of Neural Networks (NNs) has resulted in performance advancements for a variety of complex problems. In particular, embedded system applications rely more and more on convolutional NNs to provide services such ...
Zois-Gerasimos Tasoulas   +4 more
semanticscholar   +1 more source

EDEN: Enabling Energy-Efficient, High-Performance Deep Neural Network Inference Using Approximate DRAM

Micro, 2019
The effectiveness of deep neural networks (DNN) in vision, speech, and language processing has prompted a tremendous demand for energy-efficient high-performance DNN inference systems.
Skanda Koppula   +6 more
semanticscholar   +1 more source
