Results 11 to 20 of about 788,437

Robust learning with implicit residual networks [PDF]

open access: yesMachine Learning and Knowledge Extraction, 2020
In this effort, we propose a new deep architecture utilizing residual blocks inspired by implicit discretization schemes. As opposed to standard feed-forward networks, the outputs of the proposed implicit residual blocks are defined as the fixed ...
Reshniak, Viktor, Webster, Clayton
core   +3 more sources
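The snippet above defines a block's output implicitly as a fixed point rather than by a single forward pass. A minimal sketch of that idea, assuming a plain fixed-point iteration as the solver (the function `f`, the solver choice, and all tolerances are illustrative, not taken from the paper):

```python
import numpy as np

def implicit_residual_block(x, f, n_iter=50, tol=1e-8):
    """Illustrative implicit residual block: instead of the explicit
    update y = x + f(x), the output is defined implicitly as the
    fixed point of y = x + f(y), solved here by plain iteration."""
    y = x.copy()
    for _ in range(n_iter):
        y_next = x + f(y)
        if np.linalg.norm(y_next - y) < tol:
            return y_next
        y = y_next
    return y

# Toy branch: f(y) = 0.5 * tanh(y) is a contraction, so the
# iteration converges to the unique fixed point.
x = np.array([1.0, -2.0])
y = implicit_residual_block(x, lambda y: 0.5 * np.tanh(y))
# At convergence, the implicit equation y = x + f(y) holds.
```

Any fixed-point or root-finding solver (e.g. Newton's method) could replace the loop; the defining property is only that the output satisfies the implicit equation.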

Deep Residual Reinforcement Learning

open access: yes, 2020
We revisit residual algorithms in both model-free and model-based reinforcement learning settings. We propose the bidirectional target network technique to stabilize residual algorithms, yielding a residual version of DDPG that significantly outperforms ...
Boehmer, Wendelin   +2 more
core   +3 more sources
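"Residual algorithms" here refers to the classical distinction between semi-gradient TD updates and residual (Baird-style) updates that descend the squared Bellman error through both value estimates. A tabular sketch of that distinction, assuming a simple state-value setting (the learning rate, discount, and toy states are illustrative):

```python
import numpy as np

def residual_update(V, s, r, s_next, gamma=0.9, lr=0.1):
    """Residual (Baird-style) value update: descend the gradient of
    the squared Bellman error delta^2, where
        delta = r + gamma * V[s_next] - V[s],
    with respect to BOTH V[s] and V[s_next]. Semi-gradient TD would
    update only V[s]. (Tabular case, purely for illustration.)"""
    delta = r + gamma * V[s_next] - V[s]
    V[s] += lr * delta               # -d(0.5*delta^2)/dV[s]      = delta
    V[s_next] -= lr * gamma * delta  # -d(0.5*delta^2)/dV[s_next] = -gamma*delta
    return V

# One update on a two-state toy problem.
V = np.zeros(2)
residual_update(V, s=0, r=1.0, s_next=1)
# V becomes [0.1, -0.09]: both endpoints of the transition move.
```

The paper's contribution (the bidirectional target network and the residual DDPG variant) is not reproduced here; this only illustrates the residual-versus-semi-gradient distinction the abstract builds on.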

Deep Residual Learning for Nonlinear Regression. [PDF]

open access: yesEntropy (Basel), 2020
Deep learning plays a key role in the recent developments of machine learning. This paper develops a deep residual neural network (ResNet) for the regression of nonlinear functions. Convolutional layers and pooling layers are replaced by fully connected layers in the residual block.
Chen D, Hu F, Nian G, Yang T.
europepmc   +5 more sources
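The snippet above describes replacing convolutional and pooling layers with fully connected layers inside the residual block. A minimal forward-pass sketch of such a block, assuming two dense layers with a ReLU between them (layer sizes and the activation choice are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def fc_residual_block(x, W1, b1, W2, b2):
    """One fully connected residual block for regression-style
    ResNets: dense -> ReLU -> dense, plus the identity skip
    connection. (Sizes and activation are illustrative.)"""
    h = np.maximum(0.0, x @ W1 + b1)  # dense layer + ReLU
    return x + (h @ W2 + b2)          # dense layer + skip connection

d = 4
W1 = rng.normal(size=(d, d)) * 0.1
W2 = rng.normal(size=(d, d)) * 0.1
b1, b2 = np.zeros(d), np.zeros(d)
x = rng.normal(size=(2, d))           # batch of 2 inputs
y = fc_residual_block(x, W1, b1, W2, b2)
assert y.shape == x.shape             # the skip path preserves shape
```

Because the skip path is the identity, the block with zero weights reduces to the identity map, which is what makes deep stacks of such blocks easy to optimize.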

Residue–Residue Interaction Prediction via Stacked Meta-Learning [PDF]

open access: yesInternational Journal of Molecular Sciences, 2021
Protein–protein interactions (PPIs) are the basis of most biological functions determined by residue–residue interactions (RRIs). Predicting residue pairs responsible for the interaction is crucial for understanding the cause of a disease and drug design.
Chen, Kuan-Hsi, Hu, Yuh-Jyh
openaire   +2 more sources

Knowledge-based Residual Learning [PDF]

open access: yesProceedings of the Thirtieth International Joint Conference on Artificial Intelligence, 2021
Small data has been a barrier for many machine learning tasks, especially in scientific domains. Fortunately, we can utilize domain knowledge to make up for the lack of data. Hence, in this paper, we propose a hybrid model, KRL, that treats the domain knowledge model as a weak learner and uses another neural network model to boost it.
Guanjie Zheng   +6 more
openaire   +1 more source
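The boosting scheme described above (a fixed domain-knowledge model plus a learned corrector fit to its residuals) can be sketched as follows; here a least-squares linear model stands in for the neural-network booster, and all names and the toy data are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_residual_booster(x, y, knowledge_model):
    """Knowledge-based residual learning sketch: keep the domain
    knowledge model fixed as a weak learner, fit a second model to
    its residuals, and predict with the sum of the two."""
    residual = y - knowledge_model(x)
    coef, *_ = np.linalg.lstsq(x, residual, rcond=None)
    return lambda xq: knowledge_model(xq) + xq @ coef

# Toy example: the knowledge model captures only the first feature.
x = rng.normal(size=(200, 3))
y = x @ np.array([1.0, 2.0, 3.0])
weak = lambda x: x @ np.array([1.0, 0.0, 0.0])  # partial knowledge
model = fit_residual_booster(x, y, weak)
```

The design point is that the booster only has to learn what the knowledge model gets wrong, which is the mechanism that lets domain knowledge compensate for small data.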

Collaborative Residual Metric Learning

open access: yesProceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023
Accepted by SIGIR ...
Tianjun Wei   +2 more
openaire   +2 more sources

Residual Continual Learning

open access: yesProceedings of the AAAI Conference on Artificial Intelligence, 2020
We propose a novel continual learning method called Residual Continual Learning (ResCL). Our method can prevent the catastrophic forgetting phenomenon in sequential learning of multiple tasks, without any source task information except the original network.
Lee, Janghyeon   +3 more
openaire   +3 more sources

Residuality and Learning for Nondeterministic Nominal Automata [PDF]

open access: yesLogical Methods in Computer Science, 2022
We are motivated by the following question: which data languages admit an active learning algorithm? This question was left open in previous work by the authors, and is particularly challenging for languages recognised by nondeterministic automata. To answer it, we develop the theory of residual nominal automata, a subclass of nondeterministic nominal ...
Moerman, Joshua, Sammartino, Matteo
openaire   +7 more sources

Shakedrop Regularization for Deep Residual Learning [PDF]

open access: yesIEEE Access, 2019
Overfitting is a crucial problem in deep neural networks, even in the latest network architectures. In this paper, to relieve overfitting in ResNet and its improvements (i.e., Wide ResNet, PyramidNet, and ResNeXt), we propose a new regularization method called ShakeDrop regularization.
Yoshihiro Yamada   +3 more
openaire   +3 more sources
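ShakeDrop regularizes a residual block by randomly perturbing the residual branch during training. A sketch of the training-time forward rule as I understand it from the paper (the backward pass uses a separate random coefficient, and test time uses the expected scale; both are omitted here, and all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def shakedrop_forward(x, fx, p=0.5, alpha_range=(-1.0, 1.0)):
    """Training-time ShakeDrop forward sketch: scale the residual
    branch F(x) by (b + alpha - b*alpha), with b ~ Bernoulli(p) and
    alpha uniform in alpha_range. With b = 1 the coefficient is 1 and
    the branch passes unchanged; with b = 0 the branch is multiplied
    by the random alpha, perturbing (possibly negating) the residual."""
    b = rng.binomial(1, p)
    alpha = rng.uniform(*alpha_range)
    return x + (b + alpha - b * alpha) * fx
```

With `p = 1.0` the block degenerates to a plain residual block `x + F(x)`, which makes the perturbation easy to sanity-check.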

ResDepth: Learned Residual Stereo Reconstruction [PDF]

open access: yes2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2020
We propose an embarrassingly simple but very effective scheme for high-quality dense stereo reconstruction: (i) generate an approximate reconstruction with your favourite stereo matcher; (ii) rewarp the input images with that approximate model; (iii) with the initial reconstruction and the warped images as input, train a deep network to enhance the ...
Stucker, Corinne, Schindler, Konrad
openaire   +2 more sources
