Results 11 to 20 of about 22,632

CNNs Avoid the Curse of Dimensionality by Learning on Patches [PDF]

open access: yes · IEEE Open Journal of Signal Processing, 2022
Despite the success of convolutional neural networks (CNNs) in numerous computer vision tasks and their extraordinary generalization performance, attempts to predict the generalization error of CNNs have been limited to a posteriori ...
Vamshi C. Madala   +2 more
semanticscholar   +1 more source
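
A toy sketch of the paper's central object (my own illustration, not the authors' code; the filter and sizes are hypothetical): a convolutional layer applies one shared function to every k x k patch, so the function being learned lives in k^2 dimensions rather than in the full image dimension, which is the sense in which learning happens "on patches".

    import numpy as np

    def extract_patches(image, k):
        """Slide a k x k window over a 2-D image and stack the patches as rows."""
        h, w = image.shape
        patches = [image[i:i + k, j:j + k].ravel()
                   for i in range(h - k + 1)
                   for j in range(w - k + 1)]
        return np.stack(patches)

    rng = np.random.default_rng(0)
    image = rng.standard_normal((32, 32))   # a 1024-dimensional input ...
    patches = extract_patches(image, k=3)   # ... becomes 900 points in only 9 dimensions

    # A convolution applies one shared map to every patch: the learned function
    # depends on k**2 = 9 variables at a time, never on all 1024 pixels at once.
    shared_filter = rng.standard_normal(9)  # stand-in for a learned filter
    responses = patches @ shared_filter     # shape (900,), one response per location
    print(patches.shape, responses.shape)   # (900, 9) (900,)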

Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations [PDF]

open access: yes · Analysis and Applications, 2021
Deep neural networks (DNNs) with the ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension $d$.
Lukas Gonon, C. Schwab
semanticscholar   +1 more source

Theory I: Deep networks and the curse of dimensionality

open access: yes · Bulletin of the Polish Academy of Sciences: Technical Sciences, 2023
Deep Learning references start with Hinton's backpropagation and with LeCun's convolutional networks (see [8] for a review). Of course, multilayer convolutional networks have been around at least as far back as the optical processing era of the 1970s ...
T. Poggio, Q. Liao
semanticscholar   +1 more source

Curse of Dimensionality for TSK Fuzzy Neural Networks: Explanation and Solutions [PDF]

open access: yes · IEEE International Joint Conference on Neural Networks, 2021
The Takagi-Sugeno-Kang (TSK) fuzzy system with Gaussian membership functions (MFs) is one of the most widely used fuzzy systems in machine learning. However, it usually has difficulty handling high-dimensional datasets.
Yuqi Cui, Dongrui Wu, Yifan Xu
semanticscholar   +1 more source
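
The failure mode behind the title can be reproduced in a few lines (a minimal sketch of the standard TSK setup with Gaussian MFs, using made-up rule centers and widths; not the authors' code): a rule's firing level is a product of d memberships, each in (0, 1], so it collapses exponentially as d grows.

    import numpy as np

    def tsk_firing_levels(x, centers, sigmas):
        """Firing level of each TSK rule: the product of d Gaussian memberships.

        x: input of shape (d,); centers, sigmas: rule parameters of shape (R, d).
        """
        memberships = np.exp(-((x - centers) ** 2) / (2.0 * sigmas ** 2))  # (R, d)
        return memberships.prod(axis=1)                                    # (R,)

    rng = np.random.default_rng(0)
    for d in (2, 10, 100):
        x = rng.standard_normal(d)
        centers = rng.standard_normal((5, d))  # 5 rules, random for illustration
        sigmas = np.ones((5, d))
        print(d, tsk_firing_levels(x, centers, sigmas).max())
    # The largest firing level plunges toward 0 as d grows; once every rule
    # fires at (numerically) zero, the normalized rule weights are 0/0 and
    # the system output is undefined.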

Finite-Sample Guarantees for Wasserstein Distributionally Robust Optimization: Breaking the Curse of Dimensionality [PDF]

open access: yes · Operational Research, 2020
Wasserstein distributionally robust optimization is an emerging modeling paradigm for decision making under data uncertainty. Because of its computational tractability and interpretability, it has achieved great empirical successes across several ...
Rui Gao
semanticscholar   +1 more source
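
For orientation, the generic Wasserstein DRO problem (standard notation, not necessarily the paper's) minimizes the worst-case expected loss over a Wasserstein ball of radius $\rho$ around the empirical distribution $\widehat{P}_n$:

$$\min_{\theta \in \Theta} \; \sup_{Q \,:\, W(Q, \widehat{P}_n) \le \rho} \; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta; \xi)\big].$$

The curse in the title refers to measure concentration: $W(\widehat{P}_n, P)$ shrinks only at the rate $n^{-1/d}$, so certifying a radius $\rho$ by that route demands roughly $n \gtrsim \rho^{-d}$ samples; the paper's finite-sample guarantees avoid this exponential dependence on $d$.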

Locality defeats the curse of dimensionality in convolutional teacher–student scenarios [PDF]

open access: yes · Neural Information Processing Systems, 2021
Convolutional neural networks perform a local and translationally invariant treatment of the data: quantifying which of these two aspects is central to their success remains a challenge. We study this problem within a teacher–student framework for kernel ...
Alessandro Favero   +2 more
semanticscholar   +1 more source

Can Shallow Neural Networks Beat the Curse of Dimensionality? A Mean Field Training Perspective [PDF]

open access: yes · IEEE Transactions on Artificial Intelligence, 2020
We prove that the gradient descent training of a two-layer neural network on the empirical or population risk may not decrease the population risk at an order faster than $t^{-4/(d-2)}$ under mean-field scaling.
Stephan Wojtowytsch, E. Weinan
semanticscholar   +1 more source
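
To read the stated rate as a curse-of-dimensionality statement, invert it (a direct arithmetic consequence of the abstract's bound): reaching population risk $\varepsilon$ requires training time

$$t \;\gtrsim\; \varepsilon^{-(d-2)/4},$$

so at $d = 22$, for example, halving the error can multiply the required training time by $2^{(22-2)/4} = 2^{5} = 32$, and the exponent grows linearly in the input dimension $d$.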

Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions

open access: yes · Journal of Computational Mathematics, 2021
We prove a theorem concerning the approximation of generalized bandlimited multivariate functions by deep ReLU networks for which the curse of dimensionality is overcome.
Hadrien Montanelli
semanticscholar   +1 more source

Approximate Nearest Neighbor: Towards Removing the Curse of Dimensionality

open access: yes · Theory of Computing, 2012
We present two algorithms for the approximate nearest neighbor problem in high-dimensional spaces. For data sets of size $n$ living in $\mathbb{R}^d$, the algorithms require space that is only polynomial in $n$ and $d$, while achieving query times that are sub-linear ...
Sariel Har-Peled, P. Indyk, R. Motwani
semanticscholar   +1 more source
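
Algorithms in this line of work rest on locality-sensitive hashing; the sketch below uses the random-hyperplane LSH family (a standard family chosen here for brevity, not necessarily the paper's exact construction) to answer a query by probing a single hash bucket instead of scanning all $n$ points.

    import numpy as np
    from collections import defaultdict

    class HyperplaneLSH:
        """Approximate nearest neighbor via random-hyperplane LSH."""

        def __init__(self, data, n_bits=12, seed=0):
            rng = np.random.default_rng(seed)
            self.data = data                               # shape (n, d)
            self.planes = rng.standard_normal((n_bits, data.shape[1]))
            self.table = defaultdict(list)
            for i, key in enumerate(self._keys(data)):
                self.table[key].append(i)

        def _keys(self, points):
            # A point's key is the sign pattern of its projections, one bit per plane.
            bits = (points @ self.planes.T) > 0            # (n, n_bits)
            return [row.tobytes() for row in bits]

        def query(self, q):
            # Scan only the bucket sharing q's sign pattern, not the whole data set.
            candidates = self.table.get(self._keys(q[None, :])[0], [])
            if not candidates:
                return None                                # misses happen: the answer is approximate
            dists = np.linalg.norm(self.data[candidates] - q, axis=1)
            return candidates[int(np.argmin(dists))]

    rng = np.random.default_rng(1)
    data = rng.standard_normal((10_000, 64))
    index = HyperplaneLSH(data)
    q = data[42] + 0.01 * rng.standard_normal(64)          # a point near item 42
    print(index.query(q))                                  # likely 42; may occasionally miss

In practice several independent hash tables are queried to boost the chance of landing in a bucket containing a true near neighbor; a single table keeps the sketch short.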
