CNNs Avoid the Curse of Dimensionality by Learning on Patches [PDF]
Despite the success of convolutional neural networks (CNNs) in numerous computer vision tasks and their extraordinary generalization performance, several attempts to predict the generalization error of CNNs have been limited to a posteriori ...
Vamshi C. Madala +2 more
semanticscholar +1 more source
Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations [PDF]
Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension $d$.
Lukas Gonon, C. Schwab
semanticscholar +1 more source
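For orientation, a linear PIDE of the kind this abstract refers to combines local drift and diffusion terms with a nonlocal jump integral. A generic Lévy-driven form, written as a sketch of the general class rather than the paper's exact assumptions, is:

```latex
% Generic linear PIDE (illustrative sketch; the paper's precise setting may differ):
% drift + diffusion (local terms) plus an integral (jump) term with Levy measure \nu.
\[
  \partial_t u(t,x)
  + b(x)\cdot\nabla u(t,x)
  + \tfrac{1}{2}\operatorname{Tr}\bigl(\sigma(x)\sigma(x)^{\top}\,\nabla^2 u(t,x)\bigr)
  + \int_{\mathbb{R}^d}\Bigl(u(t,x+z)-u(t,x)-\nabla u(t,x)\cdot z\,\mathbf{1}_{\{|z|\le 1\}}\Bigr)\,\nu(\mathrm{d}z)
  = 0 .
\]
```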
Mining pure, strict epistatic interactions from high-dimensional datasets: ameliorating the curse of dimensionality. [PDF]
X. Jiang, R. E. Neapolitan
europepmc +3 more sources
Theory I: Deep networks and the curse of dimensionality
Deep Learning references start with Hinton's backpropagation and with LeCun's convolutional networks (see [8] for a review). Of course, multilayer convolutional networks have been around at least as far back as the optical processing era of the 1970s ...
T. Poggio, Q. Liao
semanticscholar +1 more source
Curse of Dimensionality for TSK Fuzzy Neural Networks: Explanation and Solutions [PDF]
The Takagi-Sugeno-Kang (TSK) fuzzy system with Gaussian membership functions (MFs) is one of the most widely used fuzzy systems in machine learning. However, it usually has difficulty handling high-dimensional datasets.
Yuqi Cui, Dongrui Wu, Yifan Xu
semanticscholar +1 more source
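A quick numerical sketch of why this happens (my illustration, not code from the paper): a TSK rule's firing strength with Gaussian MFs is a product over input dimensions, so it decays exponentially in the dimension d and underflows on high-dimensional inputs.

```python
import numpy as np

# Illustrative sketch (not from the paper): the firing strength of one TSK rule
# with Gaussian membership functions is a product over the d input dimensions,
# so it shrinks exponentially with d and quickly underflows to 0.0.
def firing_strength(x, centers, sigma=1.0):
    # Per-dimension Gaussian membership values, multiplied together.
    memberships = np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))
    return np.prod(memberships)

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    x = rng.normal(size=d)  # a random input
    c = rng.normal(size=d)  # a random rule center
    print(d, firing_strength(x, c))
```

Running this shows the firing strength collapsing toward (and eventually reaching) floating-point zero as d grows, which is one concrete face of the curse for TSK systems.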
Finite-Sample Guarantees for Wasserstein Distributionally Robust Optimization: Breaking the Curse of Dimensionality [PDF]
Wasserstein distributionally robust optimization is an emerging modeling paradigm for decision making under data uncertainty. Because of its computational tractability and interpretability, it has achieved great empirical success across several ...
Rui Gao
semanticscholar +1 more source
Locality defeats the curse of dimensionality in convolutional teacher–student scenarios [PDF]
Convolutional neural networks perform a local and translationally invariant treatment of the data: quantifying which of these two aspects is central to their success remains a challenge. We study this problem within a teacher–student framework for kernel ...
Alessandro Favero +2 more
semanticscholar +1 more source
Can Shallow Neural Networks Beat the Curse of Dimensionality? A Mean Field Training Perspective [PDF]
We prove that the gradient descent training of a two-layer neural network on empirical or population risk may not decrease population risk at an order faster than $t^{-4/(d-2)}$ under mean field scaling.
Stephan Wojtowytsch, Weinan E
semanticscholar +1 more source
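To make the stated rate concrete (my arithmetic, not from the paper): a decay no faster than $t^{-4/(d-2)}$ means the training time needed to halve the risk blows up exponentially in $d$.

```latex
% Reading off the rate t^{-4/(d-2)} (illustrative arithmetic):
% halving the risk requires multiplying the training time t by 2^{(d-2)/4}.
\[
  \mathcal{R}(t) \sim t^{-4/(d-2)}
  \quad\Longrightarrow\quad
  \mathcal{R}(t') = \tfrac{1}{2}\,\mathcal{R}(t)
  \ \text{for}\ t' = 2^{(d-2)/4}\, t .
\]
% For d = 10 the factor is 2^2 = 4; for d = 100 it is 2^{24.5} \approx 2.4 \times 10^7.
```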
Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
We prove a theorem concerning the approximation of generalized bandlimited multivariate functions by deep ReLU networks for which the curse of dimensionality is overcome.
Hadrien Montanelli
semanticscholar +1 more source
Approximate Nearest Neighbor: Towards Removing the Curse of Dimensionality
We present two algorithms for the approximate nearest neighbor problem in high-dimensional spaces. For data sets of size $n$ living in $\mathbb{R}^d$, the algorithms require space that is only polynomial in $n$ and $d$, while achieving query times that are sub-linear ...
Sariel Har-Peled, P. Indyk, R. Motwani
semanticscholar +1 more source
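A minimal locality-sensitive-hashing sketch in the spirit of this line of work (a generic random-hyperplane LSH for cosine distance, not the paper's exact construction; the class name SimHashIndex and all parameters here are mine):

```python
import numpy as np

# Minimal random-hyperplane LSH for approximate nearest neighbor search.
# Generic illustration of the LSH idea, not the construction from the paper:
# nearby vectors fall on the same side of most random hyperplanes, so they
# hash to the same bucket with high probability.
class SimHashIndex:
    def __init__(self, dim, num_bits=16, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = rng.normal(size=(num_bits, dim))  # random hyperplanes
        self.buckets = {}

    def _key(self, x):
        # Sign pattern of x against each hyperplane, packed into a tuple.
        return tuple((self.planes @ x > 0).astype(int))

    def add(self, idx, x):
        self.buckets.setdefault(self._key(x), []).append((idx, x))

    def query(self, q):
        # Scan only the colliding bucket instead of the whole data set.
        candidates = self.buckets.get(self._key(q), [])
        if not candidates:
            return None
        return min(candidates, key=lambda item: np.linalg.norm(item[1] - q))[0]

rng = np.random.default_rng(1)
data = rng.normal(size=(1000, 64))
index = SimHashIndex(dim=64, num_bits=12)
for i, v in enumerate(data):
    index.add(i, v)
print(index.query(data[42] + 0.01 * rng.normal(size=64)))  # likely prints 42
```

The trade-off is the usual LSH one: more hash bits shrink the buckets (faster scans, polynomial space) at the cost of a higher chance that a true near neighbor lands in a different bucket, which is why practical schemes use several independent hash tables.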

