Approximation of high-dimensional parametric PDEs
Parametrized families of PDEs arise in various contexts such as inverse problems, control and optimization, risk assessment, and uncertainty quantification. In most of these applications, the number of parameters is large or perhaps even infinite.
Cohen, Albert; DeVore, Ronald
A machine learning approach to portfolio pricing and risk management for high-dimensional problems
We present a general framework for portfolio risk management in discrete time, based on a replicating martingale. This martingale is learned from a finite sample in a supervised setting.
Fernandez-Arjona, Lucio +1 more
Why and when can deep - but not shallow - networks avoid the curse of dimensionality: A review
The paper reviews and extends an emerging body of theoretical results on deep learning including the conditions under which it can be exponentially better than shallow learning.
T. Poggio +4 more
Sound data analysis is critical to the success of modern molecular medicine research that involves the collection and interpretation of mass-throughput data. The novel nature and high dimensionality of such datasets pose a series of non-trivial data analysis ...
Constantin F. Aliferis +2 more
Limit Theorems as Blessing of Dimensionality: Neural-Oriented Overview
As a system becomes more complex, at first, its description and analysis becomes more complicated. However, a further increase in the system’s complexity often makes this analysis simpler.
Vladik Kreinovich, Olga Kosheleva
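The "blessing of dimensionality" the entry above refers to is typically a concentration-of-measure effect: some quantities become *more* predictable as dimension grows. Not part of the cited paper; a minimal stdlib-Python sketch, where the relative spread of Euclidean norms of standard Gaussian vectors shrinks as the dimension increases:

```python
import math
import random

random.seed(0)

def norm_spread(dim, samples=200):
    """Coefficient of variation of Euclidean norms of standard Gaussian vectors."""
    norms = []
    for _ in range(samples):
        v = [random.gauss(0.0, 1.0) for _ in range(dim)]
        norms.append(math.sqrt(sum(x * x for x in v)))
    mean = sum(norms) / len(norms)
    var = sum((n - mean) ** 2 for n in norms) / len(norms)
    return math.sqrt(var) / mean

low_dim = norm_spread(2)      # norms vary a lot in 2 dimensions
high_dim = norm_spread(2000)  # norms concentrate tightly around sqrt(2000)
print(low_dim, high_dim)
```

For a chi-distributed norm the coefficient of variation scales roughly like 1/sqrt(2d), so the high-dimensional spread is a small fraction of the low-dimensional one: high dimensionality simplifies, rather than complicates, this particular analysis.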
Can local particle filters beat the curse of dimensionality?
The discovery of particle filtering methods has enabled the use of nonlinear filtering in a wide array of applications. Unfortunately, the approximation error of particle filters typically grows exponentially in the dimension of the underlying model ...
Rebeschini, Patrick, van Handel, Ramon
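The exponential error growth mentioned in the abstract above shows up in practice as weight degeneracy: the effective sample size (ESS) of an importance-weighted particle set collapses as the state dimension grows. Not taken from the paper; a minimal stdlib-Python sketch of one importance-weighting step, with particles drawn from a standard Gaussian prior and an observation fixed at the origin:

```python
import math
import random

random.seed(1)

def effective_sample_size(dim, n_particles=500):
    """ESS of importance weights for a dim-dimensional Gaussian likelihood."""
    log_w = []
    for _ in range(n_particles):
        x = [random.gauss(0.0, 1.0) for _ in range(dim)]
        # Gaussian log-likelihood of observing the origin, up to a constant
        log_w.append(-0.5 * sum(xi * xi for xi in x))
    m = max(log_w)                           # subtract the max for numerical stability
    w = [math.exp(lw - m) for lw in log_w]
    s = sum(w)
    w = [wi / s for wi in w]
    return 1.0 / sum(wi * wi for wi in w)    # ESS = 1 / sum of squared weights

ess_low = effective_sample_size(1)    # most particles carry usable weight
ess_high = effective_sample_size(50)  # a handful of particles dominate
print(ess_low, ess_high)
```

In one dimension most of the 500 particles contribute; in fifty dimensions the ESS collapses toward a handful, which is exactly the degeneracy that local particle filters are designed to mitigate.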
Overcoming the Curse of Dimensionality in the Numerical Approximation of Parabolic Partial Differential Equations with Gradient-Dependent Nonlinearities
Partial differential equations (PDEs) are a fundamental tool in the modeling of many real-world phenomena. In a number of such real-world phenomena the PDEs under consideration contain gradient-dependent nonlinearities and are high-dimensional ...
Martin Hutzenthaler +2 more
This schematic integrates the eight statistically significant causal relationships identified between 1,366 brain imaging‐derived phenotypes (IDPs) and 18 autoimmune inflammatory diseases (AIDs). Arrows indicate the direction of causality inferred from bidirectional two‐sample MR analyses.
Jinbin Chen +8 more
Outlier Detection Based Feature Selection Exploiting Bio-Inspired Optimization Algorithms
The curse of dimensionality arises when the data are high-dimensional. It hampers the learning process and reduces accuracy. Feature selection is one of the dimensionality reduction approaches that mainly contribute to solving the curse of ...
Souad Larabi-Marie-Sainte
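Feature selection as described above discards uninformative dimensions before learning. The entry's bio-inspired optimizers are beyond a short snippet, but the underlying idea can be illustrated with a simple variance filter; this is a generic sketch in stdlib Python, not the paper's method, and the toy data and threshold are assumptions:

```python
import random

random.seed(2)

def variance(col):
    mean = sum(col) / len(col)
    return sum((x - mean) ** 2 for x in col) / len(col)

def select_features(rows, threshold):
    """Keep indices of columns whose variance exceeds the threshold."""
    n_cols = len(rows[0])
    cols = [[row[j] for row in rows] for j in range(n_cols)]
    return [j for j, col in enumerate(cols) if variance(col) > threshold]

# Toy data: column 0 is nearly constant noise, column 1 is informative.
rows = [[0.01 * random.random(), random.uniform(-1, 1)] for _ in range(100)]
kept = select_features(rows, threshold=0.05)
print(kept)
```

The nearly constant column is dropped and only the informative column survives; wrapper methods like the paper's optimization-driven selectors refine this idea by scoring feature subsets against a downstream learner instead of a fixed variance cutoff.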
A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
In recent years deep artificial neural networks (DNNs) have been successfully employed in numerical simulations for a multitude of computational problems including, for example, object and face recognition, natural language processing, fraud detection ...
Arnulf Jentzen +2 more

