Results 31 to 40 of about 1,637,948 (328)

Some Upper Bounds for RKHS Approximation by Bessel Functions

open access: yesAxioms, 2022
A reproducing kernel Hilbert space (RKHS) approximation problem arising from learning theory is investigated. Some K-functionals and moduli of smoothness with respect to RKHSs are defined with Fourier–Bessel series and Fourier–Bessel transforms ...
Mingdang Tian   +2 more
doaj   +1 more source

Nonlinear tensor product approximation of functions [PDF]

open access: yes, 2014
We are interested in approximation of a multivariate function $f(x_1,\dots,x_d)$ by linear combinations of products $u^1(x_1)\cdots u^d(x_d)$ of univariate functions $u^i(x_i)$, $i=1,\dots,d$.
Bazarkhanov, D., Temlyakov, V.
core   +1 more source

Rates of approximation by ReLU shallow neural networks

open access: yesJournal of Complexity, 2023
Neural networks activated by the rectified linear unit (ReLU) play a central role in the recent development of deep learning. The topic of approximating functions from Hölder spaces by these networks is crucial for understanding the efficiency of the induced learning algorithms.
Mao, Tong, Zhou, Ding-Xuan
openaire   +3 more sources

Approximation in variation by the Kantorovich operators; pp. 201–209 [PDF]

open access: yesProceedings of the Estonian Academy of Sciences, 2011
We discuss the rate of approximation of the Kantorovich operators. The rate of approximation is given with respect to the variation seminorm.
Andi Kivinukk, Tarmo Metsmägi
doaj   +1 more source

Multipoint Padé-Type Approximants. Exact Rate of Convergence [PDF]

open access: yesConstructive Approximation, 1997
14 pages, no figures. MSC1991 codes: Primary 41A21, 41A25; Secondary 30E10, 42C05. MR#: MR1606915 (99i:41016). Zbl#: Zbl 0896.41010.
We study the rate with which sequences of interpolating rational functions, whose poles are partially fixed, approximate Markov-type analytic functions. Applications to interpolating quadratures are given.
Cala Rodríguez, F.   +1 more
openaire   +3 more sources

Neural network approximation [PDF]

open access: yesActa Numerica, 2020
Neural networks (NNs) are the method of choice for building learning algorithms. They are now being investigated for other numerical tasks such as solving high-dimensional partial differential equations.
R. DeVore, Boris Hanin, G. Petrova
semanticscholar   +1 more source

Pointwise approximation of modified conjugate functions by matrix operators of conjugate Fourier series of 2π/r-periodic functions

open access: yesJournal of Inequalities and Applications, 2018
We extend the results of Xh. Z. Krasniqi (Acta Comment. Univ. Tartu Math. 17:89–101, 2013) and the authors (Acta Comment. Univ. Tartu Math. 13:11–24, 2009; Proc. Est. Acad. Sci.
Mateusz Kubiak   +2 more
doaj   +1 more source

Exact estimates of the rate of approximation of convolution operators

open access: yesJournal of Approximation Theory, 2010
The paper presents a method for establishing direct and strong converse inequalities in terms of K-functionals for convolution operators acting in homogeneous Banach spaces of multivariate functions.
B. Draganov
semanticscholar   +1 more source

Deep Network Approximation for Smooth Functions [PDF]

open access: yesSIAM Journal on Mathematical Analysis, 2020
This paper establishes an optimal characterization of the approximation error of deep ReLU networks for smooth functions, in terms of both width and depth simultaneously.
Jianfeng Lu   +3 more
semanticscholar   +1 more source

On the convergence rates of Gauss and Clenshaw-Curtis quadrature for functions of limited regularity [PDF]

open access: yes, 2012
We study the optimal general rate of convergence of the n-point quadrature rules of Gauss and Clenshaw-Curtis when applied to functions of limited regularity: if the Chebyshev coefficients decay at a rate O(n^{-s-1}) for some s > 0, Clenshaw-Curtis and ...
Bornemann, Folkmar, Xiang, Shuhuang
core   +2 more sources
