Results 11 to 20 of about 230,209
Manifolds of difference polynomials [PDF]
1. It is the purpose of this paper to develop in some detail the structure of the manifolds determined by systems of difference polynomials. Our results will necessarily be confined to the case of polynomials in an abstract field, since a suitable existence theorem for analytic difference equations is not available. The ideal theory, developed by J. F.
Richard M. Cohn
openalex +4 more sources
An existence theorem for difference polynomials [PDF]
Introduction. The abstract varieties (also called manifolds) of difference algebra [2], [3] have not heretofore had a realization as sets of functions comparable to the realization provided for differential manifolds by the analytic existence theorem for differential equations (see [4], particularly p. 23).
Richard M. Cohn
openalex +4 more sources
On the generalized difference polynomials [PDF]
Laurenţiu Panaitopol, D. M. Stefanescu
openalex +3 more sources
On the difference and sum of basic sets of polynomials [PDF]
M.N. Mikhail, M. Nassif
openalex +4 more sources
Differential-Difference Properties of Hypergeometric Polynomials [PDF]
We develop differential-difference properties of a class of hypergeometric polynomials which are a generalization of the Jacobi polynomials. The formulas are analogous to known formulas for the classical orthogonal polynomials.
Jet Wimp
openalex +4 more sources
On a difference equation for generalizations of Charlier polynomials
H. Bavinck, Roelof Koekoek
openalex +5 more sources
Summary In this contribution, we propose a detailed study of interpolation‐based data‐driven methods that are of relevance in the model reduction and also in the systems and control communities. The data are given by samples of the transfer function of the underlying (unknown) model, that is, we analyze frequency‐response data.
Quirin Aumann, Ion Victor Gosea
wiley +1 more source
Data‐driven performance metrics for neural network learning
Summary The effectiveness of data‑driven neural learning is addressed in terms of both local‑minima trapping and convergence rate. These issues are investigated in a case study involving the training of one‑hidden‑layer feedforward neural networks with the extended Kalman filter, which reduces the search for the optimal network parameters to a state ...
Angelo Alessandri+2 more
wiley +1 more source
Zeros of difference polynomials
Studies --- both analytic and numerical --- on polynomials have long been of immense interest. Here the authors deal in detail with various questions relating to the zeros of difference polynomials. In particular, defining the difference operator by \(\Delta f(x)=f(x+1)-f(x)\), the polynomial \(\Delta^ mx^ n\) of degree \((n-m)\) having \((n-m ...
John J. Warvik, Ronald J. Evans
openaire +3 more sources
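The snippet above states that \(\Delta^m x^n\) has degree \(n-m\), where \(\Delta f(x)=f(x+1)-f(x)\). This is easy to verify numerically; a minimal sketch in Python (the helper names are our own, working with coefficient lists in low-to-high order):

```python
from math import comb

def poly_shift(coeffs):
    """Coefficients of p(x+1) given coefficients of p(x)."""
    out = [0] * len(coeffs)
    for k, c in enumerate(coeffs):        # expand c * (x+1)^k via the binomial theorem
        for j in range(k + 1):
            out[j] += c * comb(k, j)
    return out

def delta(coeffs):
    """Forward difference (Δp)(x) = p(x+1) - p(x)."""
    return [s - c for s, c in zip(poly_shift(coeffs), coeffs)]

def poly_degree(coeffs):
    """Index of the highest nonzero coefficient (-1 for the zero polynomial)."""
    d = -1
    for i, c in enumerate(coeffs):
        if c != 0:
            d = i
    return d

# Apply Δ three times to x^7: the degree should drop to 7 - 3 = 4,
# with leading coefficient 7 * 6 * 5 = 210.
n, m = 7, 3
p = [0] * n + [1]                         # coefficients of x^7
for _ in range(m):
    p = delta(p)
print(poly_degree(p), p[4])               # 4 210
```

Each application of \(\Delta\) lowers the degree by one, since the leading terms of \(p(x+1)\) and \(p(x)\) cancel; iterating \(m\) times gives degree \(n-m\) with leading coefficient \(n(n-1)\cdots(n-m+1)\).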
Polynomial Differences in the Primes [PDF]
We establish, utilizing the Hardy-Littlewood Circle Method, an asymptotic formula for the number of pairs of primes whose differences lie in the image of a fixed polynomial. We also include a generalization of this result where differences are replaced with any integer linear combination of two primes.
Alex Rice, Neil Lyall
openaire +3 more sources