Results 111 to 120 of about 3,876.
Some of the following articles may not be open access.
2016
In this chapter we study the convergence of Newton’s method for nonlinear equations and nonlinear inclusions in a Banach space. The nonlinear mappings appearing on the right-hand side of the equations are not necessarily differentiable. Our goal is to obtain an approximate solution in the presence of computational errors.
Resonance, 2002
If one follows in the footsteps of Newton, ‘almost all’ roads lead to Rome (or is it London?).
Generalized Kleinman‐Newton method
Optimal Control Applications and Methods, 2018
This paper addresses the general problem of optimal linear control design subject to convex gain constraints. Classical approaches based exclusively on Riccati equations or linear matrix inequalities are unable to treat problems that incorporate feedback gain constraints, for instance, the reduced‐order (including static) output feedback control
José C. Geromel, Grace S. Deaecto
2013
Regression analyses are commonly used to analyze pharmacodynamic/pharmacokinetic data. Newton’s method differs from traditional regression analysis because, instead of fitting different parameters of a single function, entirely different functions are compared with one another.
Ton J. Cleophas, Aeilko H. Zwinderman
1994
The search for solutions of the equation f(x) = 0 is ancient, and methods that can solve equations of the form ax² + bx + c = 0 are several thousand years old. In the 16th century, Italian mathematicians discovered methods for solving third- and fourth-degree polynomials.
Proceedings of the third ACM symposium on Symbolic and algebraic computation - SYMSAC '76, 1976
The analytic concepts of approximation, convergence, differentiation, and Taylor series expansion are applied and interpreted in the context of an abstract power series domain. Newton's method is then shown to be applicable to solving for a power series root of a polynomial with power series coefficients, resulting in fast algorithms for a variety of ...
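The power-series Newton iteration summarized in the abstract above can be sketched in a few lines. This is an illustrative reconstruction under simple assumptions, not code from the paper: it finds the series root of y² − (1 + x) = 0, i.e. the Taylor series of √(1 + x), with the number of correct coefficients doubling at each Newton step.

```python
# Minimal sketch: Newton iteration in a truncated power-series domain.
# Series are lists of coefficients [c0, c1, c2, ...] in increasing degree.

def mul(a, b, n):
    """Product of two coefficient lists, truncated to n terms."""
    out = [0.0] * n
    for i, ai in enumerate(a[:n]):
        for j, bj in enumerate(b[:n - i]):
            out[i + j] += ai * bj
    return out

def inv(a, n):
    """Series inverse of a (requires a[0] != 0), truncated to n terms,
    via the Newton step g <- g * (2 - a * g), doubling precision each pass."""
    g = [1.0 / a[0]]
    k = 1
    while k < n:
        k = min(2 * k, n)
        e = mul(a, g, k)                      # a * g mod x^k
        e = [2.0 - e[0]] + [-c for c in e[1:]]
        g = mul(g, e, k)
    return g

def sqrt_series(f, n):
    """Series root y of y^2 - f = 0 via y <- (y + f/y) / 2 mod x^k,
    with k doubling each step (assumes f[0] > 0)."""
    y = [f[0] ** 0.5]
    k = 1
    while k < n:
        k = min(2 * k, n)
        q = mul(f, inv(y, k), k)              # f / y mod x^k
        y = y + [0.0] * (k - len(y))          # pad previous iterate
        y = [0.5 * (y[i] + q[i]) for i in range(k)]
    return y

print(sqrt_series([1.0, 1.0], 4))  # → [1.0, 0.5, -0.125, 0.0625], the
                                   # first Taylor coefficients of sqrt(1+x)
```

Each pass does a fixed amount of truncated-series arithmetic while doubling the precision, which is the source of the fast algorithms the abstract mentions.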
GENERALIZATIONS OF NEWTON'S METHOD
Fractals, 2001
We give a survey of the complex dynamics of various generalizations of Newton's method for finding a complex root of a polynomial of a single variable.
1999
Newton’s method is remarkable both for the simplicity of its principle — based on linear approximation — and for its efficiency — often a very rapid convergence. It is known, in practice, by two names, depending on the circumstances in which it is used. When finding successive approximations to the numerical solution of an equation P(y) = 0, it is called
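The principle described above, successive linear approximations to a root of P(y) = 0, can be sketched in a few lines. The polynomial (y³ − 2y − 5, the classic example Newton himself worked) and the starting point are illustrative choices, not taken from the article:

```python
# Minimal sketch of Newton's method for a root of P(y) = 0:
# repeatedly replace y by the root of the tangent line at y.

def newton(p, dp, y0, tol=1e-12, max_iter=50):
    y = y0
    for _ in range(max_iter):
        step = p(y) / dp(y)   # tangent-line correction P(y)/P'(y)
        y -= step
        if abs(step) < tol:
            return y
    raise RuntimeError("did not converge")

root = newton(lambda y: y**3 - 2*y - 5,   # P(y)
              lambda y: 3*y**2 - 2,       # P'(y)
              y0=2.0)
print(root)  # close to 2.0945514815423265
```

The rapid convergence the passage mentions is visible here: each iteration roughly doubles the number of correct digits near the root.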

