Results 21 to 30 of about 3,311,038

A Liouville-Type Theorem for an Elliptic Equation with Superquadratic Growth in the Gradient

open access: yes, Advanced Nonlinear Studies, 2020
We consider the elliptic equation $-\Delta u = u^{q}|\nabla u|^{p}$ in $\mathbb{R}^{n}$ for any $p > 2$ and $q > 0$. We prove a Liouville-type theorem, which asserts that any positive bounded solution is constant.
Filippucci Roberta   +2 more
doaj   +1 more source

Confirming the Unusual Temperature Dependence of the Electric-Field Gradient in Zn

open access: yes, Crystals, 2022
The electric-field gradient (EFG) at nuclei in solids is a sensitive probe of the charge distribution. Experimental data, which previously existed only for insulators, have become available for metals with the development of nuclear measuring techniques ...
Heinz Haas   +9 more
doaj   +1 more source
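For orientation only, and not taken from the cited paper: the EFG at a nuclear site is conventionally the (traceless part of the) tensor of second spatial derivatives of the electrostatic potential, reported through its largest principal component $V_{zz}$ and the asymmetry parameter $\eta$:

\[
V_{ij} = \left.\frac{\partial^{2} V(\mathbf{r})}{\partial x_{i}\,\partial x_{j}}\right|_{\text{nucleus}},
\qquad
|V_{zz}| \ge |V_{yy}| \ge |V_{xx}|,
\qquad
\eta = \frac{V_{xx} - V_{yy}}{V_{zz}}, \quad 0 \le \eta \le 1 .
\]

Only the traceless part of $V_{ij}$ couples to the nuclear quadrupole moment, and that coupling is the quantity probed by the nuclear techniques mentioned in the abstract.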

AMBER/VLTI observations of the B[e] star MWC 300 [PDF]

open access: yes, 2012
Aims. We study the enigmatic B[e] star MWC 300 to investigate its disk and binary with milli-arcsecond-scale angular resolution. Methods. We observed MWC 300 with the VLTI/AMBER instrument in the H and K bands and compared these observations with ...
Chelli, A.   +10 more
core   +4 more sources

Polarization-dependent tunneling of light in gradient optics [PDF]

open access: yes, Physical Review E, 2007
Reflection-refraction properties of photonic barriers, formed by dielectric gradient nanofilms, for inclined incidence of both S- and P-polarized electromagnetic (EM) waves are examined by means of exactly solvable models. We present generalized Fresnel formulae, describing the influence of the non-local dispersion on reflectance and transmittance of ...
Shvartsburg, Alexander   +2 more
openaire   +3 more sources
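For context only, and not the generalized formulae of the paper: the classical local-medium Fresnel amplitude reflection coefficients for S- and P-polarized waves incident from a medium of refractive index $n_{1}$ onto one of index $n_{2}$ read (in one common sign convention)

\[
r_{S} = \frac{n_{1}\cos\theta_{i} - n_{2}\cos\theta_{t}}{n_{1}\cos\theta_{i} + n_{2}\cos\theta_{t}},
\qquad
r_{P} = \frac{n_{2}\cos\theta_{i} - n_{1}\cos\theta_{t}}{n_{2}\cos\theta_{i} + n_{1}\cos\theta_{t}},
\]

with the angles of incidence and refraction $\theta_{i}$, $\theta_{t}$ related by Snell's law $n_{1}\sin\theta_{i} = n_{2}\sin\theta_{t}$. The generalized formulae of the paper describe how non-local dispersion in gradient nanofilms modifies this local picture; they are not reproduced here.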

Training Neural Networks by Time-Fractional Gradient Descent

open access: yes, Axioms, 2022
Motivated by the weighted averaging method for training neural networks, we study the time-fractional gradient descent (TFGD) method based on the time-fractional gradient flow and explore the influence of memory dependence on neural network training ...
Jingyi Xie, Sirui Li
doaj   +1 more source
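The snippet above does not spell out the TFGD update rule, so the following Python sketch only illustrates one common way to give gradient descent a power-law memory of past gradients; the weighting scheme, learning rate, and memory window are assumptions for illustration, not the paper's algorithm.

    # Illustrative sketch of a memory-weighted ("time-fractional") gradient step.
    # The power-law memory weights below are an assumption, not the cited scheme.
    import numpy as np

    def tfgd(grad, theta0, lr=0.1, alpha=0.7, n_steps=100, memory=20):
        """Gradient descent where each step uses a power-law weighted average
        of the most recent gradients (heavier weight on newer ones)."""
        theta = np.asarray(theta0, dtype=float)
        history = []                     # past gradients, most recent last
        for _ in range(n_steps):
            history.append(np.asarray(grad(theta), dtype=float))
            history = history[-memory:]  # finite memory window
            ages = np.arange(len(history))[::-1]      # age 0 = newest gradient
            weights = (ages + 1.0) ** (alpha - 1.0)   # power-law memory weights
            weights /= weights.sum()
            step = sum(w * g for w, g in zip(weights, history))
            theta = theta - lr * step
        return theta

    # Example: minimize f(x) = ||x||^2 / 2, whose exact gradient is x.
    print(tfgd(lambda x: x, theta0=[3.0, -2.0]))   # close to the origin

Here alpha controls how quickly the influence of old gradients decays, which is the kind of memory dependence the abstract refers to.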

Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains [PDF]

open access: yes, 2018
We consider the minimization of an objective function given access to unbiased estimates of its gradient through stochastic gradient descent (SGD) with constant step-size. While the detailed analysis was only performed for quadratic functions, we provide ...
Bach, Francis   +2 more
core   +3 more sources
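For readers who want the baseline object in front of them, here is a minimal Python sketch of constant step-size SGD on unbiased gradient estimates; the quadratic objective and Gaussian noise model are illustrative assumptions, not the setting analysed in the paper.

    # Minimal sketch of constant step-size SGD on unbiased stochastic gradients.
    import numpy as np

    rng = np.random.default_rng(0)

    def sgd_constant_step(stoch_grad, theta0, step=0.05, n_iters=5000):
        """SGD with a fixed step size; the iterates do not converge to a point
        but wander in a stationary distribution around the optimum, which is
        the Markov-chain viewpoint the title refers to."""
        theta = np.asarray(theta0, dtype=float)
        iterates = [theta.copy()]
        for _ in range(n_iters):
            theta = theta - step * stoch_grad(theta)
            iterates.append(theta.copy())
        return np.array(iterates)

    # Quadratic f(x) = ||x||^2 / 2 with an unbiased noisy gradient x + noise.
    noisy_grad = lambda x: x + rng.normal(scale=0.5, size=x.shape)
    trajectory = sgd_constant_step(noisy_grad, theta0=[2.0, 2.0])
    print(trajectory[-1000:].mean(axis=0))   # time average sits near the optimum (0, 0)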

Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees [PDF]

open access: yes, International Conference on Machine Learning, 2023
Gradient clipping is a popular modification to standard (stochastic) gradient descent, at every iteration limiting the gradient norm to a certain value $c>0$.
Anastasia Koloskova   +2 more
semanticscholar   +1 more source
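A minimal Python sketch of the clipping operation described in the abstract (the paper's convergence analysis is not reproduced); the surrounding SGD step and the numbers are illustrative assumptions.

    # Gradient-norm clipping: rescale the gradient whenever its norm exceeds c > 0.
    import numpy as np

    def clip_gradient(g, c):
        """Return g unchanged if ||g|| <= c, otherwise rescale it to have norm c."""
        norm = np.linalg.norm(g)
        return g if norm <= c else g * (c / norm)

    def clipped_sgd_step(theta, grad, lr=0.1, c=1.0):
        """One (stochastic) gradient descent step applied to the clipped gradient."""
        return theta - lr * clip_gradient(grad(theta), c)

    # Example: one step on f(x) = ||x||^2 / 2 from a point with a large gradient;
    # the clipped update has length at most lr * c.
    theta = np.array([10.0, 0.0])
    print(clipped_sgd_step(theta, lambda x: x))   # [9.9, 0.0]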

Non-local gradient dependent operators

open access: yes, Advances in Mathematics, 2012
Bjorland, C.   +2 more
openaire   +1 more source

Impurity transport in temperature gradient driven turbulence [PDF]

open access: yes, 2011
In the present paper the transport of impurities driven by trapped electron (TE) mode turbulence is studied. Non-linear (NL) gyrokinetic simulations using the code GENE are compared with results from quasilinear (QL) gyrokinetic simulations and a ...
A. Skyman   +3 more
core   +2 more sources

Robin problems for the p-Laplacian with gradient dependence

open access: yes, Discrete and Continuous Dynamical Systems. Series A, 2019
We consider a nonlinear elliptic equation with Robin boundary condition driven by the p-Laplacian and with a reaction term which depends also on the gradient.
G. Fragnelli   +2 more
semanticscholar   +1 more source
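The snippet describes the problem only in words; a prototypical formulation of a Robin problem for the p-Laplacian with a gradient-dependent (convection) reaction, written as an illustration rather than the paper's exact hypotheses, is

\[
\begin{cases}
-\Delta_{p} u = f(x, u, \nabla u) & \text{in } \Omega,\\
|\nabla u|^{p-2}\,\dfrac{\partial u}{\partial n} + \beta(x)\,|u|^{p-2}u = 0 & \text{on } \partial\Omega,
\end{cases}
\qquad
\Delta_{p} u := \operatorname{div}\bigl(|\nabla u|^{p-2}\nabla u\bigr),
\]

where $\Omega \subset \mathbb{R}^{N}$ is a bounded domain with smooth boundary, $1 < p < \infty$, and $\beta \ge 0$ is the Robin coefficient; the dependence of the reaction $f$ on $\nabla u$ is what destroys the variational structure of the problem.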
