Entropy Production and Irreversibility in the Linearized Stochastic Amari Neural Model. [PDF]
Lucente D, Gradenigo G, Salasnich L.
europepmc +1 more source
Navier-Stokes Equations and Forward-Backward Stochastic Differential Systems [PDF]
Freddy Delbaen +2 more
openalex
Harnessing Machine Learning to Understand and Design Disordered Solids
This review maps the dynamic evolution of machine learning in disordered solids, from structural representations to generative modeling. It explores how deep learning and model explainability transform property prediction into profound physical insight.
Muchen Wang, Yue Fan
wiley +1 more source
Random Neural Networks for Rough Volatility. [PDF]
Jacquier A, Žurič Ž.
europepmc +1 more source
Solution of forward-backward stochastic differential equations [PDF]
Ying Hu, Shige Peng
openalex +1 more source
This article outlines how artificial intelligence could reshape the design of next‐generation transistors as traditional scaling reaches its limits. It discusses emerging roles of machine learning across materials selection, device modeling, and fabrication processes, and highlights hierarchical reinforcement learning as a promising framework for ...
Shoubhanik Nath +4 more
wiley +1 more source
$L^{p}$-Variational Solutions of Multivalued Backward Stochastic Differential Equations [PDF]
Lucian Maticiuc, Aurel Răşcanu
openalex +1 more source
Variational Autoencoder+Deep Deterministic Policy Gradient addresses low‐light failures of infrared depth sensing for indoor robot navigation. Stage 1 pretrains an attention‐enhanced Variational Autoencoder (Convolutional Block Attention Module+Feature Pyramid Network) to map dark depth frames to a well‐lit reconstruction, yielding a 128‐D latent code ...
Uiseok Lee +7 more
wiley +1 more source
Simulation and optimal control of stochastic delay differential models for hepatitis C virus epidemics. [PDF]
Kumar N +3 more
europepmc +1 more source
Calibration‐Free Electromyography Motor Intent Decoding Using Large‐Scale Supervised Pretraining
Calibration‐free electromyography motor intent decoding is enabled through large‐scale supervised pretraining across heterogeneous datasets. A Spatially Aware Feature‐learning Transformer processes variable channel counts and electrode geometries, allowing transfer across users and recording setups. On a held‐out benchmark, fine‐tuned cross‐user models
Alexander E. Olsson +3 more
wiley +1 more source