Change in Cognition Following Ischaemic Stroke
Abstract. Objective: Cognitive decline can occur following ischaemic stroke, but how cognition changes over time, and which factors are associated with that change, are poorly understood. This study aimed to explore these issues over the 2 years following ischaemic stroke.
Wenci Yan +8 more
wiley +1 more source
Markov Processes on Curves [PDF]
Mazin G. Rahim, Lawrence K. Saul
openaire +2 more sources
On the stochastic mechanics of the free relativistic particle
Given a positive energy solution of the Klein-Gordon equation, the motion of the free, spinless, relativistic particle is described in a fixed Lorentz frame by a Markov diffusion process with non-constant diffusion coefficient.
Carlen E. +9 more
core +1 more source
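As background for this entry (standard definitions, not taken from the paper itself): the free Klein-Gordon equation for a spinless particle of mass m, and the generic form of a Markov diffusion process with a state-dependent diffusion coefficient, are

\[
\Bigl(\tfrac{1}{c^{2}}\,\partial_t^{2} - \nabla^{2} + \tfrac{m^{2}c^{2}}{\hbar^{2}}\Bigr)\psi(x,t) = 0,
\qquad
dX_t = b(X_t,t)\,dt + \sigma(X_t,t)\,dW_t .
\]

In the stochastic-mechanics picture the drift b and the (here non-constant) diffusion coefficient σ are built from the chosen positive-energy solution ψ; that construction is the subject of the paper and is not reproduced here.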
Molecular dynamics simulations are advancing the study of ribonucleic acid (RNA) and RNA‐conjugated molecules. These developments include improvements in force fields, long‐timescale dynamics, and coarse‐grained models, addressing limitations and refining methods.
Kanchan Yadav, Iksoo Jang, Jong Bum Lee
wiley +1 more source
Controllable Summarization with Constrained Markov Decision Process
We study controllable text summarization, which allows users to control a particular attribute (e.g., a length limit) of the generated summaries. In this work, we propose a novel training framework based on a Constrained Markov Decision Process (CMDP).
Hou Pong Chan, Lu Wang, Irwin King
doaj +1 more source
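For context, a constrained Markov decision process pairs the usual reward with one or more expected-cost constraints, and one common way to optimize against it is Lagrangian relaxation. The Python sketch below illustrates only that generic idea; the length-based cost, the function names, and the multiplier update are illustrative assumptions and not the training framework proposed by the authors.

def length_cost(summary_tokens: int, length_limit: int) -> float:
    # Constraint cost: how far the summary exceeds the user-specified length limit.
    return max(0.0, float(summary_tokens - length_limit))

def lagrangian_reward(task_reward: float, cost: float, lam: float) -> float:
    # Scalarized objective: task reward minus lambda-weighted constraint cost.
    return task_reward - lam * cost

def update_multiplier(lam: float, avg_cost: float, budget: float, lr: float = 0.01) -> float:
    # Dual ascent: increase lambda while the constraint is violated on average.
    return max(0.0, lam + lr * (avg_cost - budget))

The policy would be trained on the scalarized reward while lambda is adjusted from rollout statistics, which is the usual way a hard attribute constraint is turned into a tunable penalty.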
Abstract. The concept of a limiting conditional age distribution of a continuous-time Markov process, whose state space is the set of non-negative integers and for which {0} is absorbing, is defined as the weak limit, as t→∞, of the last time before t that an associated "return" Markov process exited from {0}, conditional on the state j of that process at t.
openaire +2 more sources
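Stated symbolically, one possible formalization of the definition above (an illustrative restatement, not quoted from the paper), with (Y_t) the associated return process, is

\[
A_t = t - \sup\{\, s \le t : Y_s = 0 \,\},
\qquad
F_j(x) = \lim_{t \to \infty} \Pr\bigl(A_t \le x \mid Y_t = j\bigr),
\]

so that F_j is the limiting conditional age distribution given that the return process occupies state j at time t.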
Markov processes follow from the principle of Maximum Caliber
Markov models are widely used to describe stochastic dynamical processes. Here, we show that Markov models are a natural consequence of the dynamical principle of Maximum Caliber.
Brockwell P. J. +6 more
core +1 more source
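For context, Maximum Caliber is the trajectory-level analogue of maximum entropy: one maximizes the entropy of the distribution over whole paths subject to constraints on path-ensemble averages. Schematically (a standard formulation; the constraints are left abstract and are not necessarily those used in the paper),

\[
\max_{P}\; -\sum_{\Gamma} P(\Gamma)\,\ln P(\Gamma)
\quad \text{subject to} \quad
\sum_{\Gamma} P(\Gamma) = 1,
\qquad
\sum_{\Gamma} P(\Gamma)\,A_k(\Gamma) = \langle A_k \rangle ,
\]

where Γ ranges over trajectories and the A_k are constrained path observables. Roughly, when the constraints involve only single-step (two-time) statistics, the maximizing path distribution factorizes into a product of transition probabilities, which is the sense in which a Markov model follows.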
Recycling of Thermoplastics with Machine Learning: A Review
This review shows how machine learning is revolutionizing mechanical, chemical, and biological pathways, overcoming traditional challenges and optimizing sorting, efficiency, and quality. It provides a detailed analysis of effective feature engineering strategies and establishes a forward‐looking research agenda for a truly circular thermoplastic ...
Rodrigo Q. Albuquerque +5 more
wiley +1 more source
Vacation Policy for k-out-of-n Redundant System with Reboot Delay [PDF]
Redundancy is a well-known concept for system resilience; k-out-of-n redundancy stipulates that a minimum number of functional components must be present for the system to function.
Vaishali Tyagi +3 more
doaj +1 more source
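As a baseline for this entry, the textbook reliability of a k-out-of-n system with independent, identical components, each functioning with probability p, is the probability that at least k of the n components work; it does not account for the vacation policy or reboot delay studied in the paper:

\[
R_{k/n}(p) = \sum_{i=k}^{n} \binom{n}{i}\, p^{\,i} (1-p)^{\,n-i} .
\]

For example, a 2-out-of-3 system with p = 0.9 gives R = 3(0.9)^2(0.1) + (0.9)^3 = 0.972.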
Exact finite approximations of average-cost countable Markov Decision Processes [PDF]
For a countable-state Markov decision process we introduce an embedding which produces a finite-state Markov decision process. The finite-state embedded process has the same optimal cost, and moreover, it has the same dynamics as the original process ...
Leizarowitz, Arie, Shwartz, Adam
core
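The embedding itself is the paper's contribution and is not reproduced here. As background only, the following Python sketch shows relative value iteration for a finite-state, average-cost MDP of the kind such an embedding produces; the function name, array layout, and tolerances are illustrative assumptions.

import numpy as np

def relative_value_iteration(P, c, ref_state=0, tol=1e-8, max_iter=10_000):
    # P: shape (A, S, S), P[a, s, s'] = transition probability under action a.
    # c: shape (A, S), c[a, s] = one-step cost of action a in state s.
    # Returns (estimated optimal average cost g, relative value function h).
    h = np.zeros(P.shape[1])
    g = 0.0
    for _ in range(max_iter):
        # Bellman backup: minimum over actions of cost plus expected relative value.
        q = c + np.einsum("ast,t->as", P, h)
        h_new = q.min(axis=0)
        g = h_new[ref_state]          # gauge term, converges to the average cost
        h_new = h_new - g             # keep the iterates bounded
        done = np.max(np.abs(h_new - h)) < tol
        h = h_new
        if done:
            break
    return g, h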

