Results 71 to 80 of about 58,822
Covariance Structure Modeling of Engineering Demand Parameters in Cloud‐Based Seismic Analysis
ABSTRACT Probabilistic seismic demand modeling aims to estimate structural demand as a function of ground motion intensity—a critical stage in seismic risk assessment. Although many models exist to describe the structural demand, few consider the covariance among engineering demand parameters, potentially overlooking a key factor in improving the ...
Archie Rudman +3 more
wiley +1 more source
Balancing Reconstruction Error and Kullback-Leibler Divergence in Variational Autoencoders
Likelihood-based generative frameworks are receiving increasing attention in the deep learning community, mostly on account of their strong probabilistic foundation.
Andrea Asperti, Matteo Trentin
doaj +1 more source
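The trade-off named in this entry is the standard ELBO decomposition: a reconstruction term plus a KL term pulling the approximate posterior toward the prior. As a minimal sketch (not the authors' method; the `beta` weighting knob is an assumption borrowed from the β-VAE literature), the KL of a diagonal Gaussian from the standard normal prior has a closed form:

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL divergence of a diagonal Gaussian N(mu, exp(log_var))
    from the standard normal prior N(0, I), summed over latent dimensions."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def weighted_vae_loss(recon_error, mu, log_var, beta=1.0):
    """Total loss = reconstruction error + beta * KL. beta is the
    hypothetical balancing weight discussed in the lead-in, not a
    parameter taken from the paper above."""
    return recon_error + beta * kl_to_standard_normal(mu, log_var)

mu = np.array([0.5, -0.3])
log_var = np.array([0.0, 0.2])
loss = weighted_vae_loss(1.0, mu, log_var, beta=4.0)
```

The KL term vanishes exactly when the posterior equals the prior (`mu = 0`, `log_var = 0`), so raising `beta` pushes latents toward the prior at the cost of reconstruction quality.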
A Deep Learning Framework for Forecasting Medium‐Term Covariance in Multiasset Portfolios
ABSTRACT Forecasting the covariance matrix of asset returns is central to portfolio construction, risk management, and asset pricing. However, most existing models struggle at medium‐term horizons (several weeks to months), where shifting market regimes and slower dynamics prevail.
Pedro Reis, Ana Paula Serra, João Gama
wiley +1 more source
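For context on what such a forecaster competes against: a classical baseline for time-varying covariance is the RiskMetrics-style exponentially weighted estimate. The sketch below is that baseline only (the decay factor 0.94 and 20-row warm start are illustrative assumptions), not the deep learning framework described above:

```python
import numpy as np

def ewma_covariance(returns, lam=0.94):
    """RiskMetrics-style exponentially weighted covariance estimate.
    returns: (T, N) array of asset returns; lam: decay factor.
    A classical baseline, not the paper's deep learning model."""
    cov = np.cov(returns[:20].T)          # warm-start from the first 20 rows
    for t in range(20, returns.shape[0]):
        r = returns[t][:, None]           # column vector of today's returns
        cov = lam * cov + (1 - lam) * (r @ r.T)  # exponential update
    return cov

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=(100, 3))  # synthetic daily returns
cov = ewma_covariance(returns)
```

Because each update blends a positive semidefinite matrix with a rank-one outer product, the estimate stays symmetric and positive semidefinite by construction.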
The McMillan Theorem for Colored Branching Processes and Dimensions of Random Fractals
For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines.
Victor Bakhtin
doaj +1 more source
ABSTRACT This study examines the environmental, social and governance (ESG) scoring methodologies used by Bloomberg and S&P Global through the lens of Data Envelopment Analysis (DEA). It addresses a notable gap in the literature by identifying the underlying factors that shape ESG scores and providing practical insights for companies seeking to ...
Philipe Balan +4 more
wiley +1 more source
The Kullback-Leibler Divergence as a Lyapunov Function for Incentive Based Game Dynamics [PDF]
It has been shown that the Kullback-Leibler divergence is a Lyapunov function for the replicator equations at evolutionarily stable states (ESS). In this paper we extend the result to a more general class of game dynamics.
Fryer, Dashiell E. A.
core
Fendioxypyracil, a new and systemic PPO‐inhibiting herbicide for X‐spectrum weed control
This graphical abstract presents the discovery and synthesis of PPO herbicide structures with a central pyridine core, showing molecular conformations, dose–response inhibition curves for PPO1 and PPO2, and comparative weed and grass control efficacy of fendioxypyracil versus other herbicides in greenhouse and field trials.
Tobias Seiser +8 more
wiley +1 more source
Loss Behavior in Supervised Learning With Entangled States
Entanglement in training samples supports quantum supervised learning algorithms in obtaining solutions with low generalization error. Using analytical as well as numerical methods, this work shows that the positive effect of entanglement on the model after training has negative consequences for the trainability of the model itself, while showing the ...
Alexander Mandl +4 more
wiley +1 more source
Efficient ECG classification based on the probabilistic Kullback-Leibler divergence
Diagnostic systems of cardiac arrhythmias face early and accurate detection challenges due to the overlap of electrocardiogram (ECG) patterns. Additionally, these systems must manage a huge number of features.
Dhiah Al-Shammary +5 more
doaj +1 more source
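One common way a KL-based classifier of the kind this entry describes can work is nearest-class assignment: represent each beat as a normalized feature histogram and pick the class whose reference histogram minimizes the divergence. The sketch below is a hypothetical scheme of that shape (the class names, histograms, and `eps` smoothing are all illustrative assumptions, not the paper's actual pipeline):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """KL divergence between two discrete distributions (e.g., normalized
    histograms of ECG beat features); eps smoothing avoids log(0)."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def classify_by_kl(sample_hist, class_hists):
    """Assign the sample to the class whose reference histogram is closest
    in KL divergence (hypothetical nearest-class scheme)."""
    return min(class_hists, key=lambda c: kl_divergence(sample_hist, class_hists[c]))

class_hists = {"normal": [0.7, 0.2, 0.1], "arrhythmia": [0.2, 0.3, 0.5]}
label = classify_by_kl([0.65, 0.25, 0.10], class_hists)  # -> "normal"
```

Note that KL is asymmetric; systems of this kind sometimes symmetrize it (e.g., averaging both directions) when overlapping ECG patterns make the direction of comparison matter.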
ABSTRACT Safe and reliable mobility over different kinds of ground is important for planetary rovers on space missions. Since terrain changes might affect the mobility of the rover, energy consumption, and safety, detecting the type of ground in real‐time is vital.
Md Masrul Khan +7 more
wiley +1 more source

