Results 41 to 50 of about 10,137
Bayesian models allow us to investigate children’s belief revision alongside physiological states, such as “surprise”. Recent work finds that pupil dilation (or the “pupillary surprise response”) following expectancy violations is predictive of belief ...
Joseph Colantonio +4 more
doaj +1 more source
S3RL: Enhancing Spatial Single‐Cell Transcriptomics With Separable Representation Learning
Separable Spatial Representation Learning (S3RL) is introduced to enhance the reconstruction of spatial transcriptomic landscapes by disentangling spatial structure and gene expression semantics. By integrating multimodal inputs with graph‐based representation learning and hyperspherical prototype modeling, S3RL enables high‐fidelity spatial domain ...
Laiyi Fu +6 more
wiley +1 more source
The Kullback–Leibler Divergence Between Lattice Gaussian Distributions [PDF]
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
openaire +2 more sources
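The lattice-Gaussian result above builds on the standard closed-form KL divergence between Gaussians. As a minimal illustration (not the paper's lattice setting), the univariate case KL(N(μ₁, σ₁²) ‖ N(μ₂, σ₂²)) = ln(σ₂/σ₁) + (σ₁² + (μ₁ − μ₂)²)/(2σ₂²) − ½ can be computed directly:

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)) in nats."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # identical distributions → 0.0
print(kl_gaussian(1.0, 1.0, 0.0, 1.0))  # unit mean shift → 0.5
```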
Learnable Diffusion Framework for Mouse V1 Neural Decoding
We introduce Sensorium‐Viz, a diffusion‐based framework for reconstructing high‐fidelity visual stimuli from mouse primary visual cortex activity. By integrating a novel spatial embedding module with a Diffusion Transformer (DiT) and a synthetic‐response augmentation strategy, our model outperforms state‐of‐the‐art fMRI‐based baselines, enabling robust ...
Kaiwen Deng +2 more
wiley +1 more source
Utilizing Amari-Alpha Divergence to Stabilize the Training of Generative Adversarial Networks
Generative Adversarial Nets (GANs) are among the most popular architectures for image generation and have achieved significant progress in generating high-resolution, diverse image samples. Standard GANs are designed to minimize the Kullback–Leibler ...
Likun Cai +4 more
doaj +1 more source
Spintronic Bayesian Hardware Driven by Stochastic Magnetic Domain Wall Dynamics
Magnetic Probabilistic Computing (MPC) utilizes intrinsic stochastic dynamics in domain walls to establish a hardware foundation for uncertainty‐aware artificial intelligence. Thermally driven domain‐wall fluctuations, voltage‐controlled magnetic anisotropy, and TMR readout enable fully electrical, tunable probabilistic inference.
Tianyi Wang +11 more
wiley +1 more source
Distributed Vector Quantization Based on Kullback-Leibler Divergence
The goal of vector quantization is to use a few reproduction vectors to represent original vectors/data while maintaining the necessary fidelity of the data.
Pengcheng Shen +2 more
doaj +1 more source
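The goal stated in the snippet above can be sketched in a few lines: each input vector is mapped to its nearest reproduction (codebook) vector, trading a small loss of fidelity for a compact representation. The toy data and two-entry codebook below are hypothetical, chosen only to show the assignment step, not the paper's distributed KL-based method:

```python
def quantize(vectors, codebook):
    """Map each input vector to the index of its nearest codebook entry
    (squared Euclidean distance)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(codebook)), key=lambda i: sqdist(v, codebook[i]))
            for v in vectors]

codebook = [(0.0, 0.0), (1.0, 1.0)]       # two reproduction vectors
data = [(0.1, -0.2), (0.9, 1.1), (1.2, 0.8)]
print(quantize(data, codebook))           # → [0, 1, 1]
```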
This review comprehensively summarizes the atomic defects in TMDs for their applications in sustainable energy storage devices, along with the latest progress in ML methodologies for high‐throughput TEM data analysis, offering insights on how ML‐empowered microscopy facilitates bridging structure–property correlation and inspires knowledge for precise ...
Zheng Luo +6 more
wiley +1 more source
On the symmetrized S-divergence [PDF]
In this paper we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback-Leibler divergence and the Hellinger and χ² distances as particular cases.
Simić Slavko
doaj +1 more source
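The three particular cases named in the snippet above can be computed for two discrete distributions p and q. This is a hedged sketch of the individual divergences only; the paper's exact parametrization of the type-s family is not reproduced here:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence for discrete distributions, in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def hellinger(p, q):
    """Hellinger distance: sqrt(1/2 * sum (sqrt(p_i) - sqrt(q_i))^2)."""
    return math.sqrt(0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                               for pi, qi in zip(p, q)))

def chi2(p, q):
    """Pearson chi-squared divergence: sum (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl(p, q), hellinger(p, q), chi2(p, q))
```

All three vanish when p = q and grow as the distributions separate, which is the behavior the symmetrized family interpolates across.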
A Scalable Framework for Comprehensive Typing of Polymorphic Immune Genes from Long‐Read Data
SpecImmune introduces a unified computational framework optimized for long‐read sequencing to resolve over 400 highly polymorphic immune genes. This scalable approach achieves high‐resolution typing, enabling the discovery of cross‐family co‐evolutionary networks and population‐specific diversity.
Shuai Wang +5 more
wiley +1 more source

