Results 71 to 80 of about 55,876
This protocol paper outlines methods to establish the success of a time‐resolved serial crystallographic experiment, by means of statistical analysis of timepoint data in reciprocal space and models in real space. We show how to amplify the signal from excited states to visualise structural changes in successful experiments.
Jake Hill +4 more
wiley +1 more source
Hash Tables as Engines of Randomness at the Limits of Computation: A Unified Review of Algorithms
Hash tables embody a paradox of deterministic structure that emerges from controlled randomness. They have evolved from simple associative arrays into algorithmic engines that operate near the physical and probabilistic limits of computation. This review ...
Paul A. Gagniuc, Mihai Togan
doaj +1 more source
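The "controlled randomness" behind deterministic lookup that this abstract alludes to can be sketched with a minimal open-addressing table: a hash function picks a pseudo-random start slot, after which probing is fully deterministic. The class below is a hypothetical illustration, not code from the reviewed paper.

```python
# Minimal open-addressing hash table with linear probing.
# Illustrative sketch only; not the reviewed paper's code.
class ProbeTable:
    def __init__(self, capacity=8):
        self.slots = [None] * capacity  # each slot holds (key, value) or None

    def _probe(self, key):
        # The hash "randomizes" the start slot; the walk itself is deterministic.
        start = hash(key) % len(self.slots)
        for i in range(len(self.slots)):
            yield (start + i) % len(self.slots)

    def put(self, key, value):
        for idx in self._probe(key):
            if self.slots[idx] is None or self.slots[idx][0] == key:
                self.slots[idx] = (key, value)
                return
        raise RuntimeError("table full")

    def get(self, key):
        for idx in self._probe(key):
            if self.slots[idx] is None:
                raise KeyError(key)
            if self.slots[idx][0] == key:
                return self.slots[idx][1]
        raise KeyError(key)

t = ProbeTable()
t.put("a", 1)
t.put("b", 2)
print(t.get("a"))  # -> 1
```

Note that in CPython the string hash is itself salted per process (PYTHONHASHSEED), so even the "random" starting slot varies between runs while lookups within a run stay deterministic.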
Ultrafast photonic reinforcement learning based on laser chaos
Reinforcement learning involves decision making in dynamic and uncertain environments and constitutes an important element of artificial intelligence (AI).
Makoto Naruse +3 more
doaj +1 more source
How powerful are integer-valued martingales?
In the theory of algorithmic randomness, one of the central notions is that of computable randomness. An infinite binary sequence X is computably random if no recursive martingale (strategy) can win an infinite amount of money by betting on the values of ...
Bienvenu, Laurent +2 more
core +1 more source
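The betting-strategy (martingale) formalism in the abstract above can be made concrete with a toy fixed-fraction bettor whose capital is updated fairly: a correct guess adds the stake, a wrong one subtracts it. This is an assumed illustration of the general formalism, not the paper's construction.

```python
# Toy martingale: always bets a fixed fraction of current capital that
# the next bit is 1. Fair payoff: wins add the stake, losses subtract it.
# Illustrative sketch of the betting-strategy formalism, not the paper's code.
def run_martingale(bits, fraction=0.5, capital=1.0):
    history = [capital]
    for b in bits:
        stake = fraction * capital
        capital += stake if b == 1 else -stake
        history.append(capital)
    return history

# On the all-ones sequence this strategy's capital grows without bound
# (factor 1.5 per bit), so that sequence is not computably random.
print(run_martingale([1, 1, 1, 1])[-1])  # 1 * 1.5**4 = 5.0625
```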
Activation of the mitochondrial protein OXR1 increases pSyn129 αSynuclein aggregation by lowering ATP levels and altering mitochondrial membrane potential, particularly in response to MSA‐derived fibrils. In contrast, ablation of the ER protein EMC4 enhances autophagic flux and lysosomal clearance, broadly reducing α‐synuclein aggregates.
Sandesh Neupane +11 more
wiley +1 more source
Randomness and differentiability in higher dimensions [PDF]
We present two theorems concerned with algorithmic randomness and differentiability of functions of several variables. Firstly, we prove an effective form of Rademacher's Theorem: we show that computable randomness implies differentiability of ...
Galicki, Alex, Turetsky, Daniel
core
The Tsallis entropy and the Shannon entropy of a universal probability
We study the properties of Tsallis entropy and Shannon entropy from the point of view of algorithmic randomness. In algorithmic information theory, there are two equivalent ways to define the program-size complexity K(s) of a given finite binary string s.
Tadaki, Kohtaro
core +1 more source
Evolutionary analysis across 32 placental mammals identified positive selection at residues H148 and W149 in the immune receptor FcγR1. Ancestral reconstruction combined with molecular dynamics simulations reveals how these mutations may influence receptor structure and dynamics, providing insight into the evolution of antibody recognition and immune ...
David A. Young +7 more
wiley +1 more source
On Resource-bounded versions of the van Lambalgen theorem
The van Lambalgen theorem is a surprising result in algorithmic information theory concerning the symmetry of relative randomness. It establishes that for any pair of infinite sequences $A$ and $B$, $B$ is Martin-Löf random and $A$ is Martin-Löf ...
A Nies +8 more
core +1 more source
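For reference, the symmetry of relative randomness that the abstract above alludes to is standardly stated as follows (this is the textbook formulation of van Lambalgen's theorem, not a quotation from the paper):

```latex
A \oplus B \text{ is Martin-Löf random}
\iff
A \text{ is Martin-Löf random and } B \text{ is Martin-Löf random relative to } A,
```

where $A \oplus B$ denotes the sequence obtained by interleaving the bits of $A$ and $B$. The paper then asks to what extent this equivalence survives when the randomness notions are resource-bounded.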
RoundMi: A quantitative method to analyze mitochondrial morphology in mitotic cells
RoundMi is a workflow for rapid analysis of mitochondrial morphology in mitotic cells. By combining adaptive preprocessing with automated segmentation and quantification, it enables accurate measurements from single focal plane images, reducing acquisition time and computational demands while remaining compatible with high‐throughput fixed and live ...
Elmira Parvindokht Bararpour +2 more
wiley +1 more source

