Results 11 to 20 of about 3,571
The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions, yielding a Gibbs measure under some constraint that gives the probability that a system will be in a certain state compared to the rest of the ...
Hector Zenil +2 more
doaj +1 more source
We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM) that approximates local estimations of algorithmic complexity based on Solomonoff–Levin’s theory of algorithmic ...
Hector Zenil +5 more
doaj +1 more source
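The BDM entry above describes a concrete scheme: partition a string into blocks, look up a precomputed CTM complexity for each distinct block, and add a logarithmic term for block multiplicity. A minimal sketch, assuming a toy stand-in for the CTM table (real BDM uses precomputed algorithmic-probability estimates for short strings):

```python
from math import log2
from collections import Counter

# Toy stand-in for a CTM lookup: counts distinct symbols in a block.
# This is an assumption for illustration; actual CTM values come from
# precomputed tables of algorithmic probability.
def toy_ctm(block):
    return float(len(set(block)))

def bdm(s, block_size=4, ctm=toy_ctm):
    # Partition s into non-overlapping blocks.
    blocks = [s[i:i + block_size] for i in range(0, len(s), block_size)]
    counts = Counter(blocks)
    # BDM = sum over distinct blocks of CTM(block) + log2(multiplicity)
    return sum(ctm(b) + log2(n) for b, n in counts.items())

print(bdm("00000000"))  # repetitive string: low estimate
print(bdm("01101001"))  # more varied string: higher estimate
```

The multiplicity term charges repeated blocks only logarithmically, which is what lets BDM extend CTM's local estimates to longer strings.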
Generic Hardware Private Circuits
With an increasing number of mobile devices and their high accessibility, protecting the implementation of cryptographic functions in the presence of physical adversaries has become more relevant than ever.
David Knichel +2 more
doaj +1 more source
Randomized priority algorithms
Angelopoulos, Spyros, Borodin, Allan
openaire +4 more sources
Probabilistic Algorithmic Randomness
We introduce martingales defined by probabilistic strategies, in which randomness is used to decide whether to bet. We show that different criteria for the success of computable probabilistic strategies can be used to characterize ML-randomness, computable randomness, and partial computable randomness.
Buss, Sam, Minnes, Mia
openaire +2 more sources
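The abstract above concerns martingales where a coin flip decides whether to bet at each step. A minimal sketch of such a capital process, assuming a fixed betting probability and a fixed stake fraction (both illustrative parameters, not from the paper):

```python
import random

def run_probabilistic_martingale(bits, p_bet=0.5, stake=0.5, rng=None):
    """Capital process of a strategy that, at each bit, bets on '1'
    with probability p_bet (wagering a fraction of capital), and
    otherwise passes. The bet itself is fair: win or lose the wager."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    capital = 1.0
    for bit in bits:
        if rng.random() < p_bet:      # randomness decides whether to bet
            wager = stake * capital
            capital += wager if bit == 1 else -wager
    return capital
```

Success criteria in the paper (e.g., expected capital growing unboundedly) are evaluated over the strategy's internal randomness; this sketch only simulates one sample path.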
LT^2C^2: A language of thought with Turing-computable Kolmogorov complexity [PDF]
In this paper, we present a theoretical effort to connect the theory of program size to psychology by implementing a concrete language of thought with Turing-computable Kolmogorov complexity (LT^2C^2) satisfying the following requirements: 1) to be ...
Santiago Figueira +2 more
doaj +3 more sources
We consider algorithmic randomness in the Cantor space C of the infinite binary sequences. By an algorithmic randomness concept one specifies a set of elements of C, each of which is assigned the property of being random. Miscellaneous notions from computability theory are used in the definitions of randomness concepts that are essentially rooted in ...
Jan Reimann, Rodney Downey
openaire +3 more sources
This study explores the utilization of blockchain data as a set of pseudorandom numbers in the context of microtonal algorithmic composition. Conventional methods of generating random numbers often lack the desired levels of unpredictability and ...
Krzysztof Kicior
doaj +1 more source
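The entry above proposes mining blockchain data for pseudorandom material in microtonal composition. A minimal sketch of the general idea, assuming a hashed block identifier as the entropy source and a 31-tone equal temperament mapping (both choices are illustrative assumptions, not the paper's method):

```python
import hashlib

def microtonal_pitches(block_header: str, n_notes=8, divisions=31,
                       base_hz=261.63):
    # Hash the block header to a deterministic byte string.
    digest = hashlib.sha256(block_header.encode()).digest()
    # Map each byte to a scale step in n-tone equal temperament.
    steps = [b % divisions for b in digest[:n_notes]]
    # Convert steps to frequencies within one octave above base_hz.
    return [base_hz * 2 ** (s / divisions) for s in steps]
```

Because the hash is deterministic, the same block always yields the same phrase, while distinct blocks give statistically scattered pitch sequences.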
A Review of Graph and Network Complexity from an Algorithmic Information Perspective
Information-theoretic-based measures have been useful in quantifying network complexity. Here we briefly survey and contrast (algorithmic) information-theoretic methods which have been used to characterize graphs and networks. We illustrate the strengths ...
Hector Zenil +2 more
doaj +1 more source
Current approaches in science, including most machine and deep learning methods, rely heavily at their core on traditional statistics and information theory, but these theories are known to fail to capture certain fundamental properties of data and the ...
Hector Zenil
doaj +1 more source