Definition
In algorithmic information theory, algorithmic probability, also known as Solomonoff probability, is a mathematical method of assigning a prior probability to a given observation. Informally, an observation receives probability mass from every program that generates it on a universal prefix Turing machine, with a program of length L contributing weight 2^(-L), so shorter (simpler) explanations dominate, in keeping with Occam's razor. It was invented by Ray Solomonoff in the 1960s and is used in inductive inference theory and the analysis of algorithms. In his general theory of inductive inference, Solomonoff combines this prior with Bayes' rule to obtain probabilities for an algorithm's future outputs.
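The weighting and prediction scheme above can be sketched in a few lines. This is a minimal toy, not a real universal machine: the "machine" below (an assumption for illustration) simply repeats a program bitstring cyclically, and programs up to a small length bound are enumerated to approximate the prior. The names `toy_machine`, `algorithmic_probability`, and `predict_next` are hypothetical.

```python
from itertools import product

def toy_machine(program: str, n: int) -> str:
    # Toy "machine" (an assumption for illustration, not a universal
    # Turing machine): a program bitstring is repeated cyclically
    # and truncated to n output bits.
    return (program * n)[:n]

def algorithmic_probability(x: str, max_len: int = 12) -> float:
    # Approximate the prior M(x): sum 2^(-L) over all programs p of
    # length L <= max_len whose output equals x. Shorter programs
    # contribute exponentially more weight.
    total = 0.0
    for L in range(1, max_len + 1):
        for bits in product("01", repeat=L):
            if toy_machine("".join(bits), len(x)) == x:
                total += 2.0 ** -L
    return total

def predict_next(x: str) -> float:
    # Solomonoff-style prediction via Bayes' rule:
    # P(next bit is 1 | x) ~= M(x + "1") / M(x).
    m_x = algorithmic_probability(x)
    return algorithmic_probability(x + "1") / m_x if m_x else 0.5
```

Even with this crude machine, a regular string such as "0101" gets a higher prior than an irregular one of the same length (it has a short generating program, "01"), and the predictor leans toward continuing the observed pattern.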
Related concepts
Algorithmic information theory, Andrey Kolmogorov, Bayes' rule, Bayesian inference, Computable function, Decision theory, Epicurus, If and only if, Inductive inference, Inductive probability, Information-based complexity, Kolmogorov complexity, Kraft–McMillan inequality, Leonid Levin, Marcus Hutter, Occam's razor, Prefix code, Probability, Probability distribution, Ray Solomonoff, Scholarpedia, Search algorithm, Solomonoff's theory of inductive inference, Turing machine, Universal Turing machine, Universal prior, Universal probability (disambiguation), Universality (philosophy), Without loss of generality