Definition
Bayes' theorem, named after Thomas Bayes, gives a mathematical rule for inverting conditional probabilities, allowing the probability of a cause to be found given its effect. For example, with Bayes' theorem, the probability that a patient has a disease given that they tested positive for that disease can be found using the probability that the test yields a positive result when the disease is present. The theorem was developed in the 18th century by Bayes and independently by Pierre-Simon Laplace.
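The disease-testing example above can be worked through numerically. The sketch below uses hypothetical illustrative numbers (a 1% prevalence, 90% sensitivity, and a 5% false-positive rate are assumptions, not values from the text) to show how Bayes' theorem inverts P(positive | disease) into P(disease | positive):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: P(disease | positive test).

    prior               -- P(disease), the base rate in the population
    sensitivity         -- P(positive | disease), true positive rate
    false_positive_rate -- P(positive | no disease)
    """
    # Total probability of testing positive (law of total probability):
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
    return sensitivity * prior / p_positive

# Hypothetical numbers: 1% prevalence, 90% sensitivity, 5% false-positive rate.
p = posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(round(p, 3))  # ≈ 0.154
```

Even with a fairly accurate test, the posterior probability of disease is only about 15%, because the low prior (1% prevalence) means most positive results come from the much larger healthy group.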