
Kullback–Leibler divergence

Definition

In mathematical statistics, the Kullback–Leibler (KL) divergence, denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how much an approximating probability distribution Q differs from a true probability distribution P. For discrete distributions it is defined as

D_KL(P ∥ Q) = Σ_x P(x) log( P(x) / Q(x) )

where the sum runs over the sample space; for continuous distributions the sum is replaced by an integral over the probability densities.
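The definition above can be sketched directly in code. This is a minimal illustration for discrete distributions given as probability lists (the function name and example distributions are chosen here for illustration); it uses the natural logarithm, so the result is in nats, and follows the convention that terms with P(x) = 0 contribute zero.

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as
    probability lists, in nats (natural log).
    Terms with p_i == 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]  # true distribution P
q = [0.9, 0.1]  # approximating distribution Q

print(kl_divergence(p, q))  # ≈ 0.5108 nats
print(kl_divergence(q, p))  # a different value: KL divergence is asymmetric
```

Note that D_KL(P ∥ Q) ≠ D_KL(Q ∥ P) in general, which is why the KL divergence is a statistical divergence rather than a metric: it is non-negative and zero only when P = Q, but it is not symmetric and does not satisfy the triangle inequality.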

Related concepts

Absolute continuity, Absolute value, Absolutely continuous measure, Additive map, Advances in Mathematics, Affine connection, Akaike information criterion, Almost everywhere, Almost surely, Alphabet (formal languages), Annals of Mathematical Statistics, Approximation, Asymmetry, Base of a logarithm, Bayes' theorem, Bayesian experimental design, Bayesian inference, Bayesian information criterion, Bayesian statistics, Bhattacharyya distance, Binomial distribution, Bioinformatics, Bit, Bregman divergence, Bretagnolle–Huber inequality, Chain rule, Change of variables, Chapman & Hall, Chi-squared test, Coding theory, Conditional entropy, Conference on Neural Information Processing Systems, Continuous random variable, Convex function, Counting measure, Covariance matrix, Cross-entropy, Cross entropy, D-optimal design, Data compression, Data differencing, Density matrix, Deviance information criterion, Differential entropy, Dimensional analysis, Discrete probability distribution, Discrete random variable, Discrete uniform distribution, Divergence (statistics), Dover Publications, Dually flat manifold, E. T. Jaynes, E (mathematical constant), Earth mover's distance, Einstein summation convention, Entropic value at risk, Entropy, Entropy (information theory), Entropy coding, Entropy encoding, Entropy in thermodynamics and information theory, Entropy maximization, Entropy power inequality, Event (probability theory), Evidence lower bound, Exergy, Expectation (statistics), Expectation–maximization algorithm, Expected value, Exponential families, Exponential family, F-divergence, Feature selection, Fisher information metric, Fluid mechanics, Gaussian measure, Gibbs' inequality, Gibbs free energy, Gibbs inequality, Haar measure, Harold Jeffreys, Hellinger distance, Hellinger metric, Helmholtz free energy, Hessian matrix, Hilbert space, Huffman coding, I. J. Good, If and only if, Independent random variables, Inference, Infinitesimal generator (stochastic processes), Information content, Information entropy, Information gain (decision tree), Information gain in decision trees, Information gain ratio, Information geometry, Information projection, Information theory and measure theory, International Journal of Computer Vision, Jacobian, Jensen–Shannon divergence, John Wiley & Sons, Joint probability distribution, Josiah Willard Gibbs, Kolmogorov–Smirnov test, Kraft–McMillan inequality, Kronecker delta, Laplace, Large deviations, Lebesgue measure, Leonidas J. Guibas, Lie group, Limiting density of discrete points, Logarithm, Logit, Loss function, Lossless compression, MAUVE (metric), Machine learning, Marginal probability distribution, Matching distance, Mathematical proof, Mathematical statistics, Maximum likelihood, Maximum likelihood estimation, Maximum spacing estimation, Mean squared deviation, Measurable space, Measure (mathematics), Metric (mathematics), Metric tensor, Model selection, Multivariate normal distribution, Mutual information, Nat (unit), Natural logarithm, Neuroscience, Neyman–Pearson lemma, Non-negative, Normal distribution, Numerical Recipes, Parameter space, Partition function (mathematics), Partition of a set, Patch (computing), Pinsker's inequality, Poisson distribution, Positive-definite matrix, Posterior distribution, Prefix-free code, Principle of Insufficient Reason, Principle of Maximum Entropy, Prior distribution, Prior probability distribution, Probability, Probability density function, Probability distribution, Probability measure, Probability space, Probability spaces, Pythagorean theorem, Quantum entanglement, Quantum information science, Quantum relative entropy, Radon–Nikodym derivative, Random variable, Random variate, Rate function, Real number, Richard Leibler, Riemann hypothesis, Riemannian metric, Rényi divergence, Sample space, Self-information, Sergio Verdú, Shannon's entropy, Shannon entropy, Shun'ichi Amari, Solomon Kullback, Sphere, Squared Euclidean distance, Standard temperature and pressure, Statistical Science, Statistical classification, Statistical distance, Statistical divergence, Statistical manifold, Statistical model, Sufficient statistic, Surprisal, Taylor series, The American Statistician, Time series, Topology, Total-variation distance of probability measures, Total variation, Triangle inequality, Units of information, Utility function, Variation of information, Variational inference, Vincenzo Bonnici, Work (thermodynamics)
