Definition
Differential privacy (DP) is a mathematically rigorous framework for releasing statistical information about datasets while protecting the privacy of individual data subjects. It enables a data holder to share aggregate patterns in the data while limiting what is leaked about any specific individual. This is achieved by injecting carefully calibrated random noise into statistical computations, so that the utility of the released statistic is largely preserved while provably bounding what can be inferred about any single individual in the dataset.
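As a concrete sketch of "carefully calibrated noise", the classic Laplace mechanism adds noise drawn from a Laplace distribution whose scale depends on the query's sensitivity and the privacy parameter epsilon. The example below is illustrative only: the dataset, the `private_count` helper, and its parameters are hypothetical, and a counting query is used because its sensitivity is exactly 1 (adding or removing one person changes the count by at most 1).

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon suffices for epsilon-DP.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical dataset: ages of survey respondents.
ages = [23, 35, 41, 29, 52, 47, 33, 61]
# Noisy answer to "how many respondents are 40 or older?" (true answer: 4).
noisy = private_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Smaller epsilon means larger noise and stronger privacy; larger epsilon means a more accurate answer but weaker protection.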
Related concepts
Adam D. Smith, Additive noise mechanisms, Apple Inc., Boolean algebra, Cell suppression, Coin flipping, Cryptanalysis, Cynthia Dwork, Data analysis, Data sharing, Denormal numbers, Differential privacy composition theorems, Differentially private analysis of graphs, Dorothy Denning, Double-precision floating-point format, Exponential mechanism (differential privacy), Falsifiability, Floating-point arithmetic, Frank McSherry, Fuzzing, Gaussian distribution, GitHub, Google, Gödel Prize, Hamming distance, iOS 10, Image (mathematics), Implementations of differentially private analyses, Intelligent personal assistant, Irit Dinur, Journal of the American Statistical Association, K-anonymity, Kobbi Nissim, L2 norm, Laplace distribution, Leaky abstraction, Local differential privacy, Mayer D. Schwartz, Meta.com, Metric space, Micah Altman, Microdata (statistics), NP-hardness, Open access, Peter J. Denning, Privacy, Probability distribution, Protected health information, Quasi-identifier, Randomized algorithm, Randomized response, Randomness, Real number, Reconstruction attack, Semantic Scholar, Social science, Software testing, Standard deviation, Statistical, Statistical database, Taxicab geometry, Taylor & Francis, Timing attack, Tore Dalenius, United States Census Bureau