What is differential privacy?
Differential privacy is a framework for measuring the privacy guarantees provided by an algorithm. Through the lens of differential privacy, we can design machine learning algorithms that responsibly train models on private data. Learning with differential privacy provides provable guarantees of privacy, mitigating the risk of exposing sensitive training data through the trained model or its outputs. Intuitively, a model trained with differential privacy should not be unduly affected by any single training example, or small set of training examples, in its data set.
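To make the guarantee concrete, here is a minimal sketch of the Laplace mechanism, one of the simplest differentially private algorithms. It privately releases a count query by adding noise calibrated to the query's sensitivity (how much one person's data can change the answer) divided by the privacy parameter epsilon. The dataset, epsilon value, and function names below are illustrative, not part of any particular library.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Illustrative example: privately count how many people in a toy
# dataset are over 40. Adding or removing one person changes this
# count by at most 1, so the query's sensitivity is 1.
ages = [23, 45, 67, 34, 52, 41, 29]
true_count = sum(1 for a in ages if a > 40)
noisy_count = laplace_mechanism(true_count, sensitivity=1, epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; any single individual's presence or absence changes the output distribution by at most a factor of e^epsilon, which is exactly the "not unduly affected by any single training example" intuition above. Training neural networks with differential privacy uses the same idea, typically by clipping per-example gradients and adding Gaussian noise during optimization (DP-SGD).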