What is differential privacy?
Differential privacy is a framework for measuring the privacy guarantees provided by an algorithm. Through the lens of differential privacy, we can design machine learning algorithms that responsibly train models on private data. Learning with differential privacy provides provable privacy guarantees, mitigating the risk that a synthetic data model or its output exposes sensitive training data. Intuitively, a model trained with differential privacy should not be noticeably affected by any single training example, or by any small set of training examples, in its data set.
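Formally, a randomized algorithm M is (ε, δ)-differentially private if, for every pair of data sets D and D′ that differ in a single record and for every set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S] + δ. The original post contains no code, so the sketch below only illustrates the idea with the classic Laplace mechanism for a counting query; the function names and parameters are our own, and this is not Gretel's implementation.

```python
import numpy as np

def dp_count(records, predicate, epsilon=1.0):
    """Epsilon-differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1: adding or removing one record
    changes the true answer by at most 1. Laplace noise with scale
    1/epsilon therefore yields epsilon-differential privacy.
    """
    true_count = sum(1 for record in records if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: privately estimate how many records exceed a threshold.
data = [12, 45, 7, 30, 22]
print(dp_count(data, lambda r: r > 20, epsilon=0.5))
```

Because any one individual changes the count by at most one, the added noise makes their presence or absence statistically hard to infer from the released answer. Differentially private model training applies the same principle to gradient updates, typically by clipping per-example gradients and adding calibrated noise.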
Get Started
Ready to try Gretel?
Start making your job easier instantly.
Get started in just a few clicks with a free account.
- Join the Synthetic Data Community: Join our Discord to connect with the Gretel team and engage with our community.
- Read our docs: Set up your environment and connect to our SDK.