Differential Privacy Content on InfoQ
Google Introduces VaultGemma: An Experimental Differentially Private LLM
VaultGemma is a 1B-parameter, Gemma 2-based LLM that Google trained from scratch using differential privacy (DP), with the aim of preventing the model from memorizing and later regurgitating training data. While still a research model, VaultGemma could enable applications in healthcare, finance, legal services, and other regulated sectors.
PipelineDP Brings Google’s Differential-Privacy Library to Python
Google and OpenMined have released PipelineDP, a new open-source library that allows researchers and developers to apply differentially private aggregations to large datasets using batch-processing systems.
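PipelineDP's own API is not reproduced here; as a minimal sketch of the underlying idea of a differentially private aggregation, the plain-Python example below computes a noisy count using the Laplace mechanism (the function names and toy dataset are hypothetical, not part of the library):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, epsilon: float) -> float:
    """Differentially private count: the true count plus Laplace noise.

    A count query has sensitivity 1 (adding or removing a single record
    changes the result by at most 1), so the noise scale is 1 / epsilon.
    """
    return len(records) + laplace_noise(1.0 / epsilon)

# A noisy count over a toy dataset with a privacy budget of epsilon = 1.0.
visits = ["user_a", "user_b", "user_c", "user_d"]
noisy = dp_count(visits, epsilon=1.0)
```

Smaller values of epsilon mean a larger noise scale and stronger privacy at the cost of accuracy; libraries like PipelineDP additionally handle per-user contribution bounding and budget accounting across many aggregations.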
Facebook Open-Sources Machine-Learning Privacy Library Opacus
Facebook AI Research (FAIR) has announced the release of Opacus, a high-speed library for applying differential privacy techniques when training deep-learning models using the PyTorch framework. Opacus can achieve an order-of-magnitude speedup compared to other privacy libraries.
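The core technique Opacus implements is DP-SGD: clip each training example's gradient to bound its influence, then add Gaussian noise before the model update. The sketch below illustrates that sanitization step in plain Python; it is a conceptual illustration, not Opacus's actual API, and all names in it are hypothetical:

```python
import math
import random

def clip_gradient(grad, max_norm):
    """Clip one example's gradient vector so its L2 norm is at most max_norm."""
    norm = math.sqrt(sum(g * g for g in grad))
    scale = min(1.0, max_norm / norm) if norm > 0 else 1.0
    return [g * scale for g in grad]

def dp_sgd_step(per_example_grads, max_norm, noise_multiplier):
    """One DP-SGD gradient sanitization step:
    1. clip each example's gradient (bounds any one example's influence),
    2. sum the clipped gradients,
    3. add Gaussian noise scaled to the clipping bound,
    then return the averaged noisy gradient.
    """
    clipped = [clip_gradient(g, max_norm) for g in per_example_grads]
    n = len(per_example_grads)
    dim = len(per_example_grads[0])
    summed = [sum(g[i] for g in clipped) for i in range(dim)]
    sigma = noise_multiplier * max_norm
    noisy = [s + random.gauss(0.0, sigma) for s in summed]
    return [v / n for v in noisy]
```

Computing per-example gradients is the expensive part of DP-SGD; Opacus's speedup comes largely from vectorizing that computation inside PyTorch rather than looping over examples.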
Introducing TensorFlow Privacy, a New Machine Learning Library for Protecting Sensitive Data
In a recent blog post, the TensorFlow team announced TensorFlow Privacy, an open-source library that allows researchers and developers to build machine-learning models with strong privacy guarantees. The library helps ensure that models do not memorize individual user data during training, backed by rigorous mathematical guarantees.