Data Anonymization Codelab - ‘Computing Private Statistics with Differential Privacy’

We cordially invite you to join the ‘Computing Private Statistics with Differential Privacy’ codelab, which will be facilitated by Christiane Ahlheim, Data Scientist, and Ehsaan Qadir, Customer Solutions Engineer.

In 2019, we open-sourced our first Differential Privacy library to enable developers and organizations to learn from the majority of their data while ensuring that the results do not allow any individual’s data to be distinguished or re-identified. In this codelab, you will learn how to produce statistics that preserve users’ privacy by using differentially private aggregations.
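To give a flavour of what such an aggregation does, the sketch below adds Laplace noise to a raw count in plain Go. It illustrates the underlying idea only and is not the library’s API; the epsilon value and the assumption that each user contributes at most one record are made up for this example.

    package main

    import (
        "fmt"
        "math"
        "math/rand"
    )

    // laplace draws a sample from a Laplace distribution with the given scale,
    // using inverse transform sampling.
    func laplace(scale float64) float64 {
        u := rand.Float64() - 0.5
        return -scale * math.Copysign(math.Log(1-2*math.Abs(u)), u)
    }

    // privateCount returns a noisy version of trueCount. Assuming each user
    // contributes at most one record, the count has sensitivity 1, so Laplace
    // noise with scale 1/epsilon gives an epsilon-differentially private result.
    func privateCount(trueCount int, epsilon float64) float64 {
        sensitivity := 1.0
        return float64(trueCount) + laplace(sensitivity/epsilon)
    }

    func main() {
        // Illustrative values only: a raw count of 42, released with epsilon = 1.
        fmt.Printf("noisy count: %.2f\n", privateCount(42, 1.0))
    }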

Please register for the codelab via this website so that we can keep you informed about the meeting details and logistics.

Global Center for Data Security

When & Where

Wednesday, 2nd of June | 5 PM - 6 PM CET

Remotely facilitated via Google Meet

Facilitators

Christiane Ahlheim
Data Scientist

Christiane is a data scientist in gTech, helping Google’s customers reach peak marketing performance. One of her focus areas is privacy-first data science, where she develops new solutions to help clients thrive.

Ehsaan Qadir 
Customer Solutions Engineer 

Ehsaan is a Customer Solutions Engineer at Google. He has worked with many customers across various industries to solve their most pressing technical challenges. He is fascinated by good design, in both software and user experience.

Participants

The codelab is suitable for developers, data scientists, business analysts, and product managers who work with or analyze personally identifiable datasets to improve their product offerings, or who plan to publish statistics based on datasets that require a robust data anonymization technique to protect their users’ privacy and prevent data leakage.

Experience requirements

Participants who have some familiarity with Go, the open-source programming language, and with Beam will find it easier to follow the codelab and understand the computational models, which use differentially private aggregations. There is no formal technical experience requirement other than being able to read and write Go.

Pre-Work

We recommend reading up on data anonymization, the process of aggregating data across multiple users to protect user privacy, and on differential privacy, a strong privacy notion for anonymization. In addition, we recommend reading the Beam introduction to familiarize yourself with a high-level library for writing data-processing pipelines. If you want to code along during the introduction of the codelab, we recommend having this page ready.
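If you would like a feel for the pipeline model ahead of time, the following is a minimal Beam skeleton in Go. It is a sketch only: the input file name is a placeholder, the import paths assume the Go SDK’s v2 module layout, and the plain per-element count stands in for the differentially private aggregations covered in the codelab.

    package main

    import (
        "context"
        "flag"
        "log"

        "github.com/apache/beam/sdks/v2/go/pkg/beam"
        "github.com/apache/beam/sdks/v2/go/pkg/beam/io/textio"
        "github.com/apache/beam/sdks/v2/go/pkg/beam/transforms/stats"
        "github.com/apache/beam/sdks/v2/go/pkg/beam/x/beamx"
        "github.com/apache/beam/sdks/v2/go/pkg/beam/x/debug"
    )

    func main() {
        flag.Parse()
        beam.Init()

        p := beam.NewPipeline()
        s := p.Root()

        // Read one record per line; "visits.csv" is a placeholder input file.
        lines := textio.Read(s, "visits.csv")

        // Count how often each distinct record occurs. In the codelab, a plain
        // aggregation like this is the kind of step that is replaced with a
        // differentially private one.
        counts := stats.Count(s, lines)

        // Print the counts for inspection.
        debug.Print(s, counts)

        if err := beamx.Run(context.Background(), p); err != nil {
            log.Fatalf("pipeline failed: %v", err)
        }
    }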

Helping developers and organizations use differential privacy