GSEC Data Anonymization Codelab

‘Computing Private Statistics with Differential Privacy’

You’re invited to the new GSEC ‘Computing Private Statistics with Differential Privacy’ codelab, facilitated by Google’s own Mirac Vuslat Basaran and Niels Overwijn on Jun 08, 2022 6pm - 6:50pm (CET) for US/EMEA and Jun 09, 2022 10am - 10:50am (CET) for APAC/EMEA. In 2019, we open-sourced our first Differential Privacy library to enable developers and organizations to learn from the majority of their data while ensuring that the results do not allow any individual's data to be distinguished or re-identified. In this new codelab, you will learn how to produce statistics that preserve users’ privacy by using differentially private aggregations.
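To give a flavor of what a differentially private aggregation looks like, here is a minimal, self-contained sketch of a noisy count using the Laplace mechanism. This is purely illustrative: the function names are hypothetical, and the codelab itself uses Google's open-source differential privacy libraries, whose API differs.

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = rng.random() - 0.5  # uniform in (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, epsilon, rng=None):
    """Differentially private count of records.

    Assumes each user contributes at most one record, so the
    sensitivity of the count is 1 and the noise scale is 1/epsilon.
    """
    rng = rng or random.Random()
    return len(records) + laplace_noise(1.0 / epsilon, rng)

# Example: a noisy count of 1000 visits with privacy budget epsilon = 1.0.
noisy_visits = dp_count(range(1000), epsilon=1.0, rng=random.Random(42))
```

The key idea, which the codelab explores in depth, is that the noise scale depends only on the privacy budget (epsilon) and how much one individual can influence the result, so the published statistic stays useful while masking any single user's contribution.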

Register for the GSEC Data Anonymization Codelab below.

Helping developers and organizations use differential privacy 

Facilitator

Mirac Vuslat Basaran, Software Engineer

Mirac is a Software Engineer in the area of anonymization and differential privacy at Google. Before joining Google, he studied Computer Engineering (and Economics) at Bilkent University. Currently, he helps build and open source infrastructure for product teams to anonymize their data. He also consults product teams on anonymization and differential privacy.

Niels Overwijn, Customer Solutions Engineer

Niels is a privacy-focused Customer Solutions Engineer who builds custom solutions and provides technical consultations to help the largest customers and agencies across Northern Europe.

Stefano Reccia, Partner Development Manager EMEA

Moderator

Stefano is Partnerships Manager for the Google Safety Engineering Center in Munich. Before joining Google, he studied Economics at Tübingen University (DE) and Lund University (SE). He works with Google's product and engineering privacy teams to help the web ecosystem adopt and integrate open-source privacy-preserving technologies, with a particular focus on anonymization and differential privacy.

Target Audience

This codelab is geared towards startup developers, data scientists, business analysts, and product managers who work with or analyze personally identifiable datasets. This is also a great program for software developers, data scientists, and data analysts who hope to improve their product offerings or plan to publish statistics based on datasets that require a robust data anonymization technique to protect their users’ privacy and prevent data leaks.

Experience requirements

Participants need to be able to read and write Python, the open source programming language, to follow the codelab and understand the computational models.

Pre-work

Before the codelab, we recommend reading up on the topics of data anonymization and differential privacy. You should read the Python introduction to familiarize yourself with a high-level library for writing data-processing pipelines. You’ll want to have this page ready if you want to code along during the introduction of the codelab.

We are happy that our partnership with the GSEC team will help the developer community generate differentially private results in Python. This codelab is an ideal way to understand the functionality of the PythonDP libraries hands-on.

Andrew Trask, OpenMined