May 09, 2023
Debra J. Farber / Damien Desfontaines
Season 2
Episode 17

S2E17 - Noise in the Machine: How to Assess, Design & Deploy 'Differential Privacy' with Damien Desfontaines (Tumult Labs)

The Shifting Privacy Left Podcast

Chapters

1:15 - Introducing Damien Desfontaines, PhD

3:34 - Why there's such a gap between academia and the corporate world

5:19 - How differential privacy's strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education & usability

8:03 - When to use "local" vs "central" differential privacy techniques

11:56 - Damien describes advancements in technology that enable the private collection of data (e.g., multi-party computation, secure computation, federated learning) and can be used with local DP

14:32 - Damien describes Tumult Labs' Assessment approach to deploying differential privacy, where a customer defines its 'data publication' problem or question

17:08 - Damien describes how the open source Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements

19:13 - Why using gold standard techniques like differential privacy to safely release, publish, or share data goes beyond compliance to unlock the value of company data

20:37 - What's involved with deploying differentially private algorithms via Tumult Labs' platform

21:49 - Damien's litmus test for when it's appropriate to use differential privacy

26:25 - How data scientists can make the analysis & design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks and the number of tasks you can possibly answer

30:27 - Damien describes his work helping the IRS & DOE deploy differential privacy to safely publish and share data publicly via the College Scorecard project

33:02 - Damien discusses security vulnerabilities (i.e., potential attacks) against differentially private datasets

37:24 - Where you can learn more about differential privacy

40:18 - How Damien sees this space evolving over the next several years


In this week’s episode, I speak with Damien Desfontaines, also known by the pseudonym “Ted”, Staff Scientist at Tumult Labs, a startup leading the way on differential privacy. Earlier in his career, Damien led an Anonymization Consulting Team at Google, and he specializes in making it easy to safely anonymize data. He earned his PhD at ETH Zurich and holds a Master's degree in Mathematical Logic and Theoretical Computer Science.

Tumult Labs’ platform makes differential privacy useful by making it easy to create innovative, privacy-enabled data products that can be safely shared and used widely. In this conversation, we focus our discussion on differential privacy techniques, including what’s next in their evolution, common vulnerabilities, and how to implement differential privacy in your platform.

When it comes to protecting personal data, Tumult Labs takes a three-stage approach: Assess, Design, and Deploy. Damien takes us on a deep dive into each stage, with use cases.

**Topics Covered:**

- Why there's such a gap between academia and the corporate world
- How differential privacy's strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education & usability
- When to use "local" vs "central" differential privacy techniques
- Advancements in technology that enable the private collection of data
- Tumult Labs' Assessment approach to deploying differential privacy, where a customer defines its 'data publication' problem or question
- How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements
- Why using gold standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance
- How data scientists can make the analysis & design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks & number of tasks that you can possibly answer
- Damien's work helping the IRS & DOE deploy differential privacy to safely publish and share data publicly via the College Scorecard project
- How to address security vulnerabilities (i.e. potential attacks) to differentially private datasets
- Where you can learn more about differential privacy
- How Damien sees this space evolving over the next several years
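To make the "central" model discussed in the episode concrete, here is a minimal, illustrative sketch of a differentially private count using the Laplace mechanism. This is a teaching example only, not Tumult's actual implementation; the dataset, predicate, and epsilon value are all hypothetical.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5          # uniform in (-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count in the central model.

    A count query has sensitivity 1 (adding or removing one person's
    record changes the result by at most 1), so Laplace noise with
    scale 1/epsilon gives epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: noisy count of incomes above a threshold.
incomes = [42_000, 58_000, 71_000, 39_000, 88_000, 65_000]
noisy = dp_count(incomes, lambda x: x > 50_000, epsilon=1.0)
```

In the local model, by contrast, each participant would randomize their own record (e.g., via randomized response) before it ever reaches the data collector, trading extra noise for a weaker trust assumption.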

**Resources Mentioned:**

- Join the Tumult Labs Slack
- Learn about Tumult Labs


Shifting Privacy Left Media

Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.
