The Shifting Privacy Left Podcast

S3E6: 'Keys to Good Privacy Implementation: Exploring Anonymization, Consent, & DSARs' with Jake Ottenwaelder (Integrative Privacy)

March 05, 2024 Debra J. Farber / Jake Ottenwaelder Season 3 Episode 6

In this week's episode, I sat down with Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy LLC. Throughout our conversation, we discuss Jake’s holistic approach to privacy implementation that considers business, engineering, and personal objectives, as well as the role of anonymization, consent management, and DSAR processes for greater privacy.

Jake believes privacy implementation must account for the interconnectedness of privacy technologies and human interactions. He highlights what a successful implementation looks like and the negative consequences when done poorly. We also dive into the challenges of implementing privacy in fast-paced, engineering-driven organizations. We talk about the complexities of anonymizing data (a very high bar) and he offers valuable suggestions and strategies for achieving anonymity while making the necessary resources more accessible. Plus, Jake shares his advice for organizational leaders to see themselves as servant-leaders, leaving a positive legacy in the field of privacy. 

Topics Covered: 

  • What inspired Jake’s initial shift from security engineering to privacy engineering, with a focus on privacy implementation
  • How Jake's previous role at Axon helped him shift his mindset to privacy
  • Jake’s holistic approach to implementing privacy 
  • The qualities of a successful implementation and the consequences of an unsuccessful implementation
  • The challenges of implementing privacy in large organizations 
  • Common blockers to the deployment of anonymization
  • Jake’s perspective on using differential privacy techniques to achieve anonymity
  • Common blockers to implementing consent management capabilities
  • The importance of understanding data flow & lineage, and auditing data deletion 
  • Holistic approaches to implementing a streamlined and compliant DSAR process with minimal business disruption 
  • Why Jake believes it's important to maintain a servant-leader mindset in privacy

Guest Info:

  • Jake Ottenwaelder, Principal Privacy Engineer & Founder, Integrative Privacy LLC

Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn


Jake Ottenwaelder:

In privacy, we need to understand that we're providing a service to the rest of the organization and we are leading by example and we are serving others through our leadership and through our advocacy. Just understanding that maybe you're not going to see the rewards every single day. Take time for yourself. Make sure that you understand that you are hopefully fighting the just fight, but really being a servant leader and uplifting others to enable them to do privacy on your behalf. That's how we leave the best legacy and continue to grow adoption and education of privacy.

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Welcome everyone to The Shifting Privacy Left podcast. I'm your host and resident privacy guru, Debra J Farber.

Debra J Farber:

Today, I'm delighted to welcome my next guest, Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy, a consulting firm that he launched, which centers around a holistic approach to privacy implementation. Jake, having moved from a cybersecurity background into privacy, has been a one-man privacy engineering team across late-stage startup organizations, most recently GoFundMe. He also worked at Deloitte for quite some time and has a lot of consulting experience. Jake lives outside of Seattle, Washington, where he's raising a puppy and creating model landscapes in his spare time, and today, if you can't tell, we're going to be chatting about implementation of privacy objectives into the business and engineering and what that takes. Welcome, Jake.

Jake Ottenwaelder:

Hey, Debra, thanks for having me.

Debra J Farber:

Oh, it's my pleasure. To start off, why don't you tell us a little bit about your journey from Security Engineer to Privacy Engineer and what motivated you to switch your focus, and why focus on privacy program implementation?

Jake Ottenwaelder:

In college, I focused on cybersecurity. My university had an honors program dedicated to cybersecurity and I was really interested in going down that route. I got my first job doing cybersecurity consulting with Deloitte, and throughout that experience started to learn more and understand more about data. When I got the opportunity to leave that organization, I started doing compliance automation, and part of the compliance that I was automating was privacy. So, I moved a couple more times and gradually got strictly into privacy, and I absolutely love the abstract nature of the industry: how we have regulations, but there's a lot of connection between regulations and technology and ethics. I think it's a problem that nobody's going to be able to solve, but I love the opportunity to add my name into the hat of, hopefully, people who can effect change and improve privacy around the world.

Debra J Farber:

That is a very noble goal, so I'm glad that you were able to be delighted by privacy and the wicked problem that it is, and that you're trying to streamline some things. So, you talked about it being pretty abstract and loving that.

Debra J Farber:

I want to talk a little bit more about that because I think that's been a real pain point for a lot of folks, especially engineers. Where it's so abstract, you just don't have the appropriate requirements to test against to know that you've met a privacy goal or aim, and so that abstraction has been a double-edged sword in many ways. You've got this principles-based privacy right or privacy principle that you want to make sure is embedded into the business, but then, well, what does that really mean when there's no set of requirements? How do you get from point A to point B? How do you take the abstraction and actually get to requirements?

Jake Ottenwaelder:

I'm going to tell a little embarrassing story. After college, I was sitting there looking for a job and thinking: I don't understand how everybody has to work so much. How can problems be so hard that you can't just solve them in a couple hours? I was sitting there as this naive college kid not believing that problems could really be that hard, as I had just kind of coasted through college and was fortunate enough to not have to worry a lot about homework taking a long time. So, I was coming from this mindset of: how can problems be so difficult?

Jake Ottenwaelder:

And yeah, abstraction in privacy is huge, but I love it because everybody has such a unique perspective on it. Every individual person has their own sense of what their privacy should be respected for, or what they believe their data is valued at. And, as a privacy engineer, as somebody who's looking at the regulations, a lot of times the regulations don't go far enough; or we look at the regulations and we say, it's telling us this thing, but what's the essence behind this regulation? What's the purpose of trying to get consent? Does that mean that we just put up a banner that says 'Accept All,' or does it mean that we give people equal choices? So, that level of abstraction and that, I want to say, advocating for everyone equally,

Jake Ottenwaelder:

that really became a big motivator for me. And so, I guess, to answer your question: how do you get that into requirements? It's really hard. It takes a certain mindset. It takes a mindset of always being ready to constantly learn, being open to new experiences, and talking to people. And it takes a very global mindset: I try to consider individuals who are unique in their populations and require the additional support that we should be giving everybody, no matter where they're coming from.

Debra J Farber:

That makes a lot of sense.

Debra J Farber:

In fact, that actually inspires another question for me - security is very much system-focused. Right? So, when we talk about compliance of a system, we're talking about: is it secured according to requirements?

Debra J Farber:

But it really takes a mental shift to then be like: no, privacy is about people. It's about protecting people from harms. And then the compliance part is around obligations to protect data in a certain way when it's linkable to a person. So how has that mindset shift been for you to make as you moved from security into privacy? Because it's clear that you have made that shift, even just talking about giving people choices instead of just putting a banner up there to say you have compliance; or really thinking about the people in the population and maybe some edge cases and who could be harmed by this. I guess part of my asking this is also wanting to educate others on how you've made that leap, and maybe how they can get security folks to start thinking in this way, even if they remain in the security function, but with a more expansive understanding of what privacy means.

Jake Ottenwaelder:

Yeah, the first thing that comes to mind is: take a second to look around at everything going on and really understand it. I think there was an article, and I can't think of it off the top of my head, but it talked about different customer profiles - there are a lot of studies around customer profiles, and this one may actually have been related to privacy specifically. But look at the broad range of people's experiences and just think about that. And then for me, having the opportunity to work on systems with billions of connected devices - GoFundMe is a massive platform that I was super honored to work on, one that requires so much support - and just looking at different people's stories, people who really need help, gives you that perspective that you're there to serve. The biggest thing for me is that it's not the organization's data. They're collecting information, but it's not really their data. They're borrowing it from other people, and just as if I borrowed a lawnmower from my neighbor, they would probably expect it back at some point. So this is all data that organizations are borrowing, and as stewards of the data, we need to be respectful of those people's wishes.

Jake Ottenwaelder:

So, to me it's security plus ethics. The security area is, like you said, a lot more straightforward; it has a lot more compliance. As you add in ethics, then you get to privacy, and so people who are interested should look into more ethical considerations and wrestle with really complicated questions. Like: what is free will? Bringing up cookie banners again - they're one of the most challenging UI conversations, and the same issues come up around consent notices as well. Do you have free will if you don't have the right expectations, or if the UI is pushing you in a different direction? So, how can we really allow people to be more free with those decisions? That's the area of privacy that I love to play in, have conversations in, and try to push an organization to adopt what I consider to be the more ethical approach.

Debra J Farber:

So it sounds like some of the reason you've been able to really make that switch is that you actually were working for a company with a real B2C focus. Right? You weren't just a B2B business trying to figure out the privacy stuff, thinking about privacy rights as a compliance obligation and asking what's the best way to achieve compliance from a privacy perspective. Among the personas within your organization, a major one was the individuals funding campaigns, whose data you were collecting, and so you were seeing that need to protect a person as opposed to just B2B flows. Does that resonate with you? Do you think that that's some of the reason?

Jake Ottenwaelder:

Yeah, definitely. I think I'll even go back a little further, because I first started to really explore privacy when I was working at Axon. Axon's the company that develops Tasers and body cameras, and they have a very large market in that. When you look at privacy considerations for a company that develops policing technologies, the privacy concerns around that are huge: video and audio recordings, tracking of police activity and where people are going. You see body camera footage released online or publicly, and how all of that affects privacy as well.

Jake Ottenwaelder:

So that was definitely the first place where you're really seeing the people dealing with it. And then my other organizations definitely had a lot of individual focus - like, yes, these are real people that are going to be interacting with these sites and these web apps. We had billions of devices and millions and millions of people a month touching these things, so what we built needed to be not only implemented consistently, but also robust for all of the people we were serving.

Debra J Farber:

That makes a lot of sense. So let's turn to the topic of the day - implementing privacy into an organization. You take a holistic approach to privacy implementation through your organization, Integrative Privacy. First of all, what does it mean to 'implement privacy'? And then, tell us about your holistic approach to implementing privacy.

Jake Ottenwaelder:

Yeah, implementing privacy is something that a lot of organizations just try to do, and I think they start off by putting in a cookie banner with OneTrust - the big platform that everybody uses - or just creating a privacy notice, sticking that up on the page, and calling it a day.

Jake Ottenwaelder:

That's where I see a lot of organizations, and you'd be surprised that even larger organizations, as they continue to grow, still maintain those as the foundations of their privacy program: just a page on their website and this little banner that pops up for certain people.

Jake Ottenwaelder:

I put a lot of thought into figuring out what my next steps were after leaving GoFundMe, and I really felt like there's an opportunity for organizations to adopt and have a better relationship with privacy. For me, that came from thinking of integrative privacy as similar to integrative wellness. I'm a believer in integrative wellness and understanding your body as a whole, and I take a similar approach when looking at integrative privacy and implementing privacy technologies. It's not just a checkbox; it's not just the compliance. It's the ethics; it's the people of the organization that need to be educated and brought into the fold, to be enabled to be better data stewards of people's information. So, that's where I take this integrative, more holistic approach to privacy implementation.

Debra J Farber:

Thank you, that's helpful. Let's talk about metrics for success. How do you define a successful implementation and what makes for, I guess, a bad implementation or not as successful?

Jake Ottenwaelder:

Yeah, I think implementations are always such a challenge. Privacy technology is this area where privacy engineers - I give them so much respect - have to deal with implementations that are unique to the tech stack they work with, but also have to be forward-looking and ready to scale or change with regulations that are coming out, and also be able to look at each individual person coming to the page and give them options for their privacy. So for me, a successful implementation really starts with adoption of the technology, looking at how it can be best integrated into the entire stack; and I think a lot of successful implementations are built on the back of a strong understanding of what data is in your company and where that data is going. If you have that understanding, you can implement privacy technology much more easily and have a much better foundation. Bad implementations - I think everybody has experience with those.

Jake Ottenwaelder:

It's when there's always something popping up. I think a lot of it comes down to planning and scoping, making sure that you can take a second. . . I just want to get in and do stuff. I want to do cool stuff. I want to fix things. But you've got to take a beat at the beginning and look at the landscape, look at the picture of what you're trying to do, and if you don't do that, then you're going to have surprises come up. You're going to have issues with scoping or planning or resources that can prolong the implementation. And it's a lot about building trust with the organization, because privacy is not really a money-making operation. If you're doing successful implementations and you're on time and on spec, you're going to have a lot more trust with the organization when the next tool comes into play that you need to work on.

Debra J Farber:

Yeah, you know, I also think of that as: you're just trying to do the minimum necessary and comply with a new regulation instead of looking at the broader trend of what that regulation is trying to solve for. Like, "Oh okay, we can still do this with all our data" - a new regulation comes in and says we need to be careful about location data and the way it's used, and you're just doing pointed solutions around one data type. You're not seeing the broader effort to get your arms around how data could be used or misused and the harms around it. Then, each time a regulation comes, you're constantly going to have to re-implement something else rather than thinking about what good privacy is. Because if you just think, "What is good privacy, what does it look like to respect that, and let's now try to automate for that" - if you do that, compliance will follow.

Debra J Farber:

[Jake: Yeah, yeah, exactly]. It'll be easy to make simple tweaks based on new regulations.

Jake Ottenwaelder:

Yeah, exactly - that's why you run the podcast, Debra. You hit it right on the head. If you keep doing these band-aid solutions, you're going to always be behind the eight ball. If you look at what's the right thing - and this is where the ethics come in - what's the right thing to do, what's the right thing for our customers? Sometimes you might get those things wrong, but as long as you're shooting for that, you're going to get a lot further than just playing whack-a-mole, so to speak.

Debra J Farber:

Awesome. What are some consequences that you've seen for badly implementing privacy? Obviously, I talked about how you're constantly playing catch-up and in a state of frenzy trying to be compliant when you're only addressing it in short bursts of compliance projects, as opposed to privacy being comprehensively integrated in. What are some other consequences that you've seen from attempts at implementing privacy that are done wrong, or done inefficiently or ineffectively?

Jake Ottenwaelder:

Yeah, I feel like the biggest issue with badly implementing privacy - or one of the consequences - is you end up having a lot of stuff just lost, a lot of skeletons in the closet. You gain a lot of tech debt, which is the term I hear a lot of people using, where you're implementing something and you have engineers working on stuff, but you're continuing to dig a hole for yourself, and that is just really expensive to get out of.

Debra J Farber:

It's expensive to get out of, and I will even throw in there, as someone who's been doing operational privacy for 18 years: the more tech debt there is - you know, I'm technically oriented, but I'm not a hands-on technologist - the less ability I feel I have to make change in the organization, and the less desire I have to even take a job in that company. Those are downstream problems, I think, that need to be thought about when you talk about this tech debt, because it's like: "You're just making it harder and harder for me to be able to solve the problem by not addressing it. So, I don't want the job. It's just not going to set me or the company up for success."

Jake Ottenwaelder:

Tech debt is obviously a lot of the technical stuff, but we also have to talk about the procedures around it. I consider everything that you do in privacy - everything that I'm doing - to be setting a precedent. So if you're implementing a tool and it's halfway done, that sets a precedent of "it is that way now," and it takes a lot more effort to change.

Debra J Farber:

Like it didn't meet the MVP of the deployment, almost. Yeah, it's not even meeting the minimum viable requirements of the need. Is that kind of what you're saying?

Jake Ottenwaelder:

So even if you're implementing a technology that's out of date, there are the processes that you're putting around it. As you continue to grow as an organization, there's the technology that might be out of date, lagging behind, or not up to compliance; and then there are the procedures associated with it, which are all precedents that also require a lot of effort to start to change and work on. So I think it's both of those that we deal with in privacy engineering: not only the technology, but also the processes that we need to update and modify.

Debra J Farber:

Got it. So, building on that, how would you summarize what it takes to achieve a successful implementation of privacy?

Jake Ottenwaelder:

Starting off, as I mentioned, it's the strong foundation: understanding the data that you're working with and where the data is going - and there are tools that can do that.

Jake Ottenwaelder:

I think it requires a lot of education and driving awareness by the privacy team. Like I said, there are tools that will help you with data discovery. It's: why do we need this, what is the benefit of it, and how else are we going to get this value if we don't do this? So it really starts with that strong foundation, and then it takes a lot of planning. For me, it's been a lot of project management work as well - growing my skills around project management to be able to talk with an engineering team, understand the technology behind it, and help them implement and develop the software, but then also communicate high-level goals and high-level needs, and escalate those to get the resources and support that I need. We're distilling down a lot of information here, but that's what I would say a successful implementation is: focusing on those key areas.

Debra J Farber:

Yeah, that makes sense. I want to talk about some of the challenges that I've faced when I wanted to help implement privacy into an organization. I want to know everything that's going on in that organization around privacy. But sometimes that's. . . I want to do this so that I can maybe piggyback on some other projects and just add privacy requirements. That way you're not reinventing the wheel; you can leverage efficiencies. There are so many good reasons to do that.

Debra J Farber:

But sometimes - especially in engineering-heavy, scaling tech companies, and I'm pulling from my own experience at Amazon, but it could be any of the big tech companies or giant enterprises, or any engineering-focused org that is moving fast and not necessarily waiting for you to put guardrails around things - you just can't know everything that's going on. I was working for Amazon; 1.5 million people work there now. There was no way - I was finding out about consent decrees and fines that I had no idea about because I wasn't part of Legal. When you're in a giant decentralized company, security and privacy sit within those orgs, so you might be within one business unit and have no idea what other business units are doing, and it's not for a lack of partnerships or wanting to know. It's just too big to get your arms around all of the moving pieces.

Debra J Farber:

That was very frustrating for me because I wanted to know. I've learned a lot about my ADHD in recent years, about having a divergent mindset: I need a high-level understanding of everything going on, and I also need to go deep into the silos of the components to understand any one of them, in order to have that big picture. Without that, it's a little disorienting, because you don't know what's going on. When working in an organization to implement privacy, how do you stay on top of everything going on related to personal data?

Jake Ottenwaelder:

You mentioned working at Amazon. I feel like these challenges are very similar to the work that I've done as well, where I was a one-person privacy engineering team with 300 software developers in the organization, trying to understand what all the sprints are doing and what people are working on.

Debra J Farber:

Well, and for me, I find that I'm paying attention to too many things. It's not a lack of attention; it's that everything is coming in unfiltered. I'm not so much filtering it out as making sense of it all. Instead of maybe three things, I'm seeing ten, and so it's more overwhelming. But I still feel I need to get that info so that I can have a better mental map of the state of things.

Jake Ottenwaelder:

Because for me, when I look at privacy - you're doing PIAs - it's very much this interconnected system. I might be doing a PIA for an authentication system, but that authentication system is going to inherit risk from other services, or pass its risk along. So everything is connected, and I look at it at that granular level. First, I'll say mental health is super important, and to your point, with a lot of stuff happening all the time around privacy, it does take me a lot of time to sit with my thoughts and process them. Unfortunately, my mind races at night and I just need to lie there and process stuff, or write down what's going through my brain, and it does take a toll. Mental health has been a huge topic in the workforce, and I think it bears the same weight here as well. As for how I stay on top of things,

Jake Ottenwaelder:

I think there's a lot of power in enabling, getting close to, and genuinely trying to befriend the people that you work with, and in trying to get access to roadmaps and looking at things from a high level. The more you can work with other people - and have the people on the different engineering teams really enjoy working with you - the more you can get access to what their roadmap looks like, or sit in on a monthly meeting just to let your ears perk up if anything new happens. That's how I've been working on it. Even for me, as one person with 300 engineers - not to mention what you had to deal with in an Amazon-sized organization - it's not something you can do on your own; you have to enable other people to help support you in it. But it is, in my opinion, very valuable, like we said, to be able to recommend the right privacy steps.

Debra J Farber:

Yeah, that's true, and it has me thinking that you definitely want to have those relationships, and people to almost keep an eye on certain things for you. Like, "Hey, if you ever come across a team starting to collect a lot more data elements linked to a person, give me a call, because I'd want to track down what that is and the goals there." Or, "If you sniff out that there are new systems being deployed and it hasn't been well communicated, let me know," so I can do a DPIA if there's high risk - or whatever the job role calls for. Obviously, not everyone's doing PIAs.

Jake Ottenwaelder:

Yeah, keeping alerts on Confluence pages is something I've done before: when a new page is added in Confluence, just have that sent to my email. I'll wake up, I'll have a hundred emails, but I'll look through them pretty quickly and be able to say, "Okay, that makes sense." I mean, it feels bad - I don't want to say go spy on the rest of the organization - but it helps to just see and understand what's going on, as long as you do it and, again, create those relationships and build confidence and trust with the people you talk to. I have great friends from outside engineering and within other engineering teams that I rely on today and get to talk to, and it's nice to also just have people at work that you enjoy talking with. So, yeah, those are some of my tips as well.

Debra J Farber:

Well, thank you for that. So, let's start with anonymization. What are some of the blockers that you've seen when deploying anonymization as a privacy control within orgs?

Jake Ottenwaelder:

Yeah, anonymization is a very sensitive topic for a lot of people, because I think there's no clear consensus - at least from a legal perspective, or from a risk and compliance perspective - on how to really prove data is anonymous. The consensus around anonymizing data is very hard to pin down, and so the biggest blocker that I've had with deploying anonymization is the early stage of getting people to buy in on trying to do masking.

Jake Ottenwaelder:

I think to reach anonymization, you have to start with basic masking of data; then you move into "Okay, we're going to start to transform the data a little bit into this pseudonymized state"; and the hope would be to eventually have some datasets reach the standard of anonymization. It's a very long process, and so just getting the buy-in is generally a pretty big blocker, because organizations, again, are focused on the money. If you can't show them the dollar signs of the risk associated with not having anonymized data - and not a lot of companies have been fined for classifying their data as anonymous when it's not - it's really hard to make the case and get it adopted.
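
To make the masking-to-pseudonymization progression concrete, here is a minimal sketch in Python. Everything in it - the field names, the HMAC-based tokenization, the key handling - is an illustrative assumption, not a description of any tool Jake uses. Note that the keyed token is still re-linkable by whoever holds the key, which is exactly why pseudonymized data falls short of anonymization.

```python
import hashlib
import hmac

# Hypothetical org-held secret. Whoever holds this key can re-link
# pseudonyms, so the output below is pseudonymized, NOT anonymized.
PSEUDONYM_KEY = b"store-me-in-a-kms-and-rotate-me"

def mask_email(email: str) -> str:
    """Basic masking: hide most of the value but keep its shape."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

def pseudonymize(value: str) -> str:
    """Keyed hash: a stable token that still supports joins across tables."""
    return hmac.new(PSEUDONYM_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "jane.doe@example.com", "zip": "98101"}
print({**record, "email": mask_email(record["email"])})    # masked
print({**record, "email": pseudonymize(record["email"])})  # pseudonymized
```

Even after both steps, quasi-identifiers like the zip code can still allow linkage against outside datasets - which is the gap between this and the anonymization standard discussed next.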

Debra J Farber:

So, I think maybe it's the framing, because when I think of anonymizing data, I think of it as one of the big ways you can take data that you have for one purpose and then be able to use it freely for a secondary purpose, because it's no longer personal data once it's anonymized.

Debra J Farber:

So, a lot of the time this is great for opening up the value of data for analytics purposes and insights and all that. I see this happening a lot in the data science space, where you want to anonymize certain things. Where it gets challenging is that it depends on whether you mean input privacy or output privacy. Right? Is it anonymizing before you do the analytics so that you can work freely, or is it doing the analytics and then anonymizing the output before sharing it with third parties? I guess there are different use cases, but I see anonymization being used more around how you open up the data and get more value out of the dataset - and that being tied to additional revenue, as opposed to compliance taking you out of the potential risk of holding personal data.

Jake Ottenwaelder:

I definitely agree, and that's the general push. I think a lot of people can achieve what is considered more of a pseudonymized state through that. The leap between pseudonymization and anonymization is pretty big, in my opinion. When we look at pseudonymized data, we can take the dataset as its own entity and ask: is this table able to connect you back to a specific individual on its own? Generally, that's a lot easier to achieve than anonymized data. Taking the definition from GDPR - and there are a lot of regulations and legal conversations from Germany as well that I've studied - anonymization asks: "Does there exist a dataset that could re-identify this individual?" And from a computational analytics perspective, proving that no such dataset exists is much more difficult. So, I think a lot of companies are accepting the risk of not being able to fully prove anonymization.

Jake Ottenwaelder:

But, again, accepting that risk - there's not been a lot of regulation in the space, not been a lot of fines in the space, and so that's where I see things standing. I know there are contingents on both sides of this privacy industry: people who believe anonymization is possible and others who don't believe it's possible. Generally, for your example of use cases around data analytics, there are ways that we can aggregate or manipulate data into what I would still consider maybe not fully anonymized; but it would be very well-masked and very well-pseudonymized, so that you can still do additional analytics or secondary use on it. But it's very hard to fully reach that purist anonymized state.

Debra J Farber:

Yeah. There's also a difference, I think, between anonymization in general and a dataset you want to make anonymous for release. When it's released, nobody outside of the organization itself can re-identify it. Inside the company it exists pseudonymously, but for shared purposes it's anonymous: no one can link it back other than the company that actually has the identifiers or the capability to link it back. Technically, one would say the dataset is a pseudonymous dataset because it can be re-identified; but for the purposes for which it was released, nobody else can re-identify it.

Debra J Farber:

I feel like we need better vernacular around those distinctions, because I think those distinctions carry differences of real importance that we don't capture. We just lump it all into 'anonymization.' It depends on whose perspective you're looking at. From a company that just sees anonymized statistics, I wouldn't be able to figure out who contributed what data to that anonymous dataset. But those that do hold the keys might be able to re-identify. For their purposes, it's pseudonymous; but for my purposes, it's anonymous. I think that gets a little confusing. I'm also curious: what do you think about using differential privacy techniques to achieve anonymity? It's a very high bar under GDPR to have data be considered anonymous, but I do see a lot of people talking about differential privacy as one way to get there. What are your thoughts on that?

Jake Ottenwaelder:

Yeah, I've done statistical analysis around differential privacy and actually implemented it, which I think I was very privileged to be able to do. It is a very interesting and hopeful space. When I looked at how to do anonymous data well, the biggest thing that I think everybody agrees upon is that there has to be some uncertainty in the data. It's unfortunate, but the more uncertainty we add to the data, technically, the less valuable it becomes.

Jake Ottenwaelder:

There's always going to be this trade-off: you could have a perfectly anonymous table of ones and zeros - those are the only values there - but it won't be very valuable. Differential privacy helps add some of those elements; but I would caution that just because you implement differential privacy doesn't mean you're going to reach a state of anonymization, because differential privacy still has limitations. If you're using an epsilon value that's too high, that's not really privacy-preserving anymore. I know some organizations have very high epsilon values, but they still say, "We're using differential privacy." Well, you're ruining the technology if you have an epsilon of 100 when a normal value would be 1 - the privacy loss grows exponentially in epsilon.
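
To make epsilon tangible: in the standard Laplace mechanism, the noise added to a query scales with sensitivity / epsilon, so a large epsilon means almost no noise and a privacy-loss bound of e^epsilon. Here is a minimal sketch with illustrative values and names, not any specific production library:

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise calibrated to epsilon.

    Smaller epsilon -> larger noise scale -> stronger privacy. The guarantee
    degrades by a factor of e^epsilon, so epsilon = 100 is effectively no
    protection - the abuse Jake describes.
    """
    scale = sensitivity / epsilon  # noise scale for a count query
    return true_count + np.random.laplace(loc=0.0, scale=scale)

for eps in (0.1, 1.0, 100.0):
    print(f"epsilon={eps:>5}: noisy count = {laplace_count(1000, eps):.2f}")
# epsilon=0.1 -> noise scale 10 (real uncertainty)
# epsilon=1.0 -> noise scale 1
# epsilon=100 -> noise scale 0.01 (essentially the true count)
```

Repeated queries compound the spend: under basic composition, the epsilons of successive releases add up, which is why the privacy-budget framing that comes up next matters for anything beyond a one-off query.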

Debra J Farber:

So, what would make that better? Would it help to have a formalized standard people can certify against - a differentially private standard, like 'epsilon must be under this value,' with auditability - so that anyone claiming to use differential privacy techniques in compliance with it is actually committing to certain epsilon values and privacy budgets?

Jake Ottenwaelder:

Yeah, that would be great, and I know there are conversations happening. I think differential privacy works really well

Jake Ottenwaelder:

when you're looking at a point in time and you're looking at a privacy budget for a point-in-time query.

Jake Ottenwaelder:

If you're trying to pull statistics once, that's where you need it. It's not that it doesn't work if you're trying to, to your point, release a dataset; but it becomes more risky when you're trying to release a dataset that's going to exist for a long period of time, because anonymization is not a one-and-done thing. It's an ongoing assessment that needs to be constantly reviewed, because new datasets could come out that could re-identify it. Tomorrow, Facebook could open up a new API that exposes a bunch more data that would allow us to re-identify some data they had previously released. So, it's a constant review. And I do know of some organizations and startups - I'm not paid to mention their names, so I won't - who are trying to develop this: they do an analysis of your data, they give you masking techniques, and then they partner with legal firms who are willing to sign off and say, "This meets our standard of anonymization." That gives organizations a little bit more comfort.

Debra J Farber:

That also tracks. That's very similar to what was going on in the HIPAA world for a long time. Yeah.

Debra J Farber:

We all agree that the de-identification standard - even though it's still in law - is no longer good enough, and it's not anonymous. It's not necessarily meeting all the needs of the moment. But there has always been this use case, or requirement, under HIPAA where you would hire a statistician to certify that your dataset was impossible to re-identify, or that there was a ridiculously low likelihood of re-identification. So I'm not surprised - I did not know this - but I'm not surprised that this is now expanding beyond HIPAA, and that you can have tools and techniques from data scientists who work with the attorneys to actually provide rigor around the process of ensuring that data can't be re-identified.

Jake Ottenwaelder:

Yeah, and it puts me, as a privacy engineer, in an interesting position, because - I'll self-proclaim this - I take more of a purist view of anonymized data under GDPR. In a lot of cases, I don't think fully anonymized data can exist because, like I said, it's an ongoing, living process: it might be anonymized right now, but it might not be anonymized a couple months from now.

Jake Ottenwaelder:

And so, as a privacy engineer, if somebody else is willing to accept that risk, sometimes I need to take a step back and say, "Okay, this organization is willing to say that this is anonymous. I'm going to do my best to protect it as well as I can, even if I don't think it reaches anonymization standards." You sometimes have to allow other people to accept that risk for you. I do hope in the future we get to a point where, like you said, we have a standard around what an epsilon value should be. Or look at PCI compliance in the security world - a largely industry-driven standard. I think privacy is following a lot of the trends that security has matured through, and maybe that's something in the cards that we're hopefully going to see pretty soon.

Debra J Farber:

Thank you, that's really helpful. Thanks for sharing your perspective on that. Let's now turn our attention to Consent Management. What are some of the current blockers to implementing Consent Management capabilities into organizations?

Jake Ottenwaelder:

I think a lot of privacy engineers, or people who are trying to implement privacy within the organization, would say the main blocker around consent management is this need for everyone to collect and consume data. We want to understand what people are doing, understand their experiences on our website, understand all the tracking and things going on. With consent management, we have to take a step back and look at what the purpose behind our platform is, what we should be doing as part of our functionality, and where customers might not be expecting what we're doing - and make sure we ask them if that's all right. So, I think the major blocker, again, is the shift in mindset around being data stewards: understanding that data carries risk, being willing to let people make decisions on their own, and sometimes taking a hit from a business perspective to continue to build that trust.

Debra J Farber:

Got it. That makes sense. So what are some holistic approaches that you would suggest for implementing consent management platforms and other features that enable compliance while supporting marketing and advertising goals?

Jake Ottenwaelder:

I think from a holistic perspective, I look at meeting the marketing and advertising teams where they are: understanding what their flows are, how I can integrate my privacy checks, how I can enable them to be better data stewards, and when I need to get tapped on the shoulder to answer a question - just building that relationship. So, focusing on where those marketing teams are at to support the adoption of better advertising technologies; understanding where we need redundancy, or whether we have multiple platforms and how that makes it challenging to look at data flows. And consent is very widespread - it should propagate through a lot of the different data flows and connections that the organization works on. So, working with data teams to make sure consent is a key piece of functionality, a centerpiece in how they look at data and what information they're allowed to process.
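
To illustrate consent propagating through data flows as Jake describes, here is a minimal sketch in which every name - the ConsentRecord shape, the purposes, the in-memory store - is hypothetical, standing in for whatever consent management platform an organization actually runs:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    user_id: str
    purposes: set[str]  # e.g. {"analytics", "advertising"}

# Hypothetical consent store; in practice this is backed by the CMP.
CONSENT_DB: dict[str, ConsentRecord] = {
    "u123": ConsentRecord("u123", {"analytics"}),
}

def process_event(user_id: str, purpose: str, event: dict) -> None:
    """Gate every downstream pipeline on the user's recorded consent."""
    record = CONSENT_DB.get(user_id)
    if record is None or purpose not in record.purposes:
        return  # drop the event rather than process without consent
    print(f"processing for {purpose}: {event}")

process_event("u123", "advertising", {"page": "/home"})  # silently dropped
process_event("u123", "analytics", {"page": "/home"})    # processed
```

The design point is the one Jake makes: the consent check sits inside the data flow itself, as a centerpiece of processing, rather than as a banner bolted onto the website.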

Debra J Farber:

Makes sense. And then, last but not least for today: what are some of the current blockers to implementing rights management capabilities, through DSAR processing, in organizations? For instance, processing data access and deletion - not just the requests, but the fulfillment of them, right?

Jake Ottenwaelder:

Yeah, this is an area that's probably one of the biggest challenges a lot of organizations have - and maybe they don't know it, because data is, again, a very living thing within your organization: you could delete it in one spot and it might repopulate there, or trickle down from another location. So, understanding where your data is flowing - data lineage - has been a really interesting area that I think a lot of data analytics platforms are focused on now, along with customer trajectory. We have to look at the same thing with privacy: where is the data flowing through? If we delete the data in this one location, is this the starting point of the flow, or is this near the end? How can we continue to work through those flows?

Jake Ottenwaelder:

And the biggest area that I think is still a challenge for a lot of functions and vendors is being able to audit these data deletions as well.

Jake Ottenwaelder:

How can we prove that we've done something we've said we've done, and how much proof are we able to provide? I think those are all pretty big challenges. Then you get into legal concerns around deleting data, and you have to have those conversations and go back and forth: "Do we really need all of this information, or are there ways we can do know-your-customer (KYC) with financial data using less information? How can we separate out our data flows so that we keep legally required data in cold storage, and the rest of the data in a more active state that can be deleted more easily?" It's a lot of stuff that takes thought, forethought, and conversations within an organization. Without that, you're going to implement a tool that, again, lacks adoption and doesn't fully cover the organization. In my opinion, it's a lot worse to say you're doing this and find out that you're not, versus saying you're consistently building to improve your program.
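
One way to picture the deletion-auditing gap Jake describes: have every system that executes a DSAR deletion write a verifiable record. The sketch below is a hypothetical hash-chained audit log - the system names, fields, and chaining scheme are assumptions for illustration, not a known vendor design:

```python
import hashlib
import json
import time

AUDIT_LOG: list[dict] = []  # in practice: append-only, tamper-evident storage

def record_deletion(system: str, user_id: str, rows_deleted: int) -> None:
    """Append evidence that a deletion ran, chained to the previous entry."""
    entry = {
        "system": system,
        "user_id": user_id,
        "rows_deleted": rows_deleted,
        "ts": time.time(),
        "prev_hash": AUDIT_LOG[-1]["entry_hash"] if AUDIT_LOG else "genesis",
    }
    # Hash the entry so later tampering with any record breaks the chain.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    AUDIT_LOG.append(entry)

# A single DSAR fans out to every system in the inventory, logging proof each time.
for system in ("crm", "warehouse", "email_platform"):
    record_deletion(system, user_id="u123", rows_deleted=4)
print(f"{len(AUDIT_LOG)} auditable deletion records for u123")
```

A log like this answers "how much proof can we provide?" only as well as the fan-out is complete - which is why the system inventory Jake mentions next is the real foundation.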

Debra J Farber:

Yeah. And are you seeing that, as companies adopt tools - data discovery, mapping, and DSAR delivery tools and such - they're implementing the tools without first defining the processes for how those tools will be used? Or is it something else? What are some holistic approaches to implementing a streamlined, compliant DSAR process that has minimal disruption to the business?

Jake Ottenwaelder:

Yeah, it starts with, again, understanding the processes of the business and understanding where the data is going. A lot of organizations that I've seen will go to the bells-and-whistles companies in privacy technology and purchase the best tool, but then they'll sit there and not know what to do with it, or they'll connect it to a couple of systems and think they're done.

Debra J Farber:

Right, right.

Jake Ottenwaelder:

It takes somebody who's done it before and has the mindset and understanding that you have to look at every system. Hopefully, IT has a list of systems; if not, you have to build a system inventory. Then you have to go through and understand how people are logging into each system - because if people are logging in through Okta or some other SSO service, you have to make sure that you're deleting there, or determine whether you have to delete accounts on the application itself. It's about who's accessing the system and what they're doing. Is it a personal account or a business account? For instance, if you're a recruiter for a company and somebody says, "Delete my data," that recruiter should technically go into their LinkedIn profile - because they were acting as an agent of the company - and delete their messages with that recruit if they no longer want their data collected by the organization. That's a massive edge case that I believe should be part of a DSAR process.

Debra J Farber:

It makes sense. I mean really, LinkedIn just becomes an extension of your CRM.

Jake Ottenwaelder:

Yeah, exactly. There's no API that's going to do that, and you can't force it. . . there's no way to do that without it being a manual process. Really, I think DSARs are a stopgap - and this is going to be my hot take for the episode. I think DSARs are not the best solution when it comes to deleting data or worrying about it. I think we should take more of an approach connected to marketing emails or email addresses: if somebody doesn't respond or interact with your platform over a certain period of time, that data should just automatically be deleted.
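
Jake's hot take amounts to retention-driven deletion rather than request-driven deletion. A minimal sketch of an inactivity sweep follows, with the retention window, store, and names all hypothetical - a real policy would come from legal review, and deletions would fan out per system as in the audit sketch above:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative window, not a legal threshold

# Hypothetical user store: user id -> timestamp of last interaction.
LAST_SEEN = {
    "u123": datetime(2023, 1, 15, tzinfo=timezone.utc),
    "u456": datetime.now(timezone.utc) - timedelta(days=30),
}

def sweep_inactive(now: datetime) -> list[str]:
    """Return users whose data should be purged for inactivity."""
    return [uid for uid, seen in LAST_SEEN.items() if now - seen > RETENTION]

for uid in sweep_inactive(datetime.now(timezone.utc)):
    print(f"scheduling deletion for inactive user {uid}")  # u123 only
```

The appeal is the point Jake makes next: a default retention sweep protects everyone, not just the people who know DSARs exist.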

Debra J Farber:

Yeah. Data minimization and just the principles of just keeping data accurate and up to date and all of those things. Yeah, absolutely.

Jake Ottenwaelder:

Yeah, because I think a lot of DSAR processes are only being utilized by the people who know that they exist. I would tend to say that those are people who are more highly educated about privacy in general. So we have to ask: is it a discriminatory practice against people who might not be as well educated? I, as somebody working in privacy, believe that privacy should be a human right for everybody. How can we make DSARs a process that everybody's aware of and everybody can participate in, even if they don't have the technical background or understanding to do so?

Debra J Farber:

I think that's really great advice. Do you have any words of wisdom to leave the audience with today?

Jake Ottenwaelder:

Yeah, hopefully my voice wasn't too annoying to the rest of the audience.

Debra J Farber:

You have a great microphone you're using. It's been wonderful.

Jake Ottenwaelder:

Thank you. I really appreciate the time, Debra, and you having invited me onto the podcast. The last word that I would like to say - just my final thought - would be around this concept of being a servant leader. I think in privacy, we need to understand that we're providing a service to the rest of the organization and we are leading by example and we are serving others through our leadership and through our advocacy. Just understanding that maybe you're not going to see the rewards every single day. Take time for yourself. Make sure that you understand that you are hopefully fighting the just fight, but really being a servant leader and uplifting others to enable them to do privacy on your behalf. That's how we leave the best legacy and continue to grow adoption and education of privacy.

Debra J Farber:

That is great advice. Thank you, Jake, for your servant leadership here and for sharing your wisdom with the larger audience. Jake, thank you so much for joining us today on The Shifting Privacy Left podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.

Introducing Jake Ottenwaelder, Principal Privacy Engineer at Integrative Privacy
How Jake moved from Security Engineer to Privacy Engineer and how he shifted his mindset to foster solutions for privacy
Jake talks about how his previous role at Axon helped him shift into a privacy mindset
Jake explains why taking an integrative approach to privacy is important to him
Jake's definition of a 'successful implementation' and what makes for a 'bad implementation' or not as successful
Jake discusses consequences for bad implementations of privacy, like technical debt
Debra & Jake discuss the challenges of working in privacy at an engineering-heavy organization, where you want to understand the privacy implications of everything but usually don't have the ability to do so
Jake shares common blockers to the deployment of anonymization in orgs
Jake shares the current blockers to implementing Consent Management capabilities into organizations
Jake describes the current blockers to implementing rights management capabilities through DSARs
Jake explains why it's important to have a servant-leader mindset in privacy
