The Shifting Privacy Left Podcast

S3E3: 'Shifting Left from Practicing Attorney to Privacy Engineer' with Jay Averitt (Microsoft)

Debra J. Farber / Jay Averitt Season 3 Episode 3

My guest this week is Jay Averitt, Senior Privacy Product Manager and Privacy Engineer at Microsoft, where he transitioned his career from Technology Attorney to Privacy Counsel, and most recently to Privacy Engineer.

In this episode, we hear from Jay about: his professional path from a degree in Management Information Systems to Privacy Engineer; how Twitter and Microsoft each approach privacy, and how to determine privacy program maturity; several of his privacy engineering community projects; and tips on how to spread privacy awareness and stay active within the industry.


Topics Covered:

  • Jay’s unique professional journey from Attorney to Privacy Engineer
  • Jay’s big mindset shift from serving as Privacy Counsel to Privacy Engineer, from a day-to-day and internal perspective
  • Why constant learning is essential in the field of privacy engineering, requiring us to keep up with ever-changing laws, standards, and technologies
  • Jay’s comparison of what it's like to work for Twitter vs. Microsoft when it comes to how each company focuses on privacy and data protection 
  • Two ways to determine Privacy Program Maturity, according to Jay
  • How engineering-focused organizations can unify around a corporate privacy strategy and how privacy pros can connect to people beyond their siloed teams
  • Why building and maintaining relationships is the key for privacy engineers to be seen as enablers instead of blockers 
  • A detailed look at the 'Technical Privacy Review' process
  • A peek into Privacy Quest's gamified privacy engineering platform and the events that Jay & Debra are leading as part of its DPD'24 Festival Village month-long puzzles and events
  • Debra's & Jay's experiences at the USENIX PEPR'23; why it provided so much value for them both; and, why you should consider attending PEPR'24  
  • Ways to utilize online Slack communities, LinkedIn, and other tools to stay active in the privacy engineering world




Copyright © 2022 - 2024 Principled LLC. All rights reserved.

Jay Averitt:

I think that's where the biggest challenge is for companies - having privacy have that voice, and having that voice early. Instead of, "Hey, we're about to release this, does this look okay?" If you're in that situation where someone just says, "Hey, we're about to release this, does this look okay?" it's going to be nearly impossible for privacy to make any kind of dent. You might be able to say, "Hey, there's this one thing you should do," but you can't really hold up a release. But if you're embedded early in the design process - that's where a company really shows maturity from a privacy standpoint: how early is privacy being consulted? And then, on top of that, is privacy actually being listened to as part of the process?

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber.

Debra J Farber:

Today, I'm delighted to welcome my next guest, Jay Averitt, Senior Privacy Product Manager and Privacy Engineer at Microsoft. Jay transitioned his career from a technology attorney to Privacy Counsel and, most recently, to Privacy Engineer. Jay is an active member of the privacy engineering community. He serves as an evangelist in this burgeoning field. I'm delighted to interview him today on this Data Privacy Day episode. This episode is actually going to come out about two days after Data Privacy Day, which is January 28th every year, so we're going to chat about his career path, his experiences in privacy engineering roles at companies like Twitter and Microsoft, some of the community projects that he's working on; and, of course, ways to spread privacy awareness. Welcome, Jay.

Jay Averitt:

Hi Debra! Thanks so much for having me. I've been a big fan of yours and this podcast for a long time, so it's really an honor to be on the podcast. So, thanks again for having me.

Debra J Farber:

Oh, absolutely. Well, thanks for sharing your insights. I'm really excited for this discussion about privacy engineering as we see it shaping up in the industry. Why don't we kick off our discussion with you telling us a little bit about your origin story? I have some follow-up questions I'm really eager to ask you about, but I feel like it's going to set the tone for the rest of the conversation.

Jay Averitt:

Yeah, sure. I grew up just being really infatuated with technology from an early age. It was funny. Just the other day, my wife was looking at a video of all of us talking about what we wanted for Christmas as children - I was in the ninth grade at the time - and she came back to me and she said, "What exactly did you ask for for Christmas?"

Jay Averitt:

And, I said that was a 14.4 baud modem. And she just laughed and she's like, "God, you were such a dork then," and you know that kind of sets the tone of how I was as a child - just really infatuated. I grew up thinking, you know, "Hey, I'm going to maybe do something with computers." But I also was really interested in the law from reading books - the John Grisham books really got me interested in law - and my dad had a. . . he wasn't a lawyer, but he was a risk manager, so he had a lot of workings with lawyers, and I got a lot of contact with lawyers, and so I was also interested in that.

Jay Averitt:

When I got to college, I was trying to decide if I was going to major in something technology-related or go the traditional political science route and then go to law school. Luckily, my dad really encouraged me to major in something technology-related, and so I majored in Management Information Systems and graduated college with a Management Information Systems degree. It was during the boom in 2000, when lots of people were looking for programmers. So, I ended up getting a job at NCR as a C++ programmer, and I started seeing a lot of the programmer jobs being shifted off to India and overseas.

Jay Averitt:

During my first couple of years there, I got a little nervous. I was like, "Whoa, maybe there won't be software engineers in the future." Well, you know, crystal ball - now I can see that that was completely wrong. But, I was like, "Maybe this is the time to pivot and go to law school." So I went to law school and, in law school, looked for ways to use my law degree with technology and tried to think about how I could do that, and started out doing software license agreements, which was great. But, at the end of the day, when you're working on a software license agreement, you could be working on a contract to sell a house or sell a car or any of that. So, it wasn't really playing up to my tech skill set. But in 2010, 2012, I started seeing cloud really come into play in what I was doing from a software licensing perspective, and I got really curious about the privacy impact. In the early 2000s, everything was being installed on computers themselves or servers, not out in the cloud somewhere else, and so I started thinking about privacy. I got my CIPP certification way back in 2012, when it was all pencil and paper, and it really planted a seed. When I was primarily a software license attorney, I really looked for ways to do privacy work. Every time I had an opportunity to do privacy work, I just really loved it. So, that kind of led me down the path of, "Hey, I'm not really loving practicing law and looking at contracts all day. What can I do that is privacy related, but not doing that?"

Jay Averitt:

And until the GDPR hit, there really wasn't an avenue for privacy engineers. I mean, there were people obviously working in privacy from the tech side, but HIPAA and GLBA and things like that were just not big enough to support a lot of folks working on the technology side of privacy. But when GDPR hit, that sort of changed, and I thought consulting was a way to dip my toe into those waters, and so I did some consulting for a couple of places - PwC was the biggest. I spent some time doing consulting and I really enjoyed that work, getting to see lots of different things. But, I wanted to really focus on the tech - that's just where my love was - and work specifically on privacy tech projects for tech companies. So when I got an opportunity to do that, I did, and made the transition to working as a privacy engineer. I've got to tell you, I've loved that transition, and my days are much happier working on privacy tech projects than looking at contracts all day.

Debra J Farber:

Oh my God, do I hear you on that. Like me, you shifted your career left, addressing privacy earlier on - from a focus on law to engineering. I went to law school, too. It's not a common story, but I definitely feel a kinship with you about that, because I know the hard road of becoming a lawyer and what it takes to reframe your legal and GRC mindset around privacy and data protection into an engineering mindset - building in architecture and design and actually implementing privacy by design into tech stacks and systems, across people, processes, and technology. What was your journey like from privacy lawyer to privacy engineer in terms of mind shift?

Jay Averitt:

Yeah, I think it is, as you correctly indicated. It is a big shift in the way you are thinking, because so much of being an attorney is really looking at the risk of. . . and I'm not saying you're a traffic cop as an attorney, but you end up saying 'no' quite a bit and figuring out. . . obviously, good attorneys try to figure out ways around saying 'no,' but you're not focused in on the technology innovation happening. That's really not what you're doing. You're really looking at, "Okay, what is the risk to the organization? How can I best protect the organization?" And obviously you want the business to be successful, but you're not looking at and focusing in on the technology around it.

Jay Averitt:

So, I think that that was a difficult shift, and I really just wanted to. . . I just did the work and ended up spending 750 hours in a software engineering boot camp so I could get up to speed on Python and JavaScript and what was happening there; and, while I don't really code much, it was just great to see what architectures were out there from a tech stack perspective and just get excited about creating stuff again. I think that's the real mind shift. Yes, in my role I spend a lot of time trying to help engineers. I say, "I really think what you're doing is super cool. What can we do from the privacy side to make it a value add?" So yeah, I think the whole mindset is a bit different from the legal side, and it was refreshing for me to make that change.

Debra J Farber:

Yeah, yeah, I think you made that change at the perfect time. As you were saying, it was right after GDPR went into effect, and there was a real need to technically address a lot of data protection requirements, and you just couldn't do that without privacy-knowledgeable engineers. For me, one of the challenges that I've had my whole career is that I could see where privacy and data protection were going to go. I could see where it needed to go, and it felt like the world was slow-walking its way to shifting left. It was very frustrating for me. For me, it was so clear just from being in privacy for 18 years.

Debra J Farber:

I went and got my Certified Scrum Product Owner certification because I truly believed that you needed to embed privacy into the product development process; and I'm like, "This is where it's going to be." And I did this almost a decade ago at this point, maybe longer - before they had roles for privacy product managers - and I just gave up on trying to make that happen when there was no political will in companies to hire for that.

Debra J Farber:

So, I've had to kind of create different career path entry points for what interests me. Just to give you a little perspective of my challenge - I didn't have the information management systems background. I was an English major with a Business minor and then went to law school. Right? So, I have had to self-learn a lot. But, the process of learning is, I think, such a big draw for me. What do you think around that? I've seen your posts on LinkedIn about constantly learning and that that's really a requirement for privacy engineering. Do you want to expand upon that a little bit and add some of your thoughts about it?

Jay Averitt:

Yeah, I mean, I think you're right. A couple of things you just said actually rang really true with me, particularly the slow-walking of privacy to where it is. And, we're still slow-walking at this point, I think, because in my mind, privacy and security should be fairly close. Your organization should look at privacy almost the same as it does security, because they're equally important. But, at this point everyone knows security is super important. They know privacy is somewhat important, but I don't think it's really gotten to where it rings true to companies in the same way - there isn't that one-to-one correlation yet. I think we still have a ways to go, and I think we will get there, but we're still kind of slow-walking.

Jay Averitt:

Going back to the learning part of that - yeah, I think, to be in this field, you have to want to learn, because not only are privacy laws changing all the time, there may be different regulations coming out. Quebec just had a new privacy law come out, and I had to look at things I was doing from an advertising standpoint that I hadn't looked at before. So you might have something like that come out, but also, the tech is just constantly changing. I mean, AI has always been around, but these LLMs being incorporated into absolutely everything didn't exist until the last year or so.

Jay Averitt:

Then, looking at what that means from a privacy standpoint is something that is really new and novel, and folks are scrambling trying to figure it out. So, you have to want to learn and you have to really enjoy that learning aspect. For me, that's how I knew I was making the right shift out of law into more of a tech-focused role: in that bootcamp I was part of, I was working on a project, sitting there looking at code and working on my code, and before I knew it, something like six or seven hours had gone by - and I can tell you I never looked at a contract and lost track of time. So, I think you have to be fascinated with technology and be able to appreciate learning new things to really excel in privacy. And, I think that's one of the best things about privacy: it's sort of a haven for those who really like the lifelong learning aspect.

Debra J Farber:

I 100% agree. That, for me, is the draw: I never get bored. I have ADHD, I'm neurodiverse. I found an area where I will never get bored and, as you mentioned, there's brand new technology all the time. Maybe it's XR. Maybe it's cloud. Maybe it's AI. What's great is that now that we have privacy engineers - and a lot of them are in the big tech firms that have the ability to scale and bring technology to market rather well - they're going to need to figure out, "How do you strategically do this?"

Debra J Farber:

I feel like privacy engineering is now more about the strategy of architecting it and designing for it before you ever even get to the software development lifecycle. There are so many other engineering aspects to think about that you could actually strategize and be part of the well-architected way of deploying these technologies for privacy, data protection, security, trust, safety - all of the tangential things that you care about in addition to privacy and data protection. So, for me, I just love that it's shifting from a compliance and risk standpoint - well, there's still risk, and a compliance and legal standpoint too - to thinking more about how we could strategically work with teams to bring this to life and be part of the innovation development process. Not the people who always say 'no' and make it harder, but the people who enable you to unlock the value of your data - maybe in data science - and be able to share data or use data and train on data where previously you couldn't, because you're deploying privacy enhancing technologies, or you've got a new architecture with the privacy constraints built in so that misuse of data can't really happen, and things along those lines. So, I'm just loving the fact that we can really be part of the value proposition. I mean, again, 18 years in this space - it's taken a long time for me to finally feel like I've got the roles - and I work for myself in consulting and such - the roles that feed my happiness.

Debra J Farber:

Yeah, so speaking about 'feeding your happiness,' can you compare for us your experiences working at Twitter versus Microsoft? Obviously, I don't mean good versus bad, like 'spill the tea.' I mean, that would be nice, but that's not what I'm asking for. More, just compare the ways that privacy is set up, and maybe some of the goals of each company that might be different. I know Microsoft has a really distributed privacy responsibility based on each team, each product line, along those lines.

Jay Averitt:

Yeah, it's funny. I think working at Twitter is probably about what you think working at Twitter is from the outside world. [Debra: Pre-Musk or post-Musk?] Yeah, I was going to say, in the pre-Musk era. I can't really speak to what it's like to work at X. But, in the pre-Musk era, Twitter was this app that happened by accident. The founders of Twitter didn't know exactly what it was going to be, and it turned into this giant app that everyone was using and became extremely popular. So, with that, they didn't really know what they were doing from a privacy standpoint, or weren't even thinking about it at the time. So, they had a lot of stuff happen.

Jay Averitt:

The FTC came in, made some consent orders; and so, they really put, I think, a good team in place to make their privacy program more mature. They had Lea Kissner in there, who's just amazing in privacy - probably the most prominent privacy engineer, or one of the top two or three, if not the most prominent - and the whole team there at Twitter was really great, just a bunch of super smart folks. The framework was there to really get things done. I mean, even building a framework is difficult. I've seen companies, especially startups, that just don't have a framework; and until you have that framework and program in place and you've convinced the org to actually shift privacy left and have privacy engineers involved in the design process, it's tough to put that program in. Even knowing where your data is and all that is tough. So, Twitter had all of that foundation in place. It was really kind of trying to get buy-in from engineers to follow the framework that was in place. None of the processes were formalized to the degree that they are at Microsoft, for example, but they had the right framework in place.

Jay Averitt:

As far as the culture at Twitter - I mean, like I said, it's about what you expect. It felt like a very young organization even then. Twitter had been around for a while, but it still kind of had that startup feel to it. And then I transitioned to Microsoft. I guess, until I got there, I really hadn't seen what a mature privacy org looked like. I think there are probably a couple of other companies out there that have super mature privacy orgs - I know Google has a pretty mature privacy org, and some others do as well - but I hadn't been on the inside to see anything like Microsoft before.

Jay Averitt:

There already was this buy-in from engineers - buy-in from pretty much the org in general - that, "Hey, look, we're really going to shift privacy left. We're going to consider privacy early on in the process." And I've got a team of folks that really get privacy, and I'd say privacy is extremely important there. I can't say that every privacy problem is a thousand percent resolved to my satisfaction, but I will say that if I say something from a privacy standpoint, it'll definitely get looked at, it'll be considered, and it's really valued. I think that's where the biggest challenge is for companies: having privacy have that voice, and having that voice early. Instead of, "Hey, we're about to release this, does this look OK?"

Jay Averitt:

If you're in that situation where someone just says, "Hey, we're about to release this, does this look OK?" it's going to be nearly impossible for privacy to make any kind of dent. You might be able to say, "Hey, there's this one thing you should do," but you can't really hold up a release. But if you're embedded early in the design process - and there are a thousand things there; for example, Microsoft will have a private preview or a public preview before anything ever gets to some kind of release, and they're meeting with me before they even test it out on Microsoft employees. So, I think where a company really shows maturity from a privacy standpoint is how early privacy is being consulted and then, on top of that, whether privacy is actually being listened to as part of the process.

Debra J Farber:

That's pretty awesome for a large company. I've not had a positive experience in a large company like that, where they truly understood their mandate for privacy and data protection and staffed the organization accordingly. You know, I am envious of your experience there. I wish that upon everyone, though. I think that that is how it really should be. Right? I mean, we need to get to that point. I do wonder, though - even where I have seen privacy responsibilities, whether in engineering or risk or whatnot, in a large engineering-focused organization - like, for instance, I'm pulling from my Amazon experience - one of the challenges I had was I might have an understanding of my product offering or the unit that I'm in

Debra J Farber:

for me that was AWS or Prime Video; but, there was just too much going on across the organization, and it was just too distributed a responsibility - I really had no idea what was going on outside of our individual business unit. And, I wonder to what extent you have suggestions - in the spirit of Data Privacy Day - for how organizations, especially engineering-focused ones, can better message, "Here's how we can unify as an organization." You know what I mean? So, have more of a unified kind of perspective.

Jay Averitt:

I think you bring up a good point. I mean, you know, there's 220,000 employees at Microsoft, so me getting to know all of them is impossible.

Debra J Farber:

There's over a million at Amazon, over a million. I worked at IBM and I said I'll never work for a company this large again and somehow I ended up at a million person company.

Jay Averitt:

Right. So it's impossible.

Jay Averitt:

You can't know everybody, and you can't even know. . . I feel like I know a decent amount of folks in privacy, but I don't know everybody. But, I do think there's pretty good unity. For example, I'm on the Office 365 side, so we're looking at things like Exchange and OneDrive and SharePoint and stuff like that. There are other teams working on things like Word and Excel and all of that, and I know who those folks are, and there's a lot of overlap. For example, if I see something that I'm working on that's impacting their space, I certainly reach out to them and say, "Hey, I think this might look okay from a privacy standpoint, but what do you think?" And then vice versa. And then there are things like - the other day somebody, ironically enough, was asking me about this. There's apparently a OneDrive podcast, and I support OneDrive, and they wanted to do a giveaway for people who left reviews and stuff like that. And they're like, "Hey, can you look at this from a privacy standpoint?" And I was like, "Well, this is just different," because usually I'm consulting with engineers about their designs and stuff. I said, "I can certainly give you my thoughts on this from a privacy standpoint, but we probably need to involve some people from our marketing privacy team to look at this, because I may not know all the ins and outs of giveaways, and there are probably some legal ramifications, so maybe we should look at that."

Jay Averitt:

I think it's impossible to know an entire. . . if you're at a giant company like these big tech companies, you can't know everybody. Privacy is not going to be giant, even at these big tech companies, so you can at least get to know a few folks across each division, and then when stuff pops up, you can say, "Oh, who's there?" And I guess I've also got the good fortune of having a really senior team, so if something pops up and I ask, "Hey, who's in marketing privacy?" they know. So that's lucky.

Jay Averitt:

But, if you don't have that team, I think making those introductions outside of your matrix into other teams is important, because unity in privacy is great and building those relationships is great. Even - I guess just from my post on LinkedIn - I had somebody from LinkedIn's Privacy team reach out to me over Teams, since Microsoft owns LinkedIn. I introduced myself, and while I don't have a lot of interaction with LinkedIn Privacy, I was like, "Oh, that's cool that I'm getting to talk with somebody from LinkedIn Privacy." So I think the more interaction you can make, the better, and I'm really big on just having a community of privacy, because ultimately - like we were talking about with the whole learning aspect of this - nobody knows everything, and there are plenty of blind spots I've got, and I need a bunch of people to bounce things off of; I think we all do. The more you can do to reach out beyond your specific team to build those relationships, the better.

Debra J Farber:

Yeah, that's awesome. Thank you for that. We talked a little bit earlier about how privacy engineers can leverage their experience and role to enable privacy design, research, architecture, development and data science without being seen as a blocker to the business. We talked about how it is a challenge, but how can privacy engineers actually be an enabler rather than a blocker? What are some tips that you've picked up in the trade?

Jay Averitt:

Yeah, I think it's all about relationships. When I'm talking with engineers, especially new engineers who are going through a privacy review, a lot of them are almost nervous because they don't know what to expect. They don't know what they're looking at, and I try to build a relationship with them and say, "Hey, look, we're on the same team. I love innovation. Let me see what this cool thing is you're doing, and let's just look at how we can make sure that privacy is being [inaudible]." I may look at it and be like, "Hey, you did it just exactly the way I would have done it, and there's no need for me to really inject any further privacy into it." But if there is a need, it's usually, "Hey, did you consider this?" And a lot of times it wasn't intentional that they made it less privacy-focused - maybe they just didn't think about it from that perspective. So I think

Jay Averitt:

the first tip I would say is really build those relationships with the engineers and try to show them that we're all on the same side and how privacy can be a value-add to the organization. I think the second tip is: nobody likes a traffic cop, so just being a blocker and saying 'no' is not going to win you any friends or make you valuable to your organization. How you can be valuable to your organization is to say, "Hey, look, this is a way we can make the product more privacy-friendly and make our users love our product more." Ultimately, if the user loves our product more and trusts our product more because it has the best privacy features, that makes it more innovative. It serves our overall goal of increasing the bottom line. So I think that's what I would suggest.

Debra J Farber:

Yeah. That makes a lot of sense. Appreciate it. Let's turn now to some of the work you do. What is a technical privacy review? I know you're always such a good sharer of information to educate others. You're an evangelist of privacy engineering, like myself, so I really appreciate that. You're constantly posting your thoughts, and I did see that you posted recently about people asking what a technical privacy review is, and then answered it. So I figured this is a good spot to ask you that again. [Jay: Yeah, sure] You have an answer all ready.

Jay Averitt:

Yeah, I think a technical privacy review is something that I didn't know a whole lot about until I actually started doing it, so it's interesting.

Jay Averitt:

I mean, a lot of it is being concerned about the data flow. Specifically, the engineers will bring something to me, and the first questions I'm asking are really around, "Hey, what new data are we collecting as part of this feature?" Then, once we find out what that new data is, we figure out the classification of that data. Each organization has different levels of sensitivity for data, but based on how sensitive that data is - and specifically, if it falls into the categories where the GDPR would require you to honor a DSAR [a DSAR is where a data subject, an individual, makes a request regarding their information, such as to have it deleted] - we want to make sure that we're only retaining that data for a certain period of time, because we want to be able to honor those requests and things like that. So, it's really looking at: 1) the data being collected and how sensitive it is, and then 2) how the data flows. We look at data flow diagrams to see how the data flows from the front-end application to the back end - really looking and seeing, "Okay, how long is it staying in that database?" - and making sure that everything coincides with our data retention schedules and all that. So, that's a large part of it. And then, after we perform that review, we make recommendations like, "Hey, you are retaining it for this period of time. Why do you need to retain it for that period of time?" And then, after you ask that question, if they do need to retain it for that long, asking, "Okay, is there some way we can de-identify the data?"

Jay Averitt:

So, it's really looking at all of that. And it's not just me; we also have compliance folks and attorneys on the phone who will analyze things from a GDPR standpoint.

Jay Averitt:

And so, with that, I kind of see my role as more of an interpreter for the attorneys, because looking at these data flow diagrams can get really complicated, specifically when you're looking at all these repository names and all the different types of data.

Jay Averitt:

So, it's really being able to distill it for the attorneys: "Hey look, we're collecting this level of data, the sensitive type, and we're retaining it for this long. Does this make sense from a GDPR standpoint, or do you think there are some safeguards we should put in place?" Or, specifically if we're looking at GDPR, looking at the data flow to Europe - whether it's being processed in Europe or in America - and figuring out if we can do that. Compliance would also look at what we have in our contracts with our customers to see, "Hey, can we process this in this geographical area?" and things like that. So, to sum it all up, it's really looking at: 1) the data being collected - what type of data is being collected? - and 2) the flow of that data. That's really the crux of a technical privacy review.
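[Editor's note] The review loop Jay describes - classify each new field, check its retention against what its sensitivity tier allows, and trace where it flows from front end to back end - could be sketched roughly like this. The tiers, retention limits, and field names below are all illustrative assumptions for the sketch, not Microsoft's (or any real organization's) actual taxonomy:

```python
from dataclasses import dataclass

# Hypothetical sensitivity classification; real organizations define their own taxonomy.
SENSITIVITY = {"telemetry": "low", "email": "high", "location": "high"}

# Illustrative retention ceilings in days per tier, assumed for this sketch.
MAX_RETENTION_DAYS = {"low": 365, "high": 30}

@dataclass
class DataField:
    name: str            # e.g. "email"
    retention_days: int  # how long the feature keeps this field
    stores: list         # systems it flows through, front end to back end

def review(fields):
    """Flag fields whose retention exceeds the ceiling for their sensitivity tier."""
    findings = []
    for f in fields:
        tier = SENSITIVITY.get(f.name, "high")  # unknown data defaults to most restrictive
        limit = MAX_RETENTION_DAYS[tier]
        if f.retention_days > limit:
            findings.append(
                f"{f.name}: retained {f.retention_days}d in {' -> '.join(f.stores)}, "
                f"limit for '{tier}' tier is {limit}d - justify the retention or de-identify"
            )
    return findings

# A feature proposal brings two new fields to review:
for finding in review([
    DataField("email", 90, ["web-frontend", "user-db"]),
    DataField("telemetry", 180, ["client", "analytics-db"]),
]):
    print(finding)
```

In this toy run, only the email field is flagged (90 days exceeds the assumed 30-day ceiling for high-sensitivity data), which is exactly the kind of finding that would prompt the "why do you need it that long, or can we de-identify it?" conversation Jay mentions.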

Debra J Farber:

Awesome, thank you. That's definitely some insight into technical privacy reviews, and there's not any one way to do it; there are going to be different workflows and processes for each company. But I think that's a great high-level overview of the typical process. That makes sense. So, like I said, Data Privacy Day, when this is published, will have just happened two days prior. What are some of the Data Privacy Day activities you're participating in this year, especially around privacy engineering - but really, whatever you're involved in?

Jay Averitt:

So, I'm actually speaking at the Privacy Everywhere Conference at the University of Illinois, talking with Saima Fancy about LLMs in the privacy world and the AI governance around them. And then, I think the day before, on the 25th, I'm giving a Fireside Chat for Privacy Quest, which I think you're involved with as well.

Debra J Farber:

Yeah, why don't you tell us a little more? I kind of threw this softball because I wanted you to talk it up.

Jay Averitt:

Yeah, I think the work that's being done by Mert Çan is really interesting. You know, security. . .

Debra J Farber:

Just in case anyone didn't hear, he's talking about Privacy Quest. If you don't mind, give us a sense of what that is before the analysis.

Jay Averitt:

So, Mert Çan kind of gamified privacy and has different games and simulations that, instead of just, "Hey look, here's a quiz on privacy," or "Here, let me tell you about privacy," actually make it more of a game. And, you know, security does this. . .

Debra J Farber:

. . . a Capture the Flag kind of competition.

Jay Averitt:

Right, exactly. But privacy doesn't - maybe some places do, but historically it hasn't been done very well. So, I think the work he's doing is really interesting, and he's organized a Festival around it where he's got different AI teams working, one on one side and one on the other, and he's got a number of Fireside Chats. I think you're hosting a Game Night.

Debra J Farber:

I'm doing a Quiz Night - three different quiz nights, actually - but yeah, he's got all of these events to gamify the learning experience for privacy engineering. He's got a whole platform; that's what Privacy Quest does. But this is using Data Privacy Day to create its own separate set of events over one month, so it's not just Data Privacy Day.

Debra J Farber:

When you're listening to this at the end of January, know that it goes on into mid-February as well. There are all sorts of events with the two different factions - the AI doomers versus the AI optimists - and you have to pick one and you'll be part of a team, because everybody who joins will be completing tasks toward experience points for their team. You also get to play through a storyline that walks you through the setting, and he's using AI-generated images to go along with the narratives and the games, pulling it all together with different puzzles and learning modules around privacy engineering that are super fun. So, to anyone who's listening, tune in for Jay's Fireside Chat and definitely check out the platform too, because it's perfect for privacy awareness any day of the year. But then, of course, this event is kind of fun too.

Jay Averitt:

Yeah, I'm super interested in your Quiz Night and seeing how that goes. I think what he's doing is super interesting, and I think it's needed, because we've really not done a great job in privacy of making it fun. That's actually one of my goals: making a privacy training that's actually fun to take, because I think security does that, but I don't think privacy does it very well.

Debra J Farber:

Yeah, this is also something you'll see if you ever go to DEF CON. For those who are listening, I go every year; my other half is a hacker, so this is a thing we do. I've been somewhere between six and eight times - I'm not sure exactly, but I think around six.

Debra J Farber:

Anyway, these are the types of - they call them CTFs, Capture the Flag games - that gamify it, because you get red team versus blue team. You can take different personas, so you can say, "Today I'm gonna be a blue teamer and see what it's like to play this game thinking in terms of defending my systems," versus, "I'm on the red team; I'm trying to find the vulnerabilities," shifting your mindset into how you would think in terms of attacking. That's traditional in security. For privacy, it's not gonna be so clear-cut as defend and attack; but you're starting to see that in AI attacks for privacy. You can start seeing how, if you threat model for privacy and the potential privacy harms, you could simulate it very similarly to how security has done it. And - this is a little premature secret announcement - we may be at DEF CON next year. I think this is our plan.

Debra J Farber:

I'm a formal Advisor for Privacy Quest - sorry, I should probably have stated that upfront - but we're trying to kind of move in there. So many engineers go there; there's an opportunity to capture the interest of the security engineers who have an overlapping interest in privacy and data protection within their organizations. Then, last year, Jutta Williams started a really successful AI Village event where she also did a CTF, for people to attempt to get LLMs - all the different base models that exist - to output something they shouldn't. Right? Gamifying that experience, and then being able to build a comprehensive library of potential challenges that can be addressed through threat modeling specifically. And that was a success. So, I'm starting to see more privacy and AI CTFs, tangential to security, happening at DEF CON. Hopefully we can make that happen this year. It's our goal.

Jay Averitt:

Yeah, it's super exciting. I hope you guys do make it to DEF CON. I'd love to see that.

Debra J Farber:

Yeah, I think it's just a matter of the logistics. Everything at DEF CON is community organized. So, someone has an idea for what they call a Village - a Car Hacking Village, a Plane Hacking Village. . . these are real villages that happen at DEF CON. They bring in a plane; they bring in a car; there's a Voting Machine Hacking Village, right, and that's hardware and software. They have so many different villages, and so it's just a matter of getting the space, getting on the agenda, and making sure you have the internet connectivity - a dedicated safe line, because you're certainly not going to use the most attacked network in the world, the DEF CON Wi-Fi network, during that week. All right, I'm talking way too much about this, but I'm really excited. Okay, I was also delighted to see that we're both sitting on the Programming Committee for the PEPR '24 conference - the Privacy Engineering Practice and Respect USENIX Conference. Are you excited for the conference this June, and how has attending PEPR been of value for you?

Jay Averitt:

Oh my gosh. I mean, yeah, I'm super excited. PEPR '23 - I've never been at a conference that I actually got so much value out of. I mean, usually at a conference. . .

Jay Averitt:

I never find real value in the programs being presented - there may be one or two out of 50 that you find interesting - and then the networking aspect is so difficult because there are so many different people doing so many different things. For example, not to throw IAPP under the bus, because I'm speaking at the Global Forum too, but it's just a different atmosphere. When you've got so many lawyers and so many people in privacy doing so many different things, it's hard to find people doing exactly what you do.

Jay Averitt:

At PEPR, while privacy engineering is a big umbrella, you're surrounded by like 200 people who really get what you do. It's great for networking, and the programs were so good - even some that were above my head. I understand differential privacy and what you're trying to accomplish, but I can't do it myself; it's fascinating. So, yeah, like I said, it was the best conference I've ever attended. I'm super excited about going this year, and I plan on going every year that it's available - and I hope everyone does, honestly, because I think the sense of community we have in privacy is great. Just getting to chat with all these folks who I'd interacted with on LinkedIn or in other places was wonderful, and it was easy to do, instead of having to seek people out at a giant conference.

Debra J Farber:

Yeah, I agree with everything you just said. I came back just feeling on a high. It's a two-day conference, and you might remember me up front, sitting next to Jason Cronk, writing constantly - I'm old-school; I still write notes. Typing isn't the same thing for me; my brain learns as I'm writing. I was thinking, here are all these amazing topics and interesting presentations, potential speakers for this podcast - just a wealth of information.

Debra J Farber:

I was just really astounded by, again, the networking opportunity, but also that everyone was excited to be there. No one was there because "my company sent me; I have to be here." This was something people fought to get budget to go to - not that it's an expensive conference. It's actually pretty reasonable; it's a nonprofit, and it's not vendor-heavy or anything. It's pretty much sponsored by companies that want to hire privacy engineers, so they have their recruiters there, if anything. I met some of my heroes. I think I even met you there for the first time in person.

Jay Averitt:

Yeah, we did. Yeah, we did meet there for the first time.

Debra J Farber:

Yeah, it was absolutely wonderful, and I urge people who are interested in privacy engineering - or if this is your main focus - this is a conference not to miss. It was just such an exchange of ideas. Everyone in privacy engineering is coming from different perspectives, right? Somebody who's deep into differential privacy is not usually deep into another PET. And the people who are deploying PETs know the different open source libraries and deployment mechanisms; they may be deep in the data science, but they're not necessarily crossing over into being deep in all of the privacy enhancing technologies. So, beyond those people talking about their deployments, we also heard from governments, like the government of Singapore, on some interesting implementations they did - I'm trying to remember what it was, but it was just a cross-section of industry and academia and evangelists, and just a happy place. I know this conference is going to have thousands of people in the future.

Debra J Farber:

I believe that; I totally see that. So, if you really want to get in on it now, while it's small enough to feel super manageable as a conference, you can easily talk to anyone there - everyone's just excited to meet others who are interested in this space and share ideas. I'm still working on a project that I found by attending PEPR and meeting a company with a consulting firm; I'm a subcontractor to them now, working on a California DMV mobile driver's license and verified credentials project. I can't tell you how many dividends attending PEPR has paid, so thank you for sharing your experience. I feel like I'm going on and on about mine, but since we're both Programming Committee members, I just want to underscore that we jumped at being on this Program Committee because of how wonderful an experience it is. It is my delight to volunteer.

Jay Averitt:

Absolutely 100% echo everything you just said. I can't speak highly enough about the conference.

Debra J Farber:

Awesome. So, we're getting towards closing. I know we could go on all day just talking about our love of the space, but as people try to keep up with what's new in the discipline of privacy engineering, I'd love to hear what resources you refer people to and what communities you might plug into. How do you stay up-to-date?

Jay Averitt:

Yeah, for starters, listen to this podcast! But no, I think podcasts are actually a good source of information - not just this podcast, but other privacy podcasts out there. I learn something from listening to them, because privacy engineering is such a big umbrella; everybody's doing something a little different. So, I love hearing other people talk about what they're doing, and I learn from that. But I also think LinkedIn. . .

Jay Averitt:

Actually, looking at your feed - connecting with a bunch of folks who are in privacy and seeing what pops up - is great. I learn a lot from stuff coming across my feed, and I think that's a good source of information. As far as building community, I think LinkedIn is another way of doing that. There are various Slacks out there that have tried to create a privacy community, but there's not one I can fully recommend at the moment. So LinkedIn and listening to podcasts really are the primary ways I do it.

Debra J Farber:

Interesting. Those are the primary ways I do it, too. I was curious if there were any Slack groups where you're like, "Oh, this community is just totally hot right now," but I haven't really found one myself. I do know there are communities like openmined.org, where, if you are actually working with data science tools and are super technical and want to understand deploying PETs to unlock the value in your data sets, you can go really deep. There are 16,000+ community members in their Slack group actively working on deploying these things, taking their free courses, and asking for help from mentors who volunteer there. But it's not one I participate in because, again, it's too technical for my purposes. Yeah, those are the ones I follow too.

Debra J Farber:

I used to follow a lot more on Twitter. I'm just not on X anymore as much; I occasionally go there to see what crazy is going on on the platform, but I don't really use it. I think there's an opportunity out there for communities to be stood up. There's a thirst from privacy engineers to be able to ask questions of other people, so maybe I'll try something with the Shifting Privacy Left brand, but I would need some partners - reach out if you'd like to work with me, and maybe Jay and some other evangelists out there too, to bring that to life. Just before we close, what words of wisdom do you have for getting into privacy engineering?

Jay Averitt:

Yeah, I think it starts with that passion and that love of learning, and there are some resources out there. I think the privacy classes that are out there, like Privado's, are good places to start, and there are some good books, like 'Data Privacy: A Runbook for Engineers.' Things like that give you an overview. If you're really wanting to build up your tech skills, coding is one way to do it; but I think more important than that is understanding data flows - being able to understand, articulate, and distinguish between different data types, and then understanding how the data flows from the front end to the back end. Those are critical skills. I also think looking at LinkedIn and seeing what people who are active in the field are posting is a good way of seeing different ways you can break in.

Debra J Farber:

Awesome. Well, Jay, thank you so much for sharing your experiences and insights, and thanks for everyone else for joining us today. Until next Tuesday, when we'll be back with engaging content and another great guest or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

The AI Fundamentalists

Dr. Andrew Clark & Sid Mangalik

She Said Privacy/He Said Security

Jodi and Justin Daniels

Privacy Abbreviated

BBB National Programs

Data Mesh Radio

Data as a Product Podcast Network

Luiza's Podcast

Luiza Jarovsky