This week, we gain insights into the profession of privacy engineering with guest Menotti Minutillo, a Sr. Privacy Engineering Manager with 15+ years of experience leading critical programs and product delivery at companies like Uber, Thrive Global & Twitter. He started his career in 2007 on Wall Street as a DevOps & Infrastructure Engineer; and now, Menotti is a sought-after technical privacy expert and Privacy Tech Advisor. In this conversation, we discuss privacy engineering approaches that have worked, the skill sets required for privacy engineering, and the current climate for landing privacy engineering roles.
Menotti sees privacy engineering as the practice of building or improving info systems to advance a set of privacy goals. It's like a 'layer cake' in that you have different protections and risk reductions based on threat modeling, as well as different specialization capabilities for larger orgs.
It makes a lot of sense that he's held weaving roles from company to company. His journey into privacy engineering originally began as 'adjacent work,' and today, he shares lessons learned from taking a PET like differential privacy from the lab, systematizing it within an organization, and deploying it in the real world. In this episode, we delve into tools, technical processes, technical standards, the maturing landscape for privacy engineers, and how the success of privacy is coupled with the success of each product shipped.
Copyright © 2022 - 2023 Principled LLC. All rights reserved.
Debra Farber 0:00
Hello, I am Debra J. Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems.
Debra Farber 0:27
Today, I'm delighted to welcome my next guest, Menotti Minutillo, an experienced Senior Privacy Engineering Manager with 15+ years leading critical programs and product delivery in privacy, information security, compliance, corporate development, and API products. His background is a unique blend of engineering and engineering management, technical program management, corporate development, and product management. Menotti is a sought-after technical privacy expert who serves as an advisor for several privacy tech companies, including Opsware Data and Dasera. Menotti has led privacy engineering teams at Uber, Thrive Global, and most recently, Twitter. In fact, he worked at Twitter on two separate occasions.
Debra Farber 1:14
Today, we're going to talk about privacy engineering approaches that earn trust for customers and partners; the different skill sets that are part of privacy engineering; and, the current climate for privacy engineering positions.
Menotti Minutillo 1:32
Thank you so much for having me.
Debra Farber 1:34
Oh, yeah, I'm really excited for this discussion about privacy engineering, as we see it shaping up in industry. And, to kick off the discussion, I'd love to hear about your origin story. Can you share with us how you ended up working in privacy and then taking on privacy engineering roles?
Menotti Minutillo 1:52
Sure. I guess my journey into privacy and privacy engineering is a bit of adjacent work, and some happenstance, and good luck from some very kind people who took some chances on me. So for that, I'm really grateful. I started my career in 2007. I worked on Wall Street, and I was kind of a DevOps and Infrastructure Engineer. I had a concentration in college in information security, so I guess that was enough to be thrown onto the email routing and anti-spam systems for a big bank on Wall Street. So, through those responsibilities, I had spent a lot of time with folks in network security and information security because all of these things were kind of coherent with one another. So, in 2012, I moved over to Twitter for the first time, and I was responsible for Technical Program Management within the Information Security team.
Menotti Minutillo 2:46
I worked on things like identity management and SOX for IT general controls. My transition to privacy was after my time at Twitter when I went to Uber in 2016, and privacy was more of the sole specialty there. They brought me in to get the GDPR Readiness Program set up in Engineering and work with the teams there to tackle some of the fundamental stuff like data lifecycle management. So, this was where the transition point was. And it was a...you know, kind of another example of somebody taking a chance on me due to the kind of adjacency of my previous work. So, having worked with a software team there within the Information Security Group, the CISO offered me an opportunity to manage that team instead and transform it into kind of a more formal Privacy Engineering team. So, this was a software team, and we had kind of a range of skills from backend engineering to mobile and front end web; and we wound up building a bunch of those services, but also the user features that helped Uber's kind of overall privacy mission and transformation.
Menotti Minutillo 3:49
I spent a short time after that helping this company called Thrive Global set up their privacy and security program before returning to Twitter, where I tried my hand at Product Management. And so, I spent about a couple of years there on a bunch of projects around API abuse, as well as helping them build out this long-needed set of improvements to authorization and authentication in a public API.
Menotti Minutillo 4:13
And finally, kind of most recently, I swung back into Twitter's privacy organization; and, I started out as kind of a Chief of Staff to the CISO, while at the same time helping build out one of the teams within the Privacy Engineering org. So overall, I find that I kind of direct my energy - and I wound up floating towards areas that have a lot of need - which is why I've kind of gone between different job profiles. But, there's been this common through line of privacy and security throughout it all.
Debra Farber 4:43
That makes sense, and probably a through line that also includes trust - customer trust, stakeholder trust, generally. So, I could see...I mean, very often you have people coming from security into privacy and maybe not as much vice versa, but sometimes there's overlap. Yeah, I mean, I got into privacy first and then into security, and then got my CISSP and did all that as part of consulting. And, seeing the bigger picture, you know...basically, I urge people to get out of their domain and to understand the periphery. So I think that makes a lot of sense that you've had these weaving roles that aren't just one position from company to company. And that makes me ask: how would you define this term we're using these days of "privacy engineer"? I know we're in the immature days, and if you ask different people, you're gonna get a different answer. So, I'm gonna ask you so I can understand your perspective; and then, what are the skill sets you see as needed for privacy engineering roles as we mature as an industry?
Menotti Minutillo 5:40
Yeah, absolutely. I would agree that we're definitely in the early stages of defining privacy engineering, and you know, early is kind of relative. But, I would say it's, you know, really only been mainstream terminology over the last 10 years or less. There have been some companies that have been at it a little bit longer, but they were pretty unique in that, and now you kind of see it everywhere. To me, privacy engineering is this practice of building or improving information systems to advance some set of privacy goals. And usually, that involves protecting personal data, or to borrow a term from Lea Kissner, building and kind of maintaining these "respectful information systems." So this can include things like tools, technical processes, and technical standards.
Menotti Minutillo 6:28
I think, to your question about skill sets, what makes a privacy engineer is a little bit tricky because you're seeing so many applications of that term right now that even from company to company, or sometimes even inside the same company, the skillset and expectations look a little bit different. And the term is catching a lot of different skills, I think, in the same bucket and the same terminology. It would kind of be like if you were asking me to define a "security engineer." I'd probably ask, "Well, what type of security engineer? Is it infrastructure? An application security person? Or, are they a product security person?" So, right now, it's this kind of immature terminology, which is totally fine. I think, you know, over time we're going to see probably some different naming conventions, as there's higher specialization and differentiation in skillsets as the technical practice matures, which will be helpful, I think, overall, for the success of the job profile, as folks get those specializations.
Menotti Minutillo 7:24
So, you know, what I've seen personally has been a range from kind of classic software engineering that happens to work in the privacy space - so, these would be, you know, engineers writing and delivering software in the privacy domain, which might be, you know, a set of features or some back-end services that deal with user data. But then, there's also a big piece of this that could be applied research: taking something that's been studied in the academic sphere and putting that into production environments.
Menotti Minutillo 7:55
Technical Program Management is in there. I mentioned Technical Advisory, and those roles are usually based on some previous engineering or product background, with some understanding of the policy landscape and best practices, and integrating those best practices into a software development lifecycle. So, to me, it's not just an operational role, but ultimately being part of building towards those system improvements that lead you to those privacy successes. So, I think over the coming years, we'll see an increasing amount of specialization and maybe some divergence within the term "privacy engineer" into a few different areas, kind of much like we've seen in the information security space, which is overall a bit more mature when it comes to professional career paths.
Debra Farber 8:42
I totally agree with you on that. I think that security really lays out the landscape for how privacy is going to mature. So, if you're going to have the same engineers working on certain aspects of the business, you're gonna need your security engineers to help the business at a certain point and assess risk, whether it's risk of their code or whatever; it's a great place to also look at the privacy risks. Right? So there are certain - I don't know what to call them - certain gates at which the security team is looking at certain things; and that's a great place for privacy to do the same without having to completely change workflows.
Menotti Minutillo 9:19
Yeah, there's this concept of defense-in-depth or security threat modeling...you know, there are different ways to look at this, but the idea being that security is kind of this layer cake, or you can call it, you know, like lining up the Swiss cheese slices next to each other. There's this idea that you have different protections and risk reductions based on certain threat models and different kinds of capabilities, where specialization - especially in, you know, more complex organizations with more complex products - becomes necessary from the infrastructure layer, or sometimes the kernel layer, all the way to what's being built into end user products. And, I think you're gonna start to see some of that in privacy as well. I think how the infosec space has developed over time is a good model for thinking about where this will probably go with privacy.
Debra Farber 10:08
Right, with the small caveat that you're looking at information flows; whereas in security, you're not so much looking at that. You're kind of locking down the data for confidentiality, or you're looking at, you know, other aspects. So while it's a similar paradigm, it's definitely going to be slightly different. There are different risks and risk modeling needs. So yeah, I think that makes sense.
Debra Farber 10:30
I know that many of those who hold privacy engineering positions work in Big Tech. I mean, that's just where there has been so much opportunity...such immediate need for privacy engineering roles; and, this is where companies have tons of personal data to protect. And, they're often under FTC consent orders to account for their practices to, you know, a greater degree than is even normally required under law, just because they've typically gotten themselves in hot water before and now have to go through like 20 years of audits to satisfy the consent decree; and there's much public scrutiny. And those companies, they needed to deploy technical approaches and assurances at scale. So, can you tell us what it's been like to work at companies like Uber and Twitter where there are multiple privacy engineering teams? And, what approaches do you see as best practices in setting up these teams and deploying solutions?
Menotti Minutillo 11:21
Yeah, I would say Uber and Twitter were a little bit different in their approach, and not necessarily in terms of what they thought was important or not, but more about the order of operations. So, I've seen these programs develop through kind of different areas of their lifecycle, in some ways, depending on where they started. So Uber was, at least in my experience, a little more 'from scratch' during my time there. So, I got to see it from basically...I don't want to call it zero because there wasn't zero, but it was a little bit more sort of diffuse through various engineering teams until it was somewhat unified under one header, which was within our security organization, where we actually had kind of an end-to-end responsibility from the backend services all the way to the user features.
Menotti Minutillo 12:11
At Twitter, it was also kind of spread out between lots of different teams with privacy responsibilities, and it also wound up getting put under one roof. Kind of a key difference was that Twitter kept the consumer privacy features in a separate organization that was more under the Consumer Product header. So, I mean, these are vastly different products. Right? You know, Twitter being a consumer product with lots of user-generated content and features that are related to the relationships between various users, there's a lot of privacy work that would be done that's kind of core to the everyday user experience. Whereas at Uber, a lot of the privacy stuff was, you know, a little bit more obfuscated from the end user, in the sense that they're not interacting with it as tangibly on a day-to-day basis. So, these different approaches made some sense in their individual circumstances.
Menotti Minutillo 13:09
I think, for me, I'm opinionated on this due to my personal experience; I don't want to say, "It's right," but in my personal experience, I liked the approach where we were able to get a team that was multidisciplinary. So, we had software engineers under one roof at Uber that included responsibilities for the end user experience and for the user features. And not only were they under one roof organizationally, but they were actually under one roof on one team. So, literally, in one team, we had mobile engineers, backend engineers, front end web, and some of the advisory functions, all under kind of like one team. And when I say 'team,' I mean, you know, kind of a nuclear group of 10 people. And, that gave us a fair amount of autonomy and visibility. And for what it took to deliver sort of a seemingly basic functionality - like, you know, 'press this button to delete your Uber account' - there's lots of stuff that has to happen in that flow and that chain, and we were able to grasp the responsibility in one team. And, at the end, that included the end user experience.
Menotti Minutillo 14:14
But I think that worked in that circumstance, which was kind of a function of the timing of the setup for the team, the capabilities and the sort of skillsets we had for the people around us, and different, you know, organizational factors. So it's one way to do it. I very much have been in favor of, if at all possible, getting the end-to-end responsibility under one header. It certainly makes for a more streamlined environment to get that stuff out the door. There are some downsides to that, of course, in that, you know, maybe the couple of mobile engineers on that team aren't sitting amongst a team of a dozen mobile engineers where they're getting a lot of cross-learning on different topics. So, there are some downsides; but for me, for advancing the privacy mission, having this sort of expertise all under one roof helped us be really effective, and that's generally what I advise for, at least out of the gate, if possible. But, you know, it's always gonna be a little bit different for different organizations. And, you know, what works in one doesn't necessarily work in another. So, grain of salt in terms of my personal experience here.
Debra Farber 15:21
Yeah, that makes sense. I worked on some of these efforts at Amazon, specifically at Prime Video, but a lot of it was centrally managed at Amazon. And you're right. I mean, one of the reasons that Big Tech doesn't typically buy off-the-shelf data deletion software is that every company is so vastly different, and it's probably better to custom build than it is to buy anything off the shelf. You know, most privacy tech is not purpose-built for any particular company's workflows. Right? So that makes sense. What privacy outcomes should privacy engineers typically be aiming for within an organization? Or what have you seen, if that's an easier way to approach the question?
Menotti Minutillo 16:01
Yeah. I think about this from kind of two factors in my background. One is: what skill sets am I looking for in somebody who I would hire into these roles? As well as how I've thought about defining success for particular aspects of a privacy program in ways that actually worked and resonated with folks. And, I think the main thing that I try to stress here around outcomes is that there ultimately has to be a degree of pragmatism in the role. You know, much like in security roles - I'm doing a lot of analogies here, but they're kind of cousins to each other - you know, perfect privacy is to not use data. Right? It's like, let's just shut down the company and not use any data. And there we go, we have perfect privacy, because we're not using data at all. But, you know, you have a company and you're trying to deliver value to customers, and there are all of these other objectives that you're trying to meet.
Menotti Minutillo 16:56
So there's some pragmatism that you ultimately have to have when defining these success criteria. And it doesn't mean weakening privacy; I think that's a false dichotomy. But, it does mean that your success criteria should try to really align with what product success looks like. So, if you're going in to, you know, evaluate the potential for a product that a company wants to build, and your success criteria are diametrically opposed to what the business outcomes are because you're trying to achieve some perfect model, you know, it's gonna be very hard to get off the ground and to make material improvements to the privacy goals. So, first and foremost it's: how can we get these aligned? How can we get really respectful privacy practices in this product, while still allowing the product to be successful, so they don't have to be this false trade-off? You know, to borrow from Lea Kissner, who I mentioned earlier - and I really liked the way that they put this - ultimately, is the system or product you're building respectful? And that's kind of a loaded way to think about it. But then you start breaking it down. Like, will the end users feel respected? Have we treated them with respect, even if that may not be so apparent to them in the product they use every day? Can we honestly say that we've treated them with respect throughout this? Has there been a reasonable exchange of information and value in a way that was clear and fair to the end user, where they've had some choice about it, and where we didn't present them with sort of an unfair bargain about the data that they're entrusting us with?
Menotti Minutillo 18:34
And ultimately, there is an aspect of reducing risk for the organization as a result of the work. But, I try to avoid making it the cause for the work unless there's really no better way to define success criteria. You know, it's going to be a big driver of the risk reduction aspects, but I think looking at the consumer aspects around the way that people feel when they use the product - they feel respected; they feel like they understand what they're exchanging in value and for what reason - makes a big difference.
Menotti Minutillo 19:07
I think, you know, one anecdote from my time at Uber: we did a lot of research around privacy features for different regions around the world. And what we came to learn - which, you know, may seem obvious - was that at the time, some of the differences were pretty major. The stuff that folks cared about with regards to privacy, as you planted flags around the world, was pretty different. And in Uber's case, physical safety and privacy went hand-in-hand. When we were thinking about privacy features, we were coupling the success criteria with features that had to do with safety because, you know, you're getting in a car with somebody who's driving you somewhere, and it was really important to understand people's perception of that, as well as what their realities were and the real risks presented by somebody doing something untoward in the privacy space. So, I think having smart research driving those success criteria is the best possible way to go about it, and it teaches you the specifics for your product in a way that, you know, may not be generalizable to the broader market.
Debra Farber 20:18
That's an awesome answer. You know, I heard a lot of tidbits in there, from "don't let perfect be the enemy of the good" to aligning the criteria...I specifically asked about outcomes, and I think you make a lot of sense there. Right? If we're just looking at a maturity level we want to get to, and obsessing about that and why we're not meeting those criteria, that can get you stuck and feeling like it's not meeting all the needs; but, it's also maybe not the right outcome to hang your hat on. I also love how you talk about aligning with the product, and that the success of the privacy component is going to be coupled with the success of the product generally and how it's perceived in the market.
Debra Farber 21:02
I think, you know, when people talk about privacy...you know, I'll talk about 'privacy as control' or 'privacy as transparency,' but when it comes to 'privacy and trust,' I do think that there's very much this coupling with respect, the same way that when we talk about 'security and trust,' there's also the aspect of safety. And so, when it comes to trust, that respect aspect is so key. And, I know that Lea has been championing that as part of the privacy engineering practice. In fact, they incorporated it into the PEPR Conference - right, Privacy Engineering Practice and Respect - and they have really championed that aspect. And I love it. And I think it makes sense. And I am totally here for it. I want others to think about: are you respecting the customer? How you're obsessing over your customer, or what respect you're giving to their privacy, should really be like acceptance testing, so to speak. Right? Maybe not by the developers, but by the product folks.
Debra Farber 22:00
What architectural approaches have you seen during your work, and, you know, have some worked better for certain use cases? This is a really broad question. I'm just gonna throw out there that I've recently learned a lot about data mesh architectural approaches, so I'm learning more about these newer approaches. But, I'm just curious what you're seeing overlap with privacy.
Menotti Minutillo 22:23
Yeah, this is obviously a big potential area for discussion. So, I'll probably like, you know, speak more at the macro level.
Menotti Minutillo 22:29
I think when I'm either advising internally, or I'm in one of these opportunities to talk to a company that's operating in this space to sell a B2B product, or another company that's thinking about setting up a privacy program, I generally will say that 'the buck stops with knowing what you have.' Everything around data privacy is contingent on having a pretty good handle on what data you have. And, it's not typical for companies who are getting started - especially, you know, startups who are hitting some traction - to pay a lot of attention to the data that's entering their environment, which may be sort of tacked on to other critical business data that's helping drive the product. So, first and foremost: what data do we have, and do we have a handle on, you know, how it's growing and what it is?
Menotti Minutillo 23:23
So generally, architecture for me always starts with: how are we doing inventory, and how automated is that? Does it require sort of a person plugging away to keep the inventory up-to-date, or are there systemic approaches that help keep that at least kind of at bay, in a way that can be observed and monitored for growth or reduction in the size of your overall data environment? And from there, it's kind of a risk-based approach to setting up the architecture. I think, when you're dealing with needs around privacy, you have this broad area of data lifecycle management, which spans from collection to deletion; and, depending on your point of view, there might be 6 or 7 different states of data that can exist inside of your company, which may require different technical solutions. So generally, it's a mistake to try to invent all of the pieces at once and then apply it to your organization all at once, you know, after a year of engineering it. Instead, try to figure out where the biggest risks are to your users and to the business - kind of doing a privacy threat model, so to speak. And, this often leads us to thinking about the top of the funnel, or 'the left,' to go to the podcast title here. So, can we incorporate systems in different places to catch data and label it, process it, and get it into a state that is controllable earlier and earlier in the software development lifecycle, as well as in the lifecycle of the data in our environment?
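The 'catch data and label it early' idea Menotti describes can be sketched roughly as follows. This is a toy illustration, not Uber's system: the tag taxonomy, regex patterns, and field names are all invented, and a real inventory pipeline would use much richer classifiers and persist labels alongside the data.

```python
import re

# Hypothetical label taxonomy; real inventories are far richer.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"^\+?\d[\d\-\s]{7,}$"),
}

def label_record(record):
    """Tag each field of an inbound record at collection time, so
    downstream systems inherit the labels instead of having to
    re-discover personal data scattered through the warehouse later."""
    labels = {}
    for field, value in record.items():
        for tag, pattern in PATTERNS.items():
            if isinstance(value, str) and pattern.match(value):
                labels[field] = tag
    return labels

# An inbound record gets labeled before it fans out to other systems.
inbound = {"name": "Ada", "contact": "ada@example.com", "mobile": "+1 415-555-0100"}
labels = label_record(inbound)  # {'contact': 'email', 'mobile': 'phone'}
```

The point of the sketch is the placement, not the matching logic: labeling happens at the top of the funnel, so the inventory can be observed and monitored automatically rather than maintained by hand.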
Menotti Minutillo 24:58
And then, when it comes to the actual designing of the systems, there are lots of options. Especially in, you know, highly-distributed environments that may have kind of a large sprawl of interconnected microservices, where you're just trying to wade through all of that complexity, there are some key decisions to make that are dependent on the state of the organization, going down to even decisions like: do you want your systems that are trying to advance privacy to be doing the direct CRUD operations - create, read, update, delete - on databases? Or, would you prefer to set up something like a Pub/Sub, which is this idea that you would publish a message out to a common message bus that says, you know, "We need to do something with this piece of data," like 'delete user X,' which another system then picks up and executes on, which creates a bit more of a distributed responsibility.
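The Pub/Sub pattern Menotti contrasts with direct CRUD can be sketched like this. The in-memory bus, topic name, and data stores here are stand-ins invented for illustration; a real deployment would use a durable message bus such as Kafka or a cloud Pub/Sub service, with each owning team writing its own subscriber.

```python
from collections import defaultdict

class MessageBus:
    """Toy in-memory stand-in for a real message bus (e.g., Kafka)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # Each subscribing system independently executes the request,
        # distributing the responsibility for acting on the data.
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()

# Hypothetical downstream stores, each owning its own deletion logic.
rides_db = {"user-123": ["trip-1", "trip-2"]}
payments_db = {"user-123": ["card-on-file"]}

bus.subscribe("privacy.delete_user", lambda m: rides_db.pop(m["user_id"], None))
bus.subscribe("privacy.delete_user", lambda m: payments_db.pop(m["user_id"], None))

# The privacy system publishes one message instead of doing
# CRUD operations directly against every database.
bus.publish("privacy.delete_user", {"user_id": "user-123"})
# Afterwards, both stores have dropped the user's data.
```

The design trade-off is the one he names: the publisher never touches the databases, but you now depend on every subscriber correctly honoring the message.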
Menotti Minutillo 25:52
So, there are all these design decisions along the way; it's a whole long line of 'it depends.' But I think, you know, when zooming out, it's knowing what you have and trying to gain some semblance of visibility and control as early as possible in the lifecycle of the data, and then dealing later on with data that may have been distributed to a different data warehouse or data lake. I think, for me, getting the inbound data and the new growth under control first is kind of a smart way to go about the systems architecture, given that you generally can't boil the ocean, or else you might have missed things by the time you're ready to deploy your systems.
Debra Farber 26:34
Right, and rather than compounding the problem, just start with the new...like, create the ideal new architecture and then address kind of the legacy challenges. Yeah.
Menotti Minutillo 26:44
Yeah, I was gonna say, it's very rare that you get a clean slate to set this stuff up. So, is there a way that you can idealize what you want it to be and get that sort of, you know, 'happy path' flow established from end-to-end? And then, you can start applying it to what's been happening in the past, or to the active production system. And again, this is the approach that I think works really well. You know, the caveat always being, "Well, what does your environment look like?" But I think, you know, from a philosophical perspective, getting it early in the SDLC, as well as trying to demonstrate a happy path, is a way to tackle this so that the organization can absorb it effectively amongst all of the other things that it's trying to do.
Debra Farber 27:29
Right, right. And then, of course, it has the effect of, you know, if you start to do the data minimization, and do a 'shift left' upfront...and the more work you do up front, the less of a paper chase and compliance burden you have in the first place.
Debra Farber 27:42
So hopefully, we can also track the money saved by not doing all that.
Menotti Minutillo 27:46
Yeah. It's a nice outcome - a nice-to-have, or cherry on top - to say, "Oh, well, it actually wound up saving us a lot of money at the end of the day. Yeah, look at that. Who knew?"
Debra Farber 27:53
Exactly, exactly. Okay, so I've got a few more questions about some of your experience and your background before turning to a new topic I know we want to talk about today. So the first is: I know that while you were at Uber, you worked with...or your team worked with some privacy researchers at UC Berkeley who were working on differential privacy research, and took that from the lab and then deployed it in the real world at Uber. And, I really would love to learn a little bit about what that was like, or any lessons learned from taking something from the lab (and this was a few years back, so, you know, it wasn't as mature as it even is today). What was it like to then systematize it into an organization and actually deploy it in the real world?
Menotti Minutillo 28:37
Yeah, this was one of my favorite projects I think I've ever been a part of.
Menotti Minutillo 28:41
Yeah, it was very cool. The TLDR on how it got started was that our CSO and a professor at Berkeley, Dawn Song, were friendly, and they had been talking about things we could do in the privacy space. And, there were some researchers at Berkeley pursuing their studies who had interest in working with Uber data. And so, the actual structural arrangement was that we would have this couple of folks come in to sit with us and work as basically long-term interns. This was Noah Johnson and Joe Near; and, they wanted to do some research using Uber data. At Uber, we had kind of a big operational trove of data that was very localized to different regions. And, there was a lot of interesting stuff that you could draw out of the data that we worked with, and we thought the safest way to try to get this value in privacy research was to bring them on as interns. So, then it's like, okay, we onboard them properly, and we give them the right access controls and all this kind of stuff, and they can work in the Uber environment.
Menotti Minutillo 29:44
And so, initially, they were very interested in working on differential privacy as it pertained to SQL queries. So, the idea being that, you know, when you have a big trove of data and you're an operator and you're trying to answer some questions like, I don't know, 'tell me how many riders took a certain promotion in a city between these times.' And, you know, the SQL queries that an operator at Uber might run to figure out what's going on with the business can be quite complex, and enforcing some semblance of privacy on an infinitely complex set of SQL queries is really hard because, you know, the human mind is limitless and you can just, you know, try to query all these different things...you may believe that you have good privacy control around the data, but then, you know, you add some variables to the query, and all of a sudden, there could be some exposure. So, they were interested in this idea of these kind of unbounded SQL queries and trying to enforce privacy.
Menotti Minutillo 30:40
So, without getting into tremendous detail, Joe gave a great presentation at USENIX Enigma - I think it was maybe 2018 - but if you go search 'Uber Enigma' on YouTube, the presentations are there. But, the long and the short of it is we went through a few iterations that ultimately led us to deploying a piece of the technology in Uber production that served two main purposes. One was to evaluate the SQL queries that were being run internally, in real time, to try to determine the potential sensitivity of the output. So, somebody's running a query. Who is that person? What rights do they have to what data? Et cetera. And, based on that SQL query, what's the potential sensitivity or personal data that might be in the output, including what might be inferred in that output, because the data in that output comes from other personal data even if you're not listing somebody's name. Like, maybe you derived some statistics off of that person. And so, we would try to determine what that was based on analysis of the SQL queries.
Menotti Minutillo 31:45
And then, the really cool part was being able to take that analysis of what that output is going to be and, in real time, modify the SQL queries to enforce privacy before they're actually run, so that you don't even send the query that has the potential for a sensitive output to the runtime (which runs the query). So, it's like: examine what's in there; predict what the output is going to be; and then actually modify the query after the user submits it such that it enforces privacy properties. This was really cool, and it actually was able to help us achieve some interesting privacy outcomes in this very complicated query runtime environment where you had these thousands of operators trying to do their jobs at their local city level. And, we were able to apply kind of much broader macro rules on what people were and were not allowed to do with that data in such a way that didn't require as much active management on the field level, because we were instead doing all of the privacy at the SQL level. So, there's a couple of blog posts about it, which explain it in further detail.
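As a rough illustration of the flow described here - inspect a submitted query, estimate how sensitive its output could be, and enforce privacy before the result reaches the operator - here is a minimal, hypothetical Python sketch using the Laplace mechanism from differential privacy. The sensitivity table, the regex-based "analysis," and the simulated result are all illustrative assumptions for this sketch, not the actual Uber/Berkeley system:

```python
import math
import random
import re

# Illustrative only: real systems compute sensitivity from the full query
# plan (joins, filters, nested subqueries), not a regex over the SQL text.

# Assumed worst-case influence of a single user on each aggregate.
SENSITIVITY = {"count(*)": 1.0, "sum(fare)": 100.0}

def analyze_query(sql: str) -> float:
    """Inspect the query text and return the assumed sensitivity of its output."""
    match = re.search(r"select\s+(count\(\*\)|sum\(\w+\))", sql, re.IGNORECASE)
    if not match:
        raise ValueError("this sketch only handles simple COUNT/SUM aggregates")
    agg = match.group(1).lower()
    if agg not in SENSITIVITY:
        raise ValueError(f"no assumed sensitivity for {agg}")
    return SENSITIVITY[agg]

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) as the difference of two Exp(1) draws."""
    e1 = -math.log(1.0 - random.random())  # 1 - random() is in (0, 1]
    e2 = -math.log(1.0 - random.random())
    return scale * (e1 - e2)

def run_with_privacy(sql: str, true_result: float, epsilon: float = 1.0) -> float:
    """Enforce epsilon-differential privacy by perturbing the (simulated) result."""
    scale = analyze_query(sql) / epsilon
    return true_result + laplace_noise(scale)

query = "SELECT count(*) FROM trips WHERE city = 'SF'"
noisy = run_with_privacy(query, true_result=1234.0)
print(f"noisy count: {noisy:.1f}")  # near 1234, but perturbed for privacy
```

Note that the sketch post-processes a simulated result for brevity, whereas the deployment described in the episode rewrote the SQL itself before it reached the runtime and handled far more complex queries.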
Menotti Minutillo 32:52
But, I think what I learned from all of this and the really awesome output we had was there's all this great opportunity to partner with the academic community on privacy in a way that's beneficial to everybody involved. So, you know, the folks from the university get to research on real-world data, write a paper, help advance their thesis, so on and so forth. And then, Uber gets some really cutting-edge technology, which advances the privacy mission, that we otherwise might not have been able to invest in internally due to conflicting priorities, and so on. But instead, it's like, well, we'll do this partnership, and these researchers really want to work on it. We really like the potential outcomes from such a partnership, and maybe we can't internally fund a few full-time people working on this thing that's theoretical for a while. So overall, I think it was a great setup, and I would love to be able to do that more in future roles where we do these kinds of partnerships, because it allows this transition from the theoretical in the privacy space into how do we actually make that thing go into production, and what requirements and different tradeoffs do we have to make to make it a reality. So, it was definitely a favorite project of mine, and I hope we can do some more in the future.
Debra Farber 34:11
That's awesome. I really love that story, and it really must have been fun to be there for something groundbreaking...at the time, I think it was like the second deployment after Google. I might be wrong about that, but it was pretty early on in using differential privacy in commercial practice. So, that's pretty cool. I have to ask: what was it like to work on privacy at Twitter during an ownership transition like the one to Elon Musk? You know, do you have any lessons learned from that experience? And, maybe even just give us a little peek as to what it was like during a transfer of ownership?
Menotti Minutillo 34:46
It's okay. I knew you had to ask. Totally fine. No, I think with any sort of big corporate event - whether it's a, you know, merger, go private, go public - I've been part of it all. I think during those times, things feel really volatile. The business will generally try to keep things as stable as possible during these big corporate volatile events. And from, you know, my perspective, the way that I dealt with this particular one, and how I've dealt with them in the past and other types of periods of volatility at these companies, is to focus on the mission and make sure that we are explaining the various tradeoffs that might have to be made when it comes to, you know, the potential for changing priorities or changing staff or reducing staff or moving people around, because, you know, different things become important. And then also, ultimately, making sure the folks who work for me, the folks who rely on me to be their Manager or Team Leader, understand what the various risk factors are and what's important to focus on, and try to remember that, ultimately, we're here looking out for the end user. And, so long as we're able to do that and keep advocating for them, then we're doing a good job.
Menotti Minutillo 36:02
So, when I say focus on the mission, I don't mean ignore all of the externalities. It's really more like take the input of those externalities and make sure that you're taking them into account when you're coming up with your strategy, and making sure that your strategy is, you know, coherent with whatever the new potential reality is. And, I think that we did a pretty good job at that. And, you know, certainly it'll be a story to tell and something to bring along with me for future roles. I've worked at a bunch of places which have gone through pretty interesting times when it comes to privacy; and so, this is another one that, you know, helps give me, I think, a pretty broad perspective on handling things.
Debra Farber 36:46
Yeah, it's definitely a unique one that had like the world's eyes on it. So, I don't envy you for that, but I'm sure you've got some amazing stories for around the water cooler - the proverbial water cooler.
Menotti Minutillo 36:57
Something like that.
Debra Farber 36:58
Yeah. Okay, so I'm going to completely switch topics here and talk about the fact that recently, you posted an article to LinkedIn that really caught my attention, titled "Was Privacy a Zero Interest Rate Bet?" And so, before we dive into what you mean here, can you tell us, what's a zero interest rate policy or "ZIRP," and how are tech companies leaning into ZIRPs (especially during the pandemic and after the pandemic)?
Menotti Minutillo 37:31
Yeah. And I'll be right upfront and say, you know, I wouldn't say this is my primary area of expertise. But, I've always been interested in broader macroeconomic stuff. In fact, when I worked at Goldman many, many years ago, the engineers were required to go through continuous training about financial markets and financial products; and I think I didn't really care much about economics when I was in college, but I got kind of a crash course early on in my career. And so, I pay a lot of attention to it. So, I apologize for any technical mistakes that any listener might hear when I'm talking about this. But basically, a zero interest rate policy is the monetary policy that the U.S. Federal Reserve enacted for the better part of the last 14 years. Basically, the Fed has some primary responsibilities around controlling inflation and supporting high employment. And, one of their main levers to do that is controlling what's called the 'federal funds target rate,' which, in doing so, kind of guides banks in how they lend and what it costs to lend.
Menotti Minutillo 38:40
So, effectively, the Fed setting a rate to zero or near zero means that banks generally are guided towards lending money in an extremely cheap way, due to a variety of factors that come out of the Fed setting that rate. So, what that means is that money is cheap to borrow. You know, if you use the analogy of purchasing a house, interest rates on a mortgage were historically low for the better part of the last 14 years due to that policy. And lately, we've seen...right before the pandemic, rates were starting to come back up a little bit to help control for the potential for inflation; but, when the pandemic started, the Fed returned to a zero interest rate policy to try to stimulate spending and borrowing because the economy was experiencing this unprecedented shock wave. And so, by keeping money cheap, people are incentivized to borrow and spend and invest.
Menotti Minutillo 39:39
And so similarly, companies during low interest rates may look at kind of a relatively low-risk opportunity to borrow and then become less stingy about where they invest; they may invest in more risky business opportunities, or they may be able to spend more money in places that, during a belt-tightening, they might not. So, that's the zero interest rate policy in a nutshell. And I think...you know, we'll get into this a little bit, but there's this kind of meme now where you try to identify which investments that companies were making when money was cheap don't make as much sense anymore now that, you know, lending and borrowing is much more expensive due to rising interest rates. And, that was the thrust of the article that I wrote, which was, you know, I'm observing the potential for privacy getting less investment in the short to medium term due to the expense that might be incurred, and whether it's a mistake for companies to think about it that way.
Debra Farber 40:40
Is that based on what you're seeing in terms of layoffs for privacy engineers? Or, do you want to tell us a little bit about what you're seeing affect privacy?
Menotti Minutillo 40:50
Yeah, I mean, on a personal level, I departed Twitter; and, as I'm looking into what I'm going to do next, I'm seeing fewer roles, especially in the technical privacy space; whereas, there seems to be a back-to-basics for companies where they're still investing in it from a counsel and policy perspective, which is generally where a lot of companies start. And this is totally normal, where companies will start their privacy journey as a legal exercise, which is completely expected. And, it's usually more of a starting point than where they mature into. So, based on my own investigation into what I'm going to be doing next, you see a bit of a retraction in the technical privacy space; while, relative to that, you see more strength on the legal side. And, that's just been, you know, my observation. So, putting those pieces together, I see kind of a back-to-basics approach in that space, and it makes me a little bit worried that it's a step backwards for the practice overall. And, I was posing this question as to whether the investments in technical privacy were a result of a looser borrowing-and-spending environment; and, if companies are looking at it that way - as something that is worth cutting back on in a significant way - that would probably be a mistake. But, it's just a little less in vogue right now versus this rush to efficiency and this rush to doing more with less that you're seeing kind of broadly across the industry. Not just privacy, but the tech industry overall.
Debra Farber 42:24
Which is interesting to me because I truly believe, like I mentioned before, that if you address privacy earlier in the software development lifecycle, then you are going to save money down the road on the compliance side. So, to ignore it just seems to me not only a step backwards, but a giant mistake, because you're losing trust for no added gain. Whereas, you could do the investment upfront for plenty of gains that would make shareholders happy later on. You know, I just wish we weren't so tied to the market on a quarter-by-quarter basis.
You know, but that's just wishful thinking.
Menotti Minutillo 42:59
Yeah. I mean, I think it is incumbent upon privacy professionals to make it as understood as we can what that investment is getting you, because certainly it's a little bit more of a transitive benefit than, "Okay, I'm going to hire a set of engineers to build some features that I know have direct revenue opportunity" - the connecting thread between those is very obvious. And I've never had a job in privacy where I wasn't spending, you know, material time making sure that folks understood what the benefit was. And I think, as privacy professionals, we have to make sure we don't rest on our laurels in terms of "the organization gets it, and is going to continue to get it" without us advocating for it and getting really smart about tying things together, like I said earlier, to product success and to business success. It's incumbent upon us to do that. And, especially during this time of belt-tightening, it's going to be extra important. You know, I'm not accusing any company of paying an inappropriate amount of attention to this. It's more that I'm maybe lightly ringing an early bell: if you're pulling back here, really think about it in a more holistic way, rather than just the quarterly dollars-and-cents aspect.
Debra Farber 44:22
Yeah, that makes sense. And so, in your article, you make three predictions about, you know, what happens if we continue to see this trend of eliminating privacy engineering roles or eliminating investment in them - so, not moving forward with them, or just getting rid of them in an org altogether. You predict three separate effects. Do you want to unpack what they are?
Menotti Minutillo 44:44
Sure. I guess I went over this a little bit, but in sum, it would be that, I think, hiring in technical privacy roles will probably stay suppressed relative to the market. I think the market is going to snap back a little bit in the next few months. You know, we went through kind of a chilly winter in hiring. There were a lot of freezes; there were layoffs. And, I think things are going to snap back a little bit because company efficiency remains really high, and earnings have been strong, and there's going to be some pressure for companies to make sure that they're not falling behind. But, I think on the privacy side, the hiring might not pick up alongside that in lockstep, because big privacy issues will stay - or they have, at least, stayed - out of the zeitgeist for a bit.
Menotti Minutillo 45:31
And then, after that, there's going to be...there always is a next big privacy issue, and it's going to seem like it was, you know, highly predictable had the right eyes been on it, and that it had the potential for harm. And, in the article, I mentioned it would be in the AI space, which is getting a tremendous amount of investment and rapid development in such a way that these companies are trying to out-rush each other to the market with solutions. And, we've already seen in the wild lots of attacks on these generative AI products. You know, people are going to start poking at them; that's going to be front and center, and we've already seen some issues as regards intellectual property and as regards people's personal data being fed into the models that are running these generative AI products.
Menotti Minutillo 46:17
So, there's going to be some big issues. The last privacy issue wasn't the last one. And so, as these start to resurface, I see technical privacy hiring picking up again in a strong way. It's going to initially be kind of a reactive measure, which is, you know, how these things go right now. And, you see it a lot in security as well. And so, it'll be really important, as that reaction happens, for privacy professionals to make sure that they're bringing value to the table that's not just reactive. Like, yes, you know, there's going to be an immediate need for that work to happen and to respond to current issues; but, once you get into that seat, making sure that we're selling the value on a lot of the other stuff that we talked about makes it a continued investment at the organizations who pick it up.
Debra Farber 47:05
Got it. Thanks for that. Now to keep up with what's new in the discipline of privacy engineering, what resources do you refer to; or, you know, what communities are you plugged into to stay up-to-date?
Menotti Minutillo 47:17
Yeah, for me, in privacy engineering - you mentioned it earlier - I love the PEPR conference. I love everything that USENIX does with Enigma. Most of the content is made public. You can watch the talks on YouTube, and I believe a lot of times the slides are posted. So, even if you can't make the conferences live, that's usually giving you kind of a bleeding-edge look at what different companies are looking at in the area of privacy engineering. So, big plus-one to what USENIX does, and especially the PEPR conference.
Menotti Minutillo 47:46
I'm spending more time in the Mastodon 'fediverse' right now, and a little less on Twitter. There seems to be a pretty big migration of folks, especially in the security and privacy space, exploring the fediverse. It's definitely not ready for mainstream adoption; there are still a lot of speed bumps getting onboarded. But, I imagine if you're listening to this podcast, you're either already there or you're thinking about it; and, there's lots of good conversation there. I've always found that staying up to date involves good conversation and good debate with people who are smarter than me, and there are many of them. So, I try to do that.
Menotti Minutillo 48:18
And, you know, there's lots of community that gets established in places like LinkedIn, where I'm posting a little bit more. You know, the article earlier - just putting my thoughts out there kind of broadly in the space - has given me the opportunity to connect with other professionals, to see what's going on, and given me a broader perspective and a lot of respect for what's going on in the industry. So, those would be my go-tos. And then, maybe after that, there are some good germinating spaces in privacy Slacks and things like that. But usually, those come as a result of making good relationships with folks in industry, learning about what's going on, and then getting into these, I think, more intimate communities...that has been a really good way for me to keep in touch with what's going on.
Debra Farber 49:04
Awesome. Anything else you'd like to plug before we conclude today?
Menotti Minutillo 49:07
No, I think plugging PEPR was my goal at the end. So, I'm glad I was able to do that.
Menotti Minutillo 49:12
And I would just like to plug...I guess just offer my support to all of the folks out there who've been affected by the recent layoffs. It's a pretty big deal in the industry; and, you know, having been through the big downturn in the financial sector in, you know, 2007, 2008, 2009, this feels pretty monumental. It feels somewhat bigger, at least in terms of the scale of people affected. So, I just want to encourage everybody to keep on keeping on; there's going to be tons of great work for y'all to do. And, that's what's keeping me going in my search for the next thing. And, there are lots of great professionals out there who can be supportive of your journey as well; I'm here for you, and everybody else is. So, you know, keep it up, and there'll be light at the end of the tunnel for those who've been affected.
Debra Farber 50:00
Absolutely. It's also a great time, you know, if you're affected, to maybe start your own thing, whether it's a privacy tech company or maybe a privacy engineering consulting firm. Right? I mean, I'd love to see some real innovation coming out of these harder times. But, you know, we'll see. We'll see what...
Menotti Minutillo 50:18
I think...I think there's been a history of that, you know, during downturns and during, you know, shrinking kind of corporate environments. A lot of innovation comes out as folks go out on their own and give things a try that they may have been thinking about. So, I think you'll see a bit of that. I'm excited for folks to take that step.
Debra Farber 50:34
Agreed. Same. Well, Menotti, thank you so much for joining us today on Shifting Privacy Left to discuss the role of the privacy engineer in today's economic climate and for the future.
Menotti Minutillo 50:45
Thank you so much, Debra. And thank you for the podcast; it's really great, and I look forward to future episodes.
Debra Farber 50:52
Awesome. I definitely want to have you back in the future.
Menotti Minutillo 50:54
Thank you so much.
Debra Farber 50:55
Thanks for joining us today, everyone. Until next Tuesday, when we'll be back with engaging content and another great guest.
Debra Farber 51:04
Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of the show. To learn more, go to Privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.