The Shifting Privacy Left Podcast

S3E9: 'Building a Culture of Privacy & Achieving Compliance without Sacrificing Innovation' with Amaka Ibeji (Cruise)

Debra J. Farber / Amaka Ibeji Season 3 Episode 9

Today, I'm joined by Amaka Ibeji, Privacy Engineer at Cruise, where she designs and implements robust privacy programs and controls. In this episode, we discuss Amaka's passion for creating a culture of privacy and compliance within organizations and engineering teams. Amaka also hosts the PALS Parlor Podcast, where she speaks to business leaders and peers about privacy engineering, AI governance, leadership, and security, and explains technical concepts in a digestible way. The podcast aims to enable business leaders to do more with their data and provides a way for the community to share knowledge with one another.

In our conversation, we touch on her career trajectory from security engineer to privacy engineer and the intersection of cybersecurity, privacy engineering, and AI governance. We highlight the importance of early engagement with various technical teams to enable innovation while still achieving privacy compliance. Amaka also shares the privacy-enhancing technologies (PETs) that she is most excited about, and she recommends resources for those who want to learn more about strategic privacy engineering. Amaka emphasizes that privacy is a systemic, 'wicked problem' and offers her tips for understanding and approaching it.

Topics Covered:

  • How Amaka's compliance-focused experience at Microsoft helped prepare her for her Privacy Engineering role at Cruise
  • Where privacy overlaps with the development of AI 
  • Advice for shifting privacy left to make privacy stretch beyond a compliance exercise
  • What works well and what doesn't when building a 'Culture of Privacy'
  • Privacy by Design approaches that make privacy & innovation a win-win rather than zero-sum game
  • Privacy Engineering trends that Amaka sees; and, the PETs about which she's most excited
  • Amaka's Privacy Engineering resource recommendations, including: 
    • Hoepman's "Privacy Design Strategies" book;
    • The LINDDUN Privacy Threat Modeling Framework; and
    • The PLOT4AI Framework
  • "The PALS Parlor Podcast," focused on Privacy Engineering, AI Governance, Leadership, & Security
    • Why Amaka launched the podcast;
    • Her intended audience; and
    • Topics that she plans to cover this year
  • The importance of collaboration, building a community of passionate privacy engineers, and addressing the systemic issue of privacy


Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.

Amaka Ibeji:

Early engagement is about trust, and trust has to be earned; it can be a process. It all starts by saying we don't lead the conversation with technology. The conversation really starts with the problem we're trying to solve. For each conversation, we want to be sure that we are embedded, because, as I say, privacy engineers wear multiple hats; we're always context switching. You talk to the HR team; the next moment you're talking to a data engineer. So, one of the things we need to understand is, once we understand the problem we're trying to solve, the next thing is we begin to ask ourselves: How can we solve for this? What techniques should we be using? And then we get to technology. I say, for you to be invited over and over into the room early on in the conversation, you have to be a pleasure to work with. You need to know when to ask the right questions; you need to know when to listen, to absorb; because, at the end of the day, shifting privacy left is all about the win-win-win situation.

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber.

Debra J Farber:

Today, I'm delighted to welcome my next guest, Amaka Ibeji, Privacy Engineer at Cruise. In this role, she architects and engineers robust privacy programs and controls, but it goes beyond implementing rules and regulations: she strives to create a culture of privacy at Cruise. Amaka's interests span more than privacy. She's passionate about privacy engineering, AI governance, leadership, and security, and recently started her own podcast called the PALS Parlor Podcast. PALS is an acronym for Privacy engineering, AI governance, Leadership, and Security. Today, I'm really excited to chat with Amaka about her career, how organizations can achieve privacy compliance without sacrificing innovation, her new podcast, and some trends that she's seeing in the world of privacy engineering. So, a big welcome to you, Amaka.

Amaka Ibeji:

Thank you so much, Debra, for having me. I'm glad to be here.

Debra J Farber:

Excellent, excellent. Well, I think it makes sense to just start off with your privacy origin story. You started out in security and moved into privacy engineering, so tell us a little bit about what motivated that transition and how you went about making that change.

Amaka Ibeji:

That's a great question. I actually started out my career as a software engineer. I was really fascinated by leveraging technology for business process improvement. That fascination did not last long, as I became aware of cyber criminals who can exploit vulnerabilities in applications, either to gain unauthorized access, corrupt the data, or steal information. I decided to explore the world of cybersecurity simply to understand the tools, techniques, and mindsets or motivations of cyber criminals or, to put it mildly, threat actors.

Amaka Ibeji:

My goal at the time was to learn enough to enable me to build robust applications, and I jokingly say it's been over a decade and I haven't made my way back. Rather, I have evolved my career into privacy engineering. For me, it goes beyond protecting IT infrastructure to humanizing the data we collect and process; it's beyond external threat actors to also looking at internal threat actors, both within the organization and those we choose to partner with. At the end of the day, when you look at my career, underpinning all of this is really about accelerating business objectives while advocating for users through respect for them and their user experience. So, that has been my journey, and I think every move has layered upon previous experiences. I'm excited and, of course, in the wake of AI, getting into AI governance is just such a delight, because privacy plays a critical role in the development of AI.

Debra J Farber:

Absolutely. Do you mind talking about some of those overlaps with AI?

Amaka Ibeji:

Yes. When you look at privacy engineering, one of the things you think about is that we're actually looking from the user's perspective. What are the experiences of the users? You could do that with traditional applications. AI comes with an amplification of this risk. Of course, it also comes with benefits; so, we begin to look at issues around bias. How might that impact the user? You think about the data that is collected to train AI models. You begin to ask yourself: What's in the data? How can we anonymize this data before it gets into a training data set? Those are some of the core questions we look at when we have AI-related applications to review, and those are the overlaps between privacy engineering and AI governance.
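To give a flavor of that "what's in the data?" question, here is a deliberately naive Python sketch of scrubbing obvious identifiers from text before it reaches a training corpus. The patterns, placeholder labels, and sample string are hypothetical; real anonymization pipelines add NER-based detection of names, k-anonymity checks, or differential privacy rather than relying on regexes alone (note that the regexes below leave the name "Bob" untouched).

```python
import re

# Naive patterns for a few obvious identifier types; production systems
# pair patterns like these with trained PII-detection models.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact Bob at bob@example.com or 555-123-4567."))
# -> Contact Bob at [EMAIL] or [PHONE].
```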

Debra J Farber:

That makes a lot of sense. I really like that customer obsession aspect of privacy, as well; and you make really excellent points about the overlap. So, you're no stranger at all to innovation. I love your career because you started out looking at technology, processes, and people, and then moved into security, which really doesn't look at things from the people's perspective as much as from a system perspective. And so, when you moved to Microsoft, you were in its Research and Incubations organization and you focused on embedding compliance by design - so, a much broader mandate than just privacy - in strategy, partnerships, and research initiatives. Honestly, that sounds like a dream job to me, just for the perspective you must have had - not a dream job in the compliance sense, but in the mandate of what your job was. What did you learn from this compliance-focused experience that has helped you in your current privacy engineering role, now that you've moved on to Cruise?

Amaka Ibeji:

I loved the job at Microsoft, and one of the reasons is that, while I was there, I reported to an amazing leader who had deep legal expertise and, although we were focused on privacy compliance, she was pulled into partnership conversations early. One of the things that afforded me the opportunity to do was to introduce early engagement.

Amaka Ibeji:

You know, to drive the notion of early engagement even in partnership conversations, because for us to have a collaboration, Microsoft has the technology, but we'd really have to partner with organizations that have the data. Sometimes, these can be drawn-out negotiations, because, from a compliance point of view, these partner organizations want to do right by their customers and the data that has been collected. However, when you enter the world of privacy-enhancing technologies, you can creatively start having conversations about how to drive innovation while protecting these individuals. So, that in itself gave me the opportunity to see how partnership conversations can be accelerated by leveraging privacy-enhancing technologies. That was really a light bulb moment for me, really exciting, and I loved every piece of it. That was where I truly cut my teeth on the conversation of shifting privacy left, taking it really early on in the conversation.

Debra J Farber:

I think that's awesome. That's great. You know, in a lot of organizations that discussion is still being had. Right? We're trying to make the change to shift left, and it's great to hear that Microsoft's been pretty good at having shifted left into privacy for many years. I think it had some challenges at one point, maybe 15 years ago - I'm ballparking it here - and then responded to the market by hiring a lot of privacy folks, especially in technology. So, I think they've led the way a lot of the time on good governance when it comes to staffing federated teams of privacy experts across trust and safety as they build products and services. It's really great to hear that you've been able to apply what you learned there in your next job. And it's perfect timing, because this is when privacy-enhancing technologies are really scaling; they're not just new ideas coming out of research - they're at the point where companies can deploy them a lot more easily. That's great.

Debra J Farber:

Compliance is important, but it doesn't always make for strong privacy or even effective security. Right? Sometimes, it can be looked at as just checking boxes to get through: yeah, we did a requirement; yeah, we did a check; you know, we did a policy. So, what advice do you have for engineers who want to shift privacy left into design, architecture, engineering, and data science, to make privacy actually impactful and not just a compliance exercise?

Amaka Ibeji:

That's a great question, and I like to start by setting the stage with the notion of 'best fit' rather than 'best practice,' because a lot of what we talk about is best practice. However, best practice informs best fit. Best fit starts with the notion of: let's look internally. What are we doing? And let's get intimate with the problem we're trying to solve. I always say the conversation should start from there: What is the problem we're trying to solve? What goal are we trying to achieve? And so, once we get there, especially for privacy engineers, we have to get to the point where we're being invited into the room early. I say very often that early engagement is about trust, and trust has to be earned, and it can be a process.

Amaka Ibeji:

It all starts by saying we don't lead the conversation with technology. The conversation really starts with the problem we're trying to solve. For each conversation, we want to be sure that we are embedded, because, as I say, privacy engineers wear multiple hats. We're always context switching. You talk to the HR team; the next moment you're talking to a data engineer. You're constantly context switching. And so, once we understand the problem we're trying to solve, the next thing is we begin to ask ourselves: How can we solve for this? What techniques should we be using? And then we get to technology.

Amaka Ibeji:

And I say, "for you to be invited over and over into the room early on in the conversation, you have to be a pleasure to work with. You need to know when to ask the right questions. You need to know when to listen, to absorb, because at the end of the day, shifting privacy left is all about the win-win-win situation. The team has an accelerated path because they get insight of what to do early. You also buy time as a privacy engineer to do deep research, depending on the problem you're trying to solve. Ultimately, the business and end users enjoy the benefits. I always say, "A well-designed solution should be very seamless, easy and technology to take the backstage - it shouldn't even be seen. So, it's an experience that the users would long for and appreciate. That's going to be my advice.

Debra J Farber:

I think that's great advice. It's a great jumping off point then to now turn to the next conversation, which is about culture. So, you're passionate about creating a culture of privacy within an organization and within engineering teams. One of the hardest things to do in a company is to change its culture. I know because I've read lots of articles on it. I know because I've tried to change a culture before, even in a much smaller company than Microsoft. So, I'd definitely like to hear about your successes and failures, like what has worked well and what has not worked well when trying to cultivate a culture of privacy.

Amaka Ibeji:

That takes me back to my days in consulting. You know, I had some experience at Deloitte as a consultant. I would say, first of all, that as a privacy engineer you get to wear many hats. One of them is that of a coach, and the reason I bring this up in relation to culture is that, while we have policies and procedures, my experience from working in consulting and observing first-hand the culture of organizations is that people like simplicity. Anything that enables them to achieve their goals, they will go for. That being said, our policy documents by themselves do not drive culture change. People understanding what can go wrong begins the shift. When I say that, as a privacy engineer, one of the hats you should constantly wear is that of a coach, I mean that every interaction you have is a coaching moment. You get embedded, but you're coaching and leading people to understand what can go wrong, and that starts the conversation. However, the conversation of what can go wrong also has its limitations. I think a better narrative is: How can we unlock more with a modified approach? That is where the conversation begins to get exciting.

Amaka Ibeji:

I always say, "Policy documentation should be reference materials." In the everyday lifestyle, our approach, I always say that our practices must match our policies. In the practice, we need to understand: what can go wrong; what can we do better to unlock more; leveraging, improve, enhance approach. That's why I talked about best fit. So, it's not about we need to do X Y Z in terms of compliance and this is best practice. By the way, I always see compliance as the minimum step. We always need to go above and beyond and say, To achieve more, to achieve this, what do we need to do?" And one of the ways I like to do this is when I get into a room and the business team tells me, "this is what we are set to achieve.

Amaka Ibeji:

I ask the question "What else? What else are you looking to achieve? Because that begins the real conversation. Everybody has something pretty to say about their goals, but sometimes there might be embedded notions that you're not saying out loud. You need to be able to read the room and ask "What are we truly trying to achieve? Once you get to the heart of the matter, that is where change starts from; and, once they can trust you enough to let you in on what they are trying to achieve, You can begin the conversation of doing your deep research to ensuring that we can achieve this within the guardrails that you provide for them. At the end of the day, I think you have, through partnership. . .and that's when they begin to think about you even early on from a conceptualization phase, because you bring value to the conversation.

Debra J Farber:

I think that's really great advice; I really do. So, following up on that: as you're bringing innovative products and services to market that bring value to customers - and those could even be internal customers - what are some approaches that you've seen that make privacy and innovation a win-win rather than a zero-sum game? Obviously, this is privacy-by-design language here. What are some of the ways you would suggest we could do that?

Amaka Ibeji:

I always say that humans are very innovative and creative. I don't go into a room with the notion of being the solution provider; I lead the conversation. The reason I say that is that the same creative ability people use in bypassing controls can be channeled into complying with those controls, if they see the bigger picture - if they see how much more we can achieve together. And so, when I get into the conversation - of course, I have my Privacy by Design principles in hand - I'm not just going to read them out; they're going to be embedded in my conversation and in how I lead the team. When I say you should go in wearing the hat of the coach, it's not just asking deep questions; I'm leading the team on a journey.

Amaka Ibeji:

When we talk about things like privacy by default, we want to understand: Have we accounted for the users that will be using this application or this product? That's the first question. Do we have outliers that we may not be accounting for? Because we can be in a room where we're accounting for 90% of the users who will be using this application. What if there are outliers? How do we account for what might be of interest to them? When we have this conversation, we come back to the drawing board and say: This is what we know as of today. This is how we're going to design the product.

Amaka Ibeji:

However, we will make this product customizable so that, if there is a population we're not accounting for, we're empowering them to make adjustments as they see fit. So, it's all about coming in with an open mindset to design, but also with some understanding of who the typical user will be, while also thinking about who the outliers might be for this application and what is important to them. One more thing: we might know people's interests as of today, but context changes over time, even for the typical user. We should also ask ourselves: when their contexts change, how are we empowering them to ensure that this product still serves their needs at all times? That is the heart of the conversation, and that is the heart of designing the product that goes to market. That's where I come in from, and that's the kind of conversation I love to lead.

Debra J Farber:

I think that is what we need more of, for sure. It's part of culture change to understand that privacy professionals, especially privacy engineers, are there to empower teams to make choices that meet that customer obsession. We're not here to detract from a product or say no. We're here to help companies achieve their innovation while still meeting the needs of a broad swath of customers - not just a main persona, but the edge cases - and thinking about the future, too. I like what you said about making it customizable in case there's something you hadn't considered. Awesome. So - and this is a broad question - are you noticing any trends in the privacy engineering space? This could be broad or applicable to a particular product you're thinking of, but what stands out to you in the world of privacy engineering?

Amaka Ibeji:

I think we've talked about this even on this call, but I would say it again: yes, the growth of privacy-enhancing technologies is one area that excites me, and the number of players in the space is one to watch, because we're beginning to see the productization of privacy-enhancing technologies. So, I think this is the right time for privacy engineers not just to look out for the products coming, but to understand the core concepts; because, at the end of the day, designing a solution is not one-size-fits-all.

Amaka Ibeji:

You may have to use some combination of them. So, understanding the core concepts of privacy-enhancing technologies and looking out for the vendors and players in the space is amazing. In fact, last night I was reading an article about the impact of PETs on business, individuals, and society, and one thing that caught my attention was a quote from the article that says, "The PETs market is expected to reach a value of $25.8 billion by 2033." There is a lot of movement in that space and, to me, it's truly exciting; I'm watching that space very closely.

Debra J Farber:

I'm just as excited. I think there's a lot there. I've covered a lot on this program around privacy-enhancing technologies used in the data science space. That's clearly unlocking the ability to use data that has previously been treated like plutonium - you've got to lock it down and can't use it because doing so would violate privacy. Right? But these privacy-enhancing technologies enable the use of that data for analytics while preserving privacy. There are also other use cases, right? There are privacy-enhancing technologies used to prevent confidentiality leaks, and there's masking in product testing and things along those lines. What are some of the privacy-enhancing technologies you're most excited about?

Amaka Ibeji:

A ton of them. In fact, I just posted something on LinkedIn saying that I'll be talking about privacy-enhancing technologies in the days to come. Differential privacy is exciting: releasing analytics in a way that cannot potentially identify an individual in the data set. That's exciting from a data release point of view, especially when we're looking at queries.
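To make the differential privacy idea concrete, here is a minimal sketch of the Laplace mechanism in plain Python. The records, the predicate, and the epsilon value are hypothetical stand-ins, and a production system would also enforce a privacy budget across repeated queries; this only illustrates noise calibrated to a counting query's sensitivity of 1.

```python
import random

def dp_count(records, predicate, epsilon=1.0):
    """Counting query released with the Laplace mechanism.

    Adding or removing any one person changes a count by at most 1
    (sensitivity = 1), so noise drawn from Laplace(scale=1/epsilon)
    hides any individual's presence in the data set.
    """
    true_count = sum(1 for r in records if predicate(r))
    # The difference of two Exp(epsilon) draws is Laplace with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Hypothetical query: how many users opted in, without exposing anyone.
users = [{"opted_in": True}, {"opted_in": False}, {"opted_in": True}]
print(dp_count(users, lambda u: u["opted_in"], epsilon=0.5))
```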

Amaka Ibeji:

Another exciting one for me is homomorphic encryption and the way it's designed, where we can send an encrypted query against a secure data set without having to reveal that data set to external actors, while still enabling them to get the insight they want. We've seen a lot of this across organizations, with different organizations coming together. Think about law enforcement wanting to query a financial database owned by a bank, to pick up the activity of one user. Rather than letting the financial institution in on the fact that they're looking for information on Bob, for example - because that in itself is a privacy leak - they can encrypt the query for Bob, run it against the financial institution's database, and get the result back in encrypted form, so that only the searching party learns the insights they were after.

Amaka Ibeji:

Now, this is also something we should consider internally. For example, if there's litigation and employment legal wants to query a data set, they shouldn't have to pass the name to an analytics team or a product owner to get insights about that employee. They should be able to perform an encrypted query and get the insight back, so that that piece of information stays with employment legal, instead of saying, "Hey, can I get the information on Bob?" - because that alone leaks that something is going on with Bob. So, this is really exciting, not just across organizations, but also internally. Think about employee surveys: currently, we rely on a lot of confidentiality promises around employee surveys. How about we take away the trust component and ensure that we can get deep analytics without revealing who said what? There are a lot of exciting possibilities, not only across organizations but within them, especially for organizations that are large and scaling.
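As a sketch of that encrypted-query idea, here is a toy example built on the open-source python-paillier library (`phe`), whose Paillier cryptosystem is additively homomorphic. The account table, the values, and Bob's row index are all hypothetical, and real deployments would use purpose-built private information retrieval or fully homomorphic encryption schemes for efficiency; the point is only that the data holder computes on ciphertexts and never learns which row was requested.

```python
# pip install phe  -- python-paillier, an additively homomorphic scheme
from phe import paillier

# Hypothetical server-side table; the values and the queried row are private.
balances = [1200, 560, 9800, 40]   # suppose index 2 is "Bob's" account
bob_index = 2                      # known only to the querying party

# Client: generate a keypair and encrypt a one-hot selection vector.
# Paillier encryption is randomized, so Enc(1) and Enc(0) look alike.
pub, priv = paillier.generate_paillier_keypair(n_length=1024)
selector = [pub.encrypt(int(i == bob_index)) for i in range(len(balances))]

# Server: compute sum_i Enc(sel_i) * balance_i over ciphertexts only.
# By additive homomorphism this equals Enc(Bob's balance), yet the
# server never learns which row was selected.
result = selector[0] * balances[0]
for enc_sel, bal in zip(selector[1:], balances[1:]):
    result = result + enc_sel * bal

# Client: only the private-key holder can read the answer.
print(priv.decrypt(result))        # -> 9800
```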

Debra J Farber:

I think that's great, and those are some real key examples; I appreciate that. Besides our podcast shows - and we're going to talk about yours in the next question - what other resources (books, newsletters, Slack communities, anything like that) would you recommend to those interested in privacy engineering?

Amaka Ibeji:

Yeah. You know, I'll start with a quote. I don't even know where I heard this, but it says, "A professional never gets bored with the principles." So, I'd say start with the Privacy by Design principles. Have them close to heart and let them mean something for you, even in your design approach. One book that I really like, and I think it's free, is "Privacy Design Strategies" by Jaap-Henk Hoepman.

Amaka Ibeji:

I hope I'm pronouncing that correctly. That's an amazing resource. This is also the right time to begin exploring privacy-enhancing technologies; there are a ton of materials out there. For me, one of the commitments I'm making is to post on LinkedIn fairly daily about these concepts as I encounter them, to drive quality conversation - to understand the challenges and also empower those who are coming behind us to take this and build robust applications. At the end of the day, no organization exists in a silo. We are part of an ecosystem, and enabling others across different organizations to develop robust applications is as important as the applications or products we build within our own organization. So, amplifying this work is something that I'm taking up as a challenge, and it's been very inspiring coming on this journey so far.

Debra J Farber:

That's awesome, and I'll put a link to Hoepman's "Privacy Design Strategies" book in the show notes. I also want to do a call-out for Jason Cronk's book, "Strategic Privacy by Design," which takes Hoepman's privacy design strategies - which give you approaches and tactics based on your use cases (are you going to minimize? are you going to separate?) - and addresses the privacy challenge from a strategic perspective as you design your products and systems. Jason does an amazing job in "Strategic Privacy by Design, Second Edition" of educating, taking that work and the work of other academics, and making it practical: how you can actually approach this in your organization. So, I'll add a link to both of those in the show notes.

Amaka Ibeji:

Maybe I can add two resources here that would be very helpful. While I was speaking, I talked about 'what can go wrong,' and addressing what can go wrong is really about understanding the privacy threat landscape. One very amazing resource is the LINDDUN framework. The other one, as we begin to step into the world of AI, is PLOT4AI, which offers questions we can use to drive these conversations and to understand what kind of answers we're getting and what we can do with those answers. I think those two resources will be amazing, and I can send the links after this.

Debra J Farber:

That'd be great, and those are two topics we've covered in detail on the show. We've had Isabel Barberá talk about PLOT4AI, which was kind of an extension of the LINDDUN framework but applied to AI systems, so it's a little broader than privacy. And then, for the LINDDUN framework, we've had Kim Wuyts on, who was a real driver of that academic work before she moved on - I think she's at PwC now. LINDDUN is now the premier privacy threat modeling framework; I think everybody should go check it out. Those are two great resources, so thank you for highlighting them. Let's talk about you - let's talk about your new show, the PALS Parlor Podcast. What motivated you to launch it, and who is it for?

Amaka Ibeji:

It's all about amplifying the conversations we're having. PALS is an acronym that pulls together my experience and interests over time: Privacy engineering, AI governance, Leadership, and Security. It was exciting that I could come up with this acronym because, when I speak to folks, I always want to speak to people as pals, as friends, and amplify this conversation. When I launched the PALS Parlor Podcast, one of the audiences I had in mind was business leaders: taking technical concepts and breaking them down so business leaders can really engage and see what they can unlock within their organizations - enabling business leaders to do more with the data they have, within reasonable guardrails. That was my intent. The second set of folks is peers and pioneers in this space: how can we come together to challenge our thoughts, improve our thinking, and arrive at more creative techniques that we, in isolation, haven't thought about? So, that was really my motivation for getting started with the podcast.

Debra J Farber:

Well, that's awesome and, as one of your peers, I'm delighted to collaborate with you in the future. There are only a few privacy engineering-focused podcasts, and my goal with this show, Shifting Privacy Left, is to help build community and grow a followership out of a desire to serve the community, not for any personal ambitions larger than that. I want to serve the community of privacy engineers, help it grow, and be an evangelist. I just think that's a perfect role for me - you know, Debra, the evangelist for privacy engineering. Right? I think that's great.

Debra J Farber:

I think it's a perfect place to come from: a desire not only to share knowledge, but also to provide a platform and a way for people to come together and combine efforts to really make an impact in our own organizations and in the industry at large. So that's awesome. What has your experience been so far? How many episodes have you recorded and published, and what's the future - where do you see this going for the rest of the year?

Amaka Ibeji:

So far, we have four episodes out - a combination of solo episodes and episodes with guests. When I talk about the PALS Parlor: growing up, we always called the living room 'the parlor' - the place where we host guests, but also where family members and friends meet and have conversations. So, there will be a combination of me talking solo about concepts and bringing in peers and pioneers in the space to have the conversation. We'll continue that momentum for the rest of the year, hoping we can answer the deep questions people have, because the show is also a reaction to the questions I get from my LinkedIn daily posts. One of the top ones is from people within security who want to do more in the privacy space and ask what resources they need to make that leap. So, those are the sorts of questions we'll be answering, while also addressing concepts within the privacy space.

Debra J Farber:

I think that's great, especially since we were just talking about the LINDDUN threat modeling framework, and most threat modelers out there are in security - that's really where threat modeling grew up. From threat modeling you get red teaming and pen testing and all the different tests you can do, so it'd be great to really educate the security community at large, especially those who have that threat actor - basically, hacker - mindset: how can you exploit, so that you can prevent it? Right? My fiancé is a hacker himself. [Amaka: Oh, wow!] Yeah! He works on bug bounty programs and such, so I totally see the value of getting a lot of security folks to understand: hey, there are not only new threats coming from AI, but there's this whole field of privacy. I think our challenge in getting hackers to start threat modeling for privacy is that there really hasn't been a cohesive set of "Top 10" vulnerabilities for privacy. Right? There have been lofty goals, like 'you don't want a breach,' but those aren't things you can really test for at a technical level.

Debra J Farber:

So, I do know there are some folks thinking about working with OWASP, or about how we could get a set of vulnerabilities that are truly privacy vulnerabilities that we could then, as a community, vet for and threat model for. I think LINDDUN is a great place to start. It really is kind of a vulnerability model that allows for threat modeling, where you can see the result of a threat being exploited - 'oh okay, this is the problem.' Right? It really works as a vulnerability standard as well, so it'd be great to work backwards from it to the specific things we could test for. Do you have any thoughts around that? Have you seen anything there, or been thinking about that?

Amaka Ibeji:

I like that you explained it that way, because one of my goals is really to build a community of boundary spanners. We always say that you cannot have privacy without security. We want a situation where we have boundary spanners: people who understand the security domain but also have a rich understanding of the privacy domain, especially from the threat actor's perspective. We don't want people who are silo thinkers. Of course, silo thinking enables expertise, but we need boundary spanners who can understand the big picture; and that's why we're driving this conversation to reach people who don't necessarily call themselves privacy engineers but understand the deep concepts and appreciate all of that. With the work you just mentioned around the OWASP Top 10, it would be great if we could have something similar in privacy, or a combination.

Amaka Ibeji:

I always lean towards an integrated approach; that's always very important. We save time; we make it seamless. When we want to do a review, we have the security folks and privacy folks in the room having the conversation, and no one is lost in it, because there's an appreciation of the threat actors. We're context switching from external threat actors to internal threat actors, and there's respect for that across the board.

Debra J Farber:

That is true. That is awesome; I love that. And, again, the more we can collaborate - or get others listening to this to collaborate - on that effort, the better off the industry will be. So, speaking of collaboration, what's the best way for folks to reach out to you?

Amaka Ibeji:

The best way to reach out to me is on my LinkedIn page. I always say I respond faster to LinkedIn DMs than I do to my email.

Debra J Farber:

So do I! Oh, that's awesome. And I do want to point out: on a daily basis, you've got this goal of posting at least one educational privacy engineering post. Right? Do you want to talk a little bit about that?

Amaka Ibeji:

Yes. I thematically think through what I post. For example, we talked about the LINDDUN framework for about a week before we moved into PLOT4AI - like you rightly said, because PLOT4AI builds on the LINDDUN framework. So, I post thematically, and I try to do it daily so that it's bite-sized. For the coming week, I'm looking at privacy-enhancing technologies. Then there will be a sequel to that; for example, we could talk about how privacy-enhancing technologies can impact data breaches.

Amaka Ibeji:

With the SEC disclosure notification requirements - that's a hot topic I'm also interested in - and also partnering with board cybersecurity to look at the disclosures, the 8-Ks and the 10-Ks: how can we begin to look at them, and what do they even mean for Chief Privacy Officers and privacy engineers? Those are areas we're going to be unpacking in the future, along with risk management: What does it mean to have found this risk? What is the risk response supposed to look like, so that privacy engineers are not overwhelmed by finding risks and not knowing what to do next? So, I like to organize my posts thematically in a way that makes sense, so people can always go back and follow the sequel. I'm really excited about those.

Debra J Farber:

That's really cool. Anything else that you'd like to plug or say to this audience before we close - besides "everybody, go check out the PALS Parlor Podcast"?

Amaka Ibeji:

Yes, I'd like to close with something I learned from Brad Lee of Privatus Consulting: "Privacy is a wicked problem." It's a systemic problem, so we need systems thinking to address privacy problems; when we think about it, we need to ensure that we're thinking with a systems mindset and building robust applications. One way to address a systemic problem is by collaborating deeply, bringing relevant stakeholders to the table, and ensuring that the solutions we come up with work within all of the resources and constraints we have at the time, to get to a good-enough solution. I think that will be awesome.

Debra J Farber:

That's excellent. Amaka, thank you so much for joining us today on The Shifting Privacy Left Podcast. Your passion for the space and your enthusiasm really shine through, and I'm excited to listen to the PALS Parlor Podcast, hear more of your wisdom, and actually meet you in person someday soon. We're both in the same state - Washington State - and I hope our paths cross in person, because I think your passion matches mine and I'd love to collaborate somehow.

Amaka Ibeji:

Thanks, Debra, for having me. This was so much fun and a pleasure to be on your show.

Debra J Farber:

Oh, thank you. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.


Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

  • The AI Fundamentalists - Dr. Andrew Clark & Sid Mangalik
  • She Said Privacy/He Said Security - Jodi and Justin Daniels
  • Privacy Abbreviated - BBB National Programs
  • Data Mesh Radio - Data as a Product Podcast Network
  • Luiza's Podcast - Luiza Jarovsky