The Shifting Privacy Left Podcast

S2E38: "PrivacyGPT: Bringing an AI Privacy Startup to Market" with Nabanita De (Privacy License)

December 19, 2023 Debra J. Farber / Nabanita De Season 2 Episode 38

My guest this week is Nabanita De, Software Engineer, Serial Entrepreneur, and Founder & CEO at Privacy License, where she's on a mission to transform the AI landscape. In this episode, we discuss Nabanita's transition from Engineering Manager at Remitly to startup founder; what she's learned from her experience at Antler's accelerator program; her first product to market, PrivacyGPT; and her work to educate Privacy Champions.

Topics Covered:

  • Nabanita’s origin story, from conducting AI research at Microsoft as an intern all the way to founding Privacy License
  • How Privacy License supports enterprises entering the global market while protecting privacy as a human right
  • A comparison of Nabanita's corporate role as Privacy Engineering Manager at Remitly versus her entrepreneurial role as Founder-in-Residence at Antler
  • How PrivacyGPT, a Chrome browser plugin, empowers people to use ChatGPT with added privacy protections and without compromising data privacy standards by redacting sensitive and personal data before sending to ChatGPT
  • NLP techniques that Nabanita leveraged to build out PrivacyGPT, including: 'regular expressions,' 'part-of-speech tagging,' & 'named entity recognition'
  • How PrivacyGPT can be used to protect privacy across nearly all languages, even where a user has no Internet connection
  • How to use Product Hunt to gain visibility around a newly-launched product; and whether it's easier to raise a financial round in the AI space right now
  • Nabanita’s advice for software engineers who might found a privacy or AI startup in the near future
  • Why Nabanita created a Privacy Champions Program; and how it provides non-privacy folks with recommendations to prioritize privacy within their organizations
  • How to sign up for PrivacyGPT’s paid pilot app, connect with Nabanita to collaborate, or subscribe to "Nabanita's Moonshots Newsletter" on LinkedIn




Nabanita De:

I think knowing the space is very helpful in talking to customers. So, like you mentioned, if somebody has a personal story there that they felt a need of a problem and then they're solving that, I think it becomes so much easier when you're interacting with, maybe other people in the space, because you intuitively understand what they're saying, what kind of problems they have. You can ask better questions in your product discovery calls. So, I would say that's an edge that engineers in this space who already have done some privacy AI work would have when they think about starting their startups, because, first of all, they have the network to tap into and talk to. Second of all, they intuitively understand how these different systems work - what kind of issues they have seen. Overall, I think tying that to your personal mission, tying that to what you're trying to solve and then building that out, I would say that's also an excellent point that you bring up. That would be a great start as well.

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to Shifting Privacy Left. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest: Nabanita De, software engineer, serial entrepreneur, and now Founder and CEO at Privacy License.

Debra J Farber:

This woman is just so impressive!

Debra J Farber:

In addition to a successful career trajectory, having worked previously in engineering at Microsoft, Uber, and Remitly, Nabanita is a powerhouse leveraging technology to solve some major world problems. She founded NabanitaDeFoundation.org, which assists people in making a return to work after a career break across 102 countries with $0 spent on marketing, for which she was awarded a Fast Company World Changing Idea Honoree Award. She also founded CovidHelpForIndia.com, which streamlined COVID-19 resources into one location with a 35-country reach, and she previously founded FiB in 2016, an open source fake-news-detecting Chrome extension for Facebook, which earned her the Google Moonshot Prize at a Princeton University hackathon. She's also won so many hackathons I couldn't even fathom to list them all here. Today, we're going to discuss Nabanita's new startup, Privacy License, whose mission is to transform the AI landscape by seamlessly integrating privacy-by-design and AI governance into data systems. Its first product to market is Privacy GPT, which we'll talk about in detail, and the need to educate and grow more Privacy Champions is something that we'll be talking about as well.

Nabanita De:

Hi Debra! Thank you so much for this wonderful introduction.

Debra J Farber:

Absolutely. I have so many questions for you. You've done so many things in your career and you're still - I mean, in my opinion, compared to me, you know - a young professional with so much ahead of you. So, to kick things off, can you give us a little overview of your origin story and how your life experiences led you to today, where you've founded an AI privacy startup?

Nabanita De:

Yeah, absolutely. I think the good way to describe it would be, like, privacy found me instead of the other way around. So, I actually started at Microsoft, where I was a Software Engineer and also an AI Researcher. I did AI research at Microsoft Research and I actually built a project which led to inferring location from text without any location indicators in it, and I thought that was my first step towards, like, something in privacy. And then, I joined as a Software Engineer at the Redmond campus at Microsoft, where GDPR had just come out around that time in 2018.

Nabanita De:

I was doing the end-to-end data lakes and machine learning work and moving terabytes of data from on-prem to cloud, and I took part in some data anonymization and tagging and finding data of these sorts in big data lakes within Microsoft at that time.

Nabanita De:

I then moved to Uber where, as a Security Engineer, I sat in the Cloud Security team, but I collaborated directly with the Privacy Engineering team and led some company-wide data classification and DLP efforts to ensure that Uber's cloud infrastructure was compliant with privacy laws like GDPR and CCPA. I then moved to Remitly as an Engineering Manager for Privacy Program and Privacy Engineering and led all of that stuff. And then I feel like, working across all of these different companies of different scales and in different roles, I realized that there are common challenges that companies of different sizes are trying to solve; and yet, there's no one particular solution to that. And that's where I was like, oh, I do have a master's in AI; I have all of this privacy experience. I should absolutely be the right person to be building a privacy and AI startup.

Debra J Farber:

I love it. I love it. Okay, so tell us about Privacy License. What's its mission? What are your future plans with the organization?

Nabanita De:

Yeah, I think my overall goal is to ensure that, with all of these different laws and regulations that are coming out today - I think more than 130 countries have their own data protection laws and there are over 160 different privacy laws - which means that if companies want to enter global markets, they would have to have some sort of privacy license to operate there, which means complying with those laws and building out appropriate privacy systems internally to facilitate that. That's where the mission is to empower these companies to not only enter these global markets but also ensure that privacy is protected as a human right for their consumers and employees.

Debra J Farber:

So, it's both aimed at consumers... that you'd be selling to consumers as well as to enterprises?

Nabanita De:

The initial goal is to sell just to enterprises, but in the process of doing so, I think consumers play a huge part in interacting in the space of privacy by, for example, requesting their data in the form of Data Subject Requests or understanding how they can incorporate privacy. So, it would have the educational aspect of it as well, but our primary focus is to sell to enterprises.

Debra J Farber:

Yeah, that makes sense, and I do want to point out to the audience, from what I know from talking with you, that with Privacy License, you plan to have multiple products under the brand, or the organization, of Privacy License, with the first product that came out being Privacy GPT, which we'll get to in a little bit. That's my understanding. Correct? You plan to add different apps and tools?

Nabanita De:

Yes, absolutely. I'm planning to build an ecosystem of privacy apps that would be part of that platform, which companies can leverage for different things that they would be working on. For example, to serve these apps appropriately, we'll be building a tool to do that. Then, we'll be looking into consent management - and all of that is part of the roadmap. But, our first entry point into the market was building out this Privacy GPT, like you mentioned, a privacy firewall for Chat GPT. So, once you download this Chrome extension, you go on Chat GPT and you type something, and it will redact the sensitive information (i.e., PCI, PHI, PII) from 62 countries without the data ever leaving your browser. So, yeah, you can use Chat GPT without worrying about multiple privacy concerns.

Debra J Farber:

Yeah, I love it and we're going to definitely deep dive into that. I did want to ask you first, though, so that we don't get too deep into talking about Privacy GPT without me asking: what was it like switching from your Engineering Manager role at Remitly, focused on Privacy Program Management as well as Privacy Engineering, to now building a startup? That's got to be different; the external pressures, the demands from investors, the excitement of building something new. Tell me a little bit about what it's been like for you.

Nabanita De:

Yeah, actually, you know, surprisingly it was very similar, I would say, because I think when I joined Remitly I was sort of doing the zero-to-one stage for technical privacy and then, I think, having those experiences of, like...

Nabanita De:

so, my role was unique in the way that I sat between our Legal / GRC and our Technical Teams across the organization, sort of translating from Legal / Regulatory to tech roadmaps and then back from the tech people to the Legal and GRC folks.

Nabanita De:

I think sort of having those experiences and going into startup land. One thing I did at Remitly is sort of understand the jobs to be done across the organization - you know, what machine learning teams, finance teams, IT, or, like, what different teams want out of privacy - when I had joined Remitly initially. It's a very similar framework when I went into the startup; I think the first step is like finding Product-Market Fit.

Nabanita De:

You sort of go into the industry. You talk to a lot of different people. You try to understand: what are these jobs to be done that ideally will solve the big challenges in privacy? What are the bigger drivers? And sort of having those conversations, I feel like it was very similar, so I did not feel like I was doing something different. As for being at Antler, it was an excellent experience. It's very similar to your traditional accelerators like YC or Techstars, but Antler is like pre-idea, pre-seed, pre-money, so they are like zero-day funding agents. [Debra: Let's say "day one" instead of "zero day," just for all the security implications.] Yes, day one, a day-one accelerator.

Nabanita De:

So, I guess it was great, in a way, that I knew the privacy side of stuff, but maybe I still learned a lot about the business side of things. Like how do you think about a business plan? How do you write a deck? How do you think about privacy, maybe from a first- principles approach? How do you think about go-to-market? How do you think about building, like pipelines and consumers, and how do you basically do this process?

Nabanita De:

From a business standpoint, I think that was something I had done, but I never had, like, this formal coaching before. In the previous startups that I've been successful at, I sort of tried different things and saw what worked and what didn't. It wasn't like I knew exactly what needed to happen, versus here, I think, in hindsight, I could look back and be like, "Oh, I did this one thing, I remember, in my previous startup, but essentially this is how successful entrepreneurs do XYZ things." So I think that way it was very helpful to be in that program and have excellent mentors guide you constantly.

Debra J Farber:

That's awesome. I hope to see in the future some of these prestigious accelerator programs, you know, maybe focused on privacy or privacy in AI or, you know, as opposed to just startups generally, so you could just kind of pump out more ethical tech. I don't have that money to invest, but I'm hoping somebody out there does in the near future because there's definitely need for it. I'm so glad that you've got that coaching. Now, let's talk about your first product to market. Let's talk about Privacy GPT - a super sexy name because, you know, obviously Chat GPT is still all the rage in the public consciousness around generative AI. You described it earlier as a "privacy firewall for ChatGPT prompts."

Nabanita De:

Yeah, I think Chat GPT in the last year - I think they've grown to over 100 million users. Every day there are exabytes of data just in prompts that are being sent into Chat GPT that, you know, people are adding to it. But, when you think about it from a privacy standpoint, there are so many different concerns. Right? For example, you're a doctor, maybe, and you're typing... like, maybe you're using Chat GPT to get something quickly, but in the process there might be some patient information you're typing into it. You can imagine, from a HIPAA standpoint, that's not something you're supposed to do. You could be somebody sitting in a consultancy and you're trying to summarize a big doc that you're trying to write, and in that process you might have given away some of your employees' confidential data into Chat GPT which, again - based on privacy laws - you're not supposed to do; you can't use anything which you don't have consent for. There are so many privacy-related concerns in the way people are using, you know, Chat GPT, even with the use cases for it being an excellent, like, productivity tool.

Nabanita De:

So, by default, so many companies have, you know, banned Chat GPT within their organizations because they haven't figured out a right way to make this utility tool work. So my thought process there was that I do want people to use tools like ChatGPT or other LLMs that are out there; but can I empower them so that while they're using this tool, they do not compromise the data privacy standards that are required to use this tool properly? So, that's where I sort of built out this Privacy GPT which, like I mentioned, redacts PCI, PII, and PHI for free at this time from 62 different countries, and also secrets in code, so that way people can try it out and use Chat GPT with a little bit more privacy protection than they had before. Then, we are also releasing a paid pilot which will have much more advanced technologies in it to redact sensitive information while maintaining context, and then have companies across the world maybe use this tool to empower their employees to use productivity tools without compromising privacy.

Debra J Farber:

It's a noble effort for sure. I love it and, since this is a technical audience of privacy engineers here, can you describe a little bit about how Privacy GPT works in practice? How did you architect this?

Nabanita De:

Yeah. So, the way it works right now is it entirely sits on the browser side for the user. It's a Chrome plugin. Once you download it from the Chrome store - Privacy GPT - and you add it to Chrome, and you go to chat.openai.com and you type something and you click "hit send," it doesn't send to Chat GPT; it first goes to the browser plugin that you have downloaded, Privacy GPT, which is sitting on your browser. Then, basically, we have built out, like, a state-of-the-art algorithm to detect this sensitive information and then redact that on the browser side, and then also give people the option to redact individual things or, you know, un-redact, because there might be some use cases where there are false positives there.

Nabanita De:

We have also added that option to do so. So, you can imagine if some of these are a name, a location, whatever - what it does is it finds these things. Then, it replaces a name with the word "name," so it maintains the context. That way, when you're using tools like Chat GPT, your context is still maintained but your sensitive data has been redacted at the same time. And likewise, we do the same thing for multiple categories - I mentioned secrets in code, then name, email ID, address, phone number, SSN, licenses of different countries. All of these things are included as part of that - that people can use and leverage right now - and the way it's done in the back end is basically we have leveraged some machine learning and natural language processing techniques to do this redaction, and that's how it's working right now.
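To make the flow Nabanita describes concrete for the privacy engineers listening, here is a minimal Python sketch of the redact-with-placeholders idea. The patterns, category names, and function are illustrative assumptions only, not her actual implementation - the real extension runs browser-side, covers many more categories across 62 countries, and uses its own detection algorithm:

```python
import re

# Illustrative patterns only; the real product covers PCI, PHI, and PII
# formats from 62 countries, plus secrets in code.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(prompt: str):
    """Replace each detected value with its category name so the prompt
    keeps its context, and return a mapping so individual redactions can
    be reversed (the 'un-redact' option for false positives)."""
    mapping = []
    for category, pattern in PATTERNS.items():
        for value in pattern.findall(prompt):
            mapping.append((category, value))
            prompt = prompt.replace(value, category)
    return prompt, mapping

redacted, found = redact("My SSN is 123-45-6789; email me at jane@example.com.")
print(redacted)  # "My SSN is ssn; email me at email."
```

The key design point in the interview is that this substitution happens before anything leaves the browser, and that placeholders like "name" or "ssn" keep the prompt readable for the model.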

Debra J Farber:

That's awesome. So... and forgive me, since I'm definitely more familiar with data discovery methodologies, but that's typically when you're looking across multiple data stores and you're trying to figure out what is personal data, what is sensitive data, right? So, for something like this, where it's more of a... almost like a DLP or, as you say here, a 'firewall,' you want to only allow through the non-personal data, non-sensitive data that you're sending to the training set. I guess I want to better understand how you're figuring out whether something is personal or not. Is it like regular expressions? Are you using something beyond that type of matching? And, I don't know as much about NLP, so I'm just going to be up front here with you - feel free to just say, "No, that's a secret."

Nabanita De:

I think there is a mix of multiple things - there are regular expressions; there are also very commonly established, state-of-the-art natural language processing techniques, like 'part-of-speech tagging,' where you can know what part of speech each word is - like, what is a noun, what is a verb. And then there are also 'named entity recognizers.' There are multiple things within NLP that we leveraged to sort of achieve this at this time.
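As a rough illustration of how those three techniques complement each other (a generic sketch using spaCy, not PrivacyGPT's actual code), regular expressions catch rigidly formatted identifiers, while named-entity labels and part-of-speech tags catch free-text items like names and places:

```python
import re
import spacy

# Assumes the small English model is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def detect_sensitive(text: str):
    findings = []
    # 1. Regular expressions for rigidly formatted data (phone numbers, SSNs, ...).
    findings += [("phone", m) for m in PHONE.findall(text)]
    doc = nlp(text)
    # 2. Named entity recognition for people, places, and organizations.
    findings += [(ent.label_, ent.text) for ent in doc.ents
                 if ent.label_ in {"PERSON", "GPE", "ORG"}]
    # 3. Part-of-speech tagging flags proper nouns the NER model missed.
    findings += [("PROPN", tok.text) for tok in doc
                 if tok.pos_ == "PROPN" and not tok.ent_type_]
    return findings

print(detect_sensitive("Call Priya Sharma in Seattle at 206-555-0182."))
```

Again, this is only meant to show how the techniques layer together; the categories, model choice, and rules in the actual product are Nabanita's own.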

Debra J Farber:

That's awesome. So, I hadn't told you this yet... this shouldn't be too much of a surprise, but my other half is an offensive security guy. Right? He's a hacker; he works for Yahoo on their Paranoids team. When I mentioned - I'm sitting there like, "Oh, this is cool. Nabanita De just came out with Privacy GPT" - he was like, "I want to take it apart and see what's there. You know, I want to go in and see what she built and see what I can find." Right? And so, not only did he not find any security issues with it - so I'm not surprising you on this public show, you know, with anything terrifying.

Debra J Farber:

What I wanted to call out was that he found it difficult to try to even find how it was structured but ended up saying that he thought it was impressive for how many countries you cover, basically - that you've got very large coverage with the algorithm that you've written, based on what he was able to detect. I thought that was actually really exciting. It sounds to me like you're not just coming to market in the United States, but you were thoughtful about how this could be used across the world. Is that how you would categorize it?

Nabanita De:

Yeah, absolutely. Thank you.

Nabanita De:

First of all, thank you so much for sharing this.

Nabanita De:

I think the thought process was that, because I'm a privacy professional (even if I'm on the startup side, on the industry side), at the end of the day I want to prioritize user privacy. My thought process there is like, there is so much talk around privacy and how you can safeguard privacy, but can we empower general people first and sort of give them a window into what they could do and empower them with these kinds of tools?

Nabanita De:

So, right now, another thing that I haven't mentioned is Privacy GPT also works if you don't have access to the Internet because, again, the data never leaves your system; it entirely sits on the browser side, so you can use it when you may be in transit and you want to quickly put something in and redact that. It also works in that aspect. The second aspect from the privacy standpoint is also... that I know that when data is truly anonymized in that sense, then different privacy laws like GDPR do not really apply. That's another way from that standpoint as well.

Debra J Farber:

I think that's really thoughtful and I think it's also indicative of, well, your past experience, where you've had to work across so many different countries to get your efforts and your foundations to hit the maximum benefit of people. So, do you plan to build a similar product around other LLMs too, or is it going to be around ChatGPT specifically and that's it for LLMs?

Nabanita De:

No, I think we actually got a lot of great feedback when we launched on Product Hunt and we were Top 4. A lot of people wanted us to build around multiple LLMs, multiple browsers, add more features into it. So, we are looking into all of those things, and that actually would form part of the paid pilot process that we are launching. So, if anybody's interested, please feel free to reach out to me. And then, as part of building more products - the space of detecting sensitive information, I think, plays into so many different areas within privacy. So, now that I have this algorithm built out where I can accurately detect sensitive information, I plan to leverage that in many other spaces, like data inventorying, and eventually leading that to doing DSARs appropriately and things like that. So, there are multiple other use cases that this algorithm will sort of federate into.

Debra J Farber:

Okay, awesome. Let's talk a little bit about what it's like to come to market. You mentioned Product Hunt. I see that Privacy GPT has ranked 4th globally on Product Hunt. Congrats, first of all, on having so many eyeballs on the product and then, of course, votes for it; I do have a few questions about that. So first, can you tell us a little bit about Product Hunt and how startup founders leverage it to prove market interest? And then, I think most of the products there are consumer products, but not all of them necessarily. So B2B privacy and AI folks, like myself, could benefit from learning a little more about how Product Hunt is used, if you don't mind.

Nabanita De:

Yeah. So, Product Hunt is essentially a platform where people launch their products to an audience of people who generally use multiple sorts of products. Ideally, it's like a voting-and-ranking place where, once you launch the product, there's a community of people who are looking through the products of the day and then, if they resonate with what you have built, they'll upvote your product, leave comments, and reach out to you.

Nabanita De:

So, it's a great place to sort of put what you have built out there to see if there is a general market need for that; and you can also specify what you're building. I have seen some B2B products also go onto Product Hunt because, essentially, a lot of the different Fortune 500 company folks also go on Product Hunt to look at cool products of the day and then maybe even reach out if they saw something that resonated with them and they're interested in bringing a pilot out. I would say that this is a great space: if you have a product, like a cool product, that's been built out, you can sort of prove the niche market and release it out. It doesn't have to be consumer-focused, and you can add some demo or something around what your product does and what it's trying to do. Then, that could be a great place to get the word out in terms of what you're building, what you're doing; and if people are interested, they could potentially reach out and talk to you.

Debra J Farber:

Awesome. So, do you find that that's typically, you know, like Heads of Innovation or internal business folks that are reaching out, or do potential investors reach out as well?

Nabanita De:

So, I've actually gotten both from our Product Hunt launch. I've had investors follow me and reach out. I've also had, like, general company folks from different Fortune 500 companies also reach out and show interest in our paid pilot - they have actually signed up for the paid pilot through the product; I linked our paid pilot into the Product Hunt page. I would say that a good amount of... that's a place usually where a lot of... at least at Antler, I had seen a lot of people do Product Hunt launches. So, I would say that's something that startups frequently use to launch their products and sort of get some quick feedback from the market, I guess.

Debra J Farber:

Okay, awesome, thanks for that. I'm learning more and more. This has been a crazy market. You know, I've been focusing on privacy tech for the last three years and it's not hard to notice that raising money in this current economy has been a challenge. Do you think it's easier to raise money in the current economy if your product is related to buzz-worthy concepts like AI or is it about the same? Is it really difficult? Tell us a little bit about what the raise process has been like for you.

Nabanita De:

Yeah, I think we have been focused on finding product-market fit and talking to many people and, you know, iterating on our products and building. So, essentially, we haven't really gone into the raise part of the phase as aggressively as we would like yet; but, I would say that just based off of talking to different investors and people and hearing from them, especially in the Antler cohort and at different other events... like New York Tech Week - I also went to that when I was in New York for the Antler cohort - I think just talking to them made me realize that if your startup is actually providing some sort of value and solving a clear niche area where you have a waitlist of people who are interested in buying what you're building, or interested in becoming a build partner or something like that - I think showing that level of traction and having a clear defensibility as to why you stand out, having a clear moat in terms of what is your unique selling point in this space and, you know, what is a unique insight here - having

Nabanita De:

those things is very crucial when you're raising; and having that, I think, makes it easier to raise. Then, I think for early startup founders, that is something they sort of have to go through by talking to many people, building, seeing the traction they have, seeing who is going to use that. So, it's a process to get there. I personally do not feel like just building in privacy or building in AI automatically qualifies somebody to raise. I would imagine that each founder will have to do some sort of due diligence themselves to really put that together in the form of a pitch deck or whatever methods they're using to reach out to investors, angel investors, or VCs. And once they have that and they have that conviction, and VCs on their end are able to do the due diligence and validate that this is a scalable, venture-backable business, I think then it becomes easier to raise.

Debra J Farber:

Yeah, that makes a lot of sense. You know, it's been so hard to raise for privacy tech. I do wonder, though, if pairing that with AI - like, "Oh, we have a privacy tech solution that helps in the AI space" - is kind of a sexier topic, or whether investors are just willing to spend more buckets on the space of AI right now. Well, you know, it's definitely something that we're kind of observing as it's happening, so we can assess that over the next year. What words of advice would you give for other software engineers that might be seeking a transition to founding a privacy or AI startup?

Nabanita De:

Yeah, I would say, especially in the privacy space, I think there's still a lot of education that needs to happen. Something I've observed is I have to really tweak my pitch to my audience in terms of: am I talking to somebody who is in the privacy space and knows about data inventory and DSARs and cookies and all of that stuff? For somebody who is not in the privacy space, if we use the same technical jargon, they are like, "What?" So I guess you have to really think about who the audience is and really figure out a sweet spot so you can sort of simplify what you're really saying and they can understand that better. For software engineers, essentially, I think we bring in a lot of the technical hat and expertise, and something I have personally, like, learned over time is to take off my software engineering and builder hat and start really thinking from a product, from a business standpoint, because at the end of the day, as software engineers, we just want to build and ship and find cool products that resonate in the market.

Nabanita De:

But, when you move over to the startup side, you don't have hundreds of millions and billions of dollars to go spend and experiment. You are running pretty lean. You have limited funding, limited budget, so really prioritize what you are building, what you are doing. In the privacy space, I would say there are a lot of different problems that need solving, and anybody who is interested in this space - first of all, come talk to me; I'd love to talk to you. And the second thing would be to really think deeper in terms of what is the first problem you would solve in this space, because it's a huge space. How would you prioritize it? Then, how do you bring that value to the customers that you're building for? How does it resonate with them? And then, sort of quickly iterate and build through - like, really actively, consciously take off your builder hat and really get into the first principles of the 5 Whys, I guess.

Debra J Farber:

That's really good advice, kind of going back to basics with first principles. I think that makes a lot of sense. I've also seen a lot of engineers, in the privacy tech space at least, really deeply see one particular problem at the company they're at and go, "Gosh, I'm tired of dealing with this problem in my company. If I can make this, maybe I could turn this into a product so that I solve not only the problem for this company, but I could then sell it to others." Right? And that has also been a great launching-off point.

Debra J Farber:

But then, the challenge is, "Well, I'm not necessarily sure I understand the privacy tech market and how the product gets purchased and how..." There are just growth opportunities, I think, no matter how you're jumping into becoming a founder - especially, what I have seen is software engineers that get knowledgeable on privacy in one particular area, like maybe advertising and privacy and the nuances of the ad tech space. Right? Like, "Oh, I know exactly that, I know this tech stack, I know how to solve these problems and this is how we can fix it." But then again, you don't necessarily know all of the problems that different privacy personas might have, and how do you get them motivated to help get buy-in for the product?

Nabanita De:

I think knowing the space is very helpful in, like, talking to customers. So, like you mentioned, if somebody has a personal story there - that they felt the need of a problem and then they're solving that - I think it becomes so much easier when you're interacting with other people in the space because you intuitively understand what they're saying, what kind of problems they have. So, you can ask better questions in your product discovery calls. I would say that's an edge that engineers in this space who already have done some privacy AI work would have when they think about starting their startups because, first of all, they have the network to tap into and talk to. Second of all, they intuitively understand how these different systems work, what kind of issues they have seen. And overall, I think tying that to your personal mission - tying that to what you're trying to solve and then building that out - I would say that's also an excellent point that you bring up. That would be a great start as well.

Debra J Farber:

Great. Okay, so we're getting closer to Data Privacy Day 2024, you know the end of January. So, I definitely wanted to ask you about privacy awareness, especially because I see on your website that Privacy License is building out a Privacy Champions Program. I'm curious, how does creating a Privacy Champions Program fit into your mission for Privacy License, and then what are your goals for this program and how do people join?

Nabanita De:

I think the reason why I created this Privacy Champions Program is I see there are so many people who can benefit from being in the privacy space. Essentially, privacy becomes a thing that is only left to, you know, privacy managers or privacy lawyers, where essentially the entire company should be contributing to some sort of privacy tasks because they do deal with sensitive information. So, my goal behind that program is: can I empower entire companies and different stakeholders in privacy to understand how they contribute to privacy, what tasks they could do, and sort of be the champion for privacy within their individual teams? That way, when your privacy team comes to you and is like, "Hey, I need you to build this for GDPR," you know exactly why that needs to be built, and you can be the champion for privacy in your organization and incorporate, like, privacy by design.

Nabanita De:

To answer the question on how somebody can join... so, there's a link if you go on my website right now - PrivacyOS.ai - there's a link to sign up for the Privacy Champions Program, and you can sign up for that. Once you sign up, we will be reaching out to you very soon in terms of joining the program and your expectations. My goal is to set you up with a community of people who are also in similar spaces as you, and you are, at the end of the day, the Privacy Champion for your team, for your organization.

Debra J Farber:

Yeah, that's great. So, tell me a little bit more about what this Privacy Champions Program looks like. I obviously get the idea that we want to get champions from across different companies - people will join this program. Just tell us a little bit about once they join: what can they expect to learn or to bring back to their organization? You mentioned, for instance, the GDPR and doing DSARs. What does that mean exactly? Is it a matter of you taking requirements from various legislation, like GDPR or CCPA, and contextualizing what they mean, and kind of having a library - I don't mean a software library, I mean just a library of privacy knowledge areas that someone could become more knowledgeable about and that kind of standardizes across companies? Or, are you thinking of something else?

Nabanita De:

Yeah, I think that's something that I am doing in a different space within Privacy License, but I think my goal for the Privacy Champions Program is where somebody who is not in privacy ideally wants to join, and I try to understand their motivation in terms of what they are doing right now, and then basically provide a list of recommendations, or a weekly dose of recommendations, of things that they could do in their organizations to prioritize privacy more. For example, think about data teams, maybe in smaller organizations, where they might be doing their own version of privacy or whatever of that sort. But being able to say, "Okay, if you want to get compliant with, for example, GDPR, it necessitates that each team has their own version of inventory," and that can bubble up to a company-wide inventory, and then you have an accurate, up-to-date RoPA, so your Legal teams are not running after you. So, for me, it's like, for somebody who maybe joins from a Data team, I can be like, "Okay, you can do these XYZ things to be prepared a little bit ahead of time," so that way, when you do come across all of these additional things, you already know why that's happening.

Nabanita De:

So, helping them understand a little bit more about what is different for different customer profiles - ideally, how do you contribute to privacy? How do you think about privacy-by-design? And then also, across the industry, what are other people who are in the same roles as you doing in privacy? So, having that sort of community as well. It's like doing both at the same time.

Debra J Farber:

That's pretty awesome. So what I'm gathering, then, is that a Privacy Champion doesn't just necessarily need to come from, like, a GRC team. It literally could come from any technical team or business team, or just wherever somebody wants to be the champion for privacy and bring back knowledge to their organization, and then this is kind of a place for them to start.

Nabanita De:

Yes, so it could be, like, literally somebody in HR or somebody very non-related, non-technical; but at the end of the day, anybody who touches sensitive data should be thinking about privacy. So, my goal is: how can I empower those individuals?

Debra J Farber:

I love it. I love that you're helping to shape a better world, so thank you for your service. Before we close today, do you have any calls to action for the audience?

Nabanita De:

Yeah, like I mentioned, we have our paid pilot up, so we'll be launching very soon. If you're interested in the paid pilot for Privacy GPT, go on the website PrivacyOS.ai, fill out the paid pilot sign-up, and I will reach out to you individually. I would also love to talk to people. If you're listening to this and you're interested in talking to me and joining the Privacy Champions Program, and you want to learn more about how you or your organization can benefit from it, please feel free to reach out to me on LinkedIn. Also, if you just want to talk about privacy and you feel like there are some burning privacy needs in your organization - where you've looked through multiple solutions but nobody is solving them - I would love to know what those things are. Come talk to me. Feel free to reach out to me and I would love to jump on a call. Another call to action will be...

Nabanita De:

I also have a newsletter on privacy. It's on LinkedIn and it's free. It's called 'Nabanita's Moonshots.' Feel free to subscribe to it. I try to share my tidbits around privacy, around consumer rights, around how you can safeguard sensitive information, principles for delete laws, and a bunch of legal, technical, and GRC - multiple sides of privacy. So, feel free to give it a follow.

Debra J Farber:

That's pretty awesome, and so I'm going to put all of those links in the Show Notes so that everyone can access them, and I wish you good luck on the rest of your journey here. I'll definitely be following it. I plan to join the Privacy Champions Program, so, you know, I look forward to being part of the journey with you.

Nabanita De:

Yeah, Debra, you're already a Privacy Champion. I feel like you're doing such cool work, like this podcast itself. You bring on such great people and I always learn so much every time I hear a new episode from your podcast. So, I am truly honored to be on this podcast and share my journey with you and looking forward to all of the wonderful stuff that you will continue to do in this space.

Debra J Farber:

I really appreciate that. Thank you, and you know I definitely want to have you back on in the future to check in and see all the great progress that you've made.

Nabanita De:

Thank you so much and thank you everybody for listening to this.

Debra J Farber:

All right, well, Nabanita, thank you so much for joining us today on The Shifting Privacy Left Podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of the show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.

Nabanita describes her career path
Nabanita tells us about Privacy License, its mission, and future plans for the org
What it was like for Nabanita to transition from working as a Privacy Engineering Manager to privacy / AI startup Founder
How PrivacyGPT works in practice, with discussion of its architecture
Nabanita describes some of the NLP techniques that she leveraged to build PrivacyGPT, including: regular expressions, part-of-speech tagging, and named entity recognizers
