The Shifting Privacy Left Podcast

S3E4: 'Supporting Developer Accountability for Privacy' with Jake Ward (Data Protocol)

February 13, 2024 Debra J Farber / Jake Ward Season 3 Episode 4

This week, I chat with Jake Ward, the Co-Founder and CEO of Data Protocol, to discuss how the Data Protocol platform supports developers' accountability for privacy by giving developers the relevant information in the way that they want it. Throughout the episode, we cover the Privacy Engineering course offerings and certification program; how to improve communication with developers; and trends that Jake sees across his customers after 2 years of offering these courses to engineers.

In our conversation, we dive into the topics covered in the Privacy Engineering Certification Program course offering, led by instructor Nishant Bhajaria, and the impact that engineers can make in their organization after completing it. Jake shares why he's so passionate about empowering developers, enabling them to build safer products. We talk about the effects of privacy engineering on large tech companies and how to bridge the gap between developers and the support they need with collaboration and accountability. Plus, Jake reflects on his own career path as the Press Secretary for a U.S. Senator and the experiences that shaped his perspectives and brought him to where he is now.

Topics Covered

  • Jake’s career journey and why he landed on supporting software developers 
  • How Jake built Data Protocol and its community 
  • What 'shifting privacy left' means to Jake
  • Data Protocol's Privacy Engineering Courses, Labs, & Certification Program and what developers will take away
  • The difference between Data Protocol's free Privacy Courses and paid Certification
  • Feedback from customers and trends observed
  • Whether tech companies have seen improvement in engineers' ability to embed privacy into the development of products & services after completing the Privacy Engineering courses and labs 
  • Other privacy-related courses available on Data Protocol, and privacy courses on the roadmap
  • Ways to leverage communications to surmount current challenges
  • How organizations can make their developers accountable for privacy, and the importance of aligning responsibility, accountability & business processes
  • How Debra would operationalize this accountability into an organization
  • How you can use the PrivacyCode.ai privacy tech platform to enable the operationalization of privacy accountability for developers




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

TRU Staffing Partners
Top privacy talent - when you need it, where you need it.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.

Jake Ward:

The idea of 'getting ahead of the curve' always really appealed to me. I think it, honestly, appeals to developers as well, because nothing's worse than getting to the end of a process and having somebody say, "I see, you did this; you can't do that; go fix it." They'd much rather understand the constraints ahead of time and work around them. Solving problems is what developers are fundamentally there to do. They have to predict the future with every line of code. It's much easier to know where you can't go when you start that process.

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans. . . and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Welcome everyone to The Shifting Privacy Left Podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest, Jake Ward, Co-founder and CEO at Data Protocol, a developer support platform. Data Protocol enables platforms to get more out of their developer partners and their technical workforce, while giving developers the information that they need the way that they want it. Today, we're going to discuss: Data Protocol's privacy engineering courses and certification; how to deliver the shift left message for privacy so that it resonates with developers; and then, industry trends that he's seeing in the space. Jake, welcome.

Jake Ward:

Debra, it's a pleasure. Thanks for having me.

Debra J Farber:

Absolutely. So, first, I really want to discuss your background, because I find it fascinating. You started out in communications, even working as a Press Secretary for a U.S. Senator, before stepping into the role of President and CEO at The Application Developers Alliance; and now you're focused on developer education and developer support at Data Protocol. Tell us a little bit about your career journey and how you ended up focused on enabling software developers.

Jake Ward:

Yeah, my career has been a long and windy road, Debra. I had the good fortune of my childhood dreams and ambitions coming true - wanting to be on the cast of The West Wing and work in politics and with policymakers. I worked in the House of Representatives and the U.S. Senate for a while. I also had the incredibly good fortune of working in the Senate for Olympia Snowe of Maine during the initial Net Neutrality fight in 2006, 2007, when all of the issues around the future of the Internet came to pass, sort of in an esoteric way that only people in DC would care about. But, the long and the short of it is that it was the first time that the Valley cared and understood that Washington mattered; and it was the first time that Washington understood that the technology companies and the Valley mattered. I had the opportunity to have a front row seat for that. So, when I left The Hill in 2007, I worked with a lot of the now very large technology companies, as well as some of the smaller ones, to deal with public affairs issues around privacy and patents and other issues of the day.

Jake Ward:

That ultimately culminated in founding The Application Developers Alliance, the first trade association for software developers, in 2012. This was an incredibly educational experience and brought me into the orbit of some real luminaries in the space: Joel Spolsky and Brad Feld and Don Dodge and folks like that. I was dealing with Linda Smith from Twilio, back when Twilio was really up and coming. That experience shaped my worldview of, and particularly my relationship to, the developer workforce, and the idea that I've been working towards since then: that the men and women that build software, that design software, are largely the manufacturing class of the digital age, and that it is incumbent upon the rest of us to understand and support that workforce if we're going to live in the world they're going to build for us.

Debra J Farber:

Wow, that is a fascinating career! I'm sure you have so many stories. I'm not sure if they'd be like West Wing style stories.

Jake Ward:

They're more like Veep. People would ask all the time, "Is it more like House of Cards or The West Wing?" And I would say it's more like Veep.

Debra J Farber:

Oh, that's funny; so, almost like a comedy of errors.

Jake Ward:

So, what happens when you give 24-year-olds the keys to the country?

Debra J Farber:

It's true - you've done a lot on the Hill, so there are many interns. I lived in DC for five years, so I know many interns and young people getting into politics.

Jake Ward:

Ill-fitting suits and low salaries.

Debra J Farber:

Yeah, very much so. Okay, so tell us a little bit about the Data Protocol platform. What have you built?

Jake Ward:

In 2017, when I left The Application Developers Alliance, sort of in the wake of Cambridge Analytica, I spent a couple of years traveling and meeting with software developers - working with publishers of mobile apps, as well as front-end developers and systems engineers - to ask them one overarching question: "What do you need that you're not getting?" Or, "How would you prefer to get the information you need to build cool stuff faster but with less risk?" At the end of this sojourn of a couple of years, what I had was, I believed, a roadmap to giving information to this growing workforce of 30 million or so - information, education, support - in a better way. They said, "I want video first." They said, "I want to get it faster." They said, "I want to be treated like I'm Bank of America or Air France, like I matter to a large platform."

Jake Ward:

I built Data Protocol with an incredible team and with empathy at the core of the interface, which any developer who logs onto our platform will recognize immediately - it's built for them. There's a CLI component to it; there's a hands-on-keyboard vibe to every course, every short code, every resource; and we pack as much information and support material into a very short period of time, with the guidance of an instructional design team here in-house, so that we can get people back to doing what they came to do, which is build really cool products. The end result, I believe, is a platform that can give developers answers while giving the companies that depend on them, that invest in them, that care about them, a bridge to communicate, support, and largely enable them to go faster with less risk.

Debra J Farber:

That's really cool. Tell us a little bit about how the platform works and how your client base is formed. I believe you're a nonprofit; is that correct?

Jake Ward:

No, no, we are very much for profit.

Debra J Farber:

You're a for-profit; how does your business model work?

Jake Ward:

Anybody who would like to help with that profitability should call me immediately. We work with partners of all sizes, from Meta to Slack and Intel to the smallest. We have content coming up next week with Circle, the financial services company. We work with them to identify a problem for their developers: do you want to help them navigate a privacy assessment, or do you want to talk to them about programmable wallets?

Jake Ward:

We build content, so we will be responsible for writing all of the scripts, shooting all of the video, animating the content, hosting it on our platform, putting in prefilled notes, writing assessments, questions, knowledge checks - all of those pieces - so that a partner can simply put a link to their channel on our platform in all of their dev docs, emails, website, et cetera. Send them over; it's a single point of login. They get all of that information: the resources, the video, the data. As a partner, you get access to attributable metrics as well as aggregate: "How is my community behaving? Do they like this information? Was this useful? What should I change about my product or my process to improve value to that developer community?"

Jake Ward:

The end result is tens of thousands of developers on dataprotocol.com using the courses and resources and guides and documents every day. Meanwhile, those partner companies are able to extend their existing developer programs. There's nobody that publishes an API or has an SDK or generally relies on developers that hasn't invested in developer documents, email communication, events, programming, etc. We fit right inside that to extend and deepen the developer experience on behalf of those partners and give developers what they need faster, in a format that is more useful to them.

Debra J Farber:

That is really compelling. I know that you provide these courses. Most of these courses are free. Are all of them free?

Jake Ward:

All of them are free. All of the courses are free.

Debra J Farber:

What's exciting to me is you're finding out what one company in particular might need. They're just like, "We need this support for our developers on these issues." After they pay you to develop the course, the course is free for anybody. I love that it's building the community outside of the organization, too. There's something to plug into. Well, I guess it benefits that company because, as they're hiring, they could say, "Hey, have you taken this course?" It's available to them. You don't have to first be an employee or anything like that. The knowledge is shared. I just love that about community building.

Jake Ward:

That's right. We do. There's a lot of content on the platform that is even product-agnostic. We did five tips to compliance with GDPR. We didn't build that with a partner; we built it with subject matter experts in order to support the community at large. We do a lot of internal training for some very large companies, particularly around privacy engineering, which I know we'll talk about in a second. Our real bread and butter is with companies who are either managing very large third-party developer programs - so, people that build on their tools - or have a developer user base: customers, people that are paying for a service, use of an API, use of a metrics dashboard, et cetera. We help be the on-demand, scalable partner management arm of that company.

Debra J Farber:

Got it. That makes sense. What does the concept of 'shifting privacy left' mean to you?

Jake Ward:

I think that my passion point around shifting privacy left is a little different than what's conventionally agreed upon among privacy professionals, because I come to it from a developer bent; but I really like the idea and the metaphor of pushing accountability and responsibility further to the left, into the arms of the people building the products - that if you can align the responsibility, accountability, and decision-making with the people whose hands are on the keys designing the product and building it, and they take responsibility for it, you're going to have a safer product. I like to occasionally tell the story of the Roman bridge makers who, after they were commissioned to deliver a bridge, would be forced to sleep under it with their family while the first legion of the Roman army marched over their heads on that bridge. Wow - the idea of sleeping under your product: Did I do this well? Will this be secure? Putting your name on it wasn't enough. If we could ask all developers to sleep under their code, to be responsible and put their name on it - not just their company's name, their name - how much better would products get? How much time would be dedicated to the front-end design? How many privacy by design principles would be incorporated earlier on with privacy engineering?

Jake Ward:

Marketers are always in the room. Why can't we have a couple more lawyers? Why can't we have some privacy experts? The idea of getting ahead of the curve always really appealed to me. I think it honestly appeals to developers as well, because nothing's worse than getting to the end of a process and having somebody say, "I see, you did this; you can't do that; go fix it." They'd much rather understand the constraints ahead of time and work around them. Solving problems is what developers are fundamentally there to do. They have to predict the future with every line of code. It's much easier to know where you can't go when you start that process.

Debra J Farber:

Absolutely. I think your definition actually really aligns with my definition. When I explain what shifting left means, I usually talk about how you want to address privacy problems and prevent them earlier on - shifting from a mindset focused just on when data is collected and the data life cycle through its destruction, into the software and product development life cycle, when you're building the products and systems that personal data will be housed in - and how, if you do that, you can prevent a lot of the downstream compliance problems. You build it right the first time. It makes the most sense, but I absolutely agree. In fact, I'm probably going to name this episode something like 'Shifting Accountability Left for Developers,' or something like that, because you're absolutely right. It's just another emphasis on an outcome of shifting left: if you make the developers not only responsible, but also accountable for their own code and it being safe - and that can mean other things beyond privacy - then you will have a safer product. The output is going to be safer. The process, if you have a good framework, is going to make sure of that. It behooves anybody to do that in any organization. I want to make 'sleep under the bridge you create' the new 'eating your own dog food.' Let's make that a thing. I absolutely love this story and I'm going to start retelling it as I continue my advocacy in this space. Thanks for that. [Jake: You're welcome.]

Debra J Farber:

Let's talk about Data Protocol's Privacy Engineering courses and the certification that you can opt into, which is led by renowned privacy engineering instructor Nishant Bhajaria. For the audience: I included a mention of this course two episodes ago, in my 50th episode, "My Top 20 Privacy Engineering Resources for 2024." It definitely belongs in that top 20. I'm just excited to dig deeper here and have you pull out, Jake, the reasons why this course is so helpful for supporting developers. First, tell us a little bit about the course, the lab component, and the certification, and then what developers will come away learning.

Jake Ward:

We started this... We were incredibly fortunate to work with Nishant in the earliest days of Data Protocol. This was our first big-tent moment. I joke frequently that the platform is so good we could even teach privacy. When we set out to do privacy engineering, it was perfect timing because Nishant's book was just coming out and the content was very fresh. He's incredible on camera and, obviously, a compelling figure. We built an eight-course, six-lab curriculum around privacy engineering, and there's also a fairly substantial final assessment, at the end of which you are a Certified Data Protocol Privacy Engineer.

Jake Ward:

We had one goal when we began this process, which was to make privacy engineering publicly accessible - to democratize it. Certainly, you could go to Carnegie Mellon, or you can be out in the world and get the work-life experience much like Nishant has done, but that takes years or costs lots of money. We wanted to build a bridge between engineers and lawyers and privacy professionals, to speak that language as best we could, and deliver it in a format that was digestible and operational. I frequently say that there are two types of smart people: those who can make simple things sound complex and those who can make complex things sound simple. Nishant is certainly the latter, and he operationalizes all the privacy engineering principles throughout these courses, throughout his book, and we were delighted to bring that to life.

Jake Ward:

The key to the platform, as far as I'm concerned, is how compelling it can make any content. It's a lot easier when the content is also pretty compelling, but with hands on keys, particularly in a lab setting, we can keep people's attention. The retention goes way up. Proficiency goes way up. Our passage numbers are significantly higher than the industry standard, and it's not because it's easy - the certification is very hard. The key to the experience, though, is that it is step-by-step, each piece builds on the last, and it is operational, so that you'll remember how to take it out into the world, as my kids talk about at school. It's project-based, so you're not learning it by rote; you're learning it so you know how to use it.

Debra J Farber:

That's awesome. As someone who's neurodiverse - specifically, ADHD - I sometimes find it's like, what is my next step? There's all this stuff and this chaos. What's the first step to take? So, when I come across materials that can basically say, "Here's the first step, then do this; this is the order that makes the most sense," suddenly everything becomes much clearer. It's not that I can't figure these things out; it just might be overwhelming at first. So, I really appreciate that that's inherent in how you designed the coursework, because that's how I learn very well, so it's great. How long have the course and the certification been available now? Oh, and also, what will developers come away learning? I'm not sure if you mentioned that.

Jake Ward:

Developers, lawyers, and privacy professionals generally will come away understanding sort of the keys to categorization, to anonymization, and to the way that you can integrate privacy engineering principles into a system - so, the architecture components of it. The idea was to build a curriculum that was comprehensive enough that you could start on Nishant's team after taking it, right - that you could be ready to at least understand the vernacular and the concepts and be part of that universe. I like to think that we achieved that goal; certainly, as far as I'm concerned, it's the best privacy engineering program out there. But there are also many other resources that are going to stand the test of time. I am thrilled that there are more and more of them.

Jake Ward:

The idea that we're competing is pretty silly; I just want more and more people to understand that these principles are out there, that you should understand them, and that they should be adopted and implemented and widely available in the wild for as little money as humanly possible. The program, I think, launched in June two years ago, so we're going to come up on two years in June that these have been around. Numbers-wise is really how we take a look at it. We're, I think, well over a thousand certifications - a thousand badges - now, as well as 4,300 hours of content consumed since we launched, which is pretty good. It's an interesting way of keeping score on the engagement level of things.

Debra J Farber:

Yeah, so you mentioned certifications, and I don't think we made a good distinction between the process of taking the courses and the separate process of opting into certification, and what that means. Do you mind disambiguating that?

Jake Ward:

No, no, not at all. As I said, there are eight courses and six labs. They run in sequential order. You can't skip ahead. You can't get out of order. That path is locked. At the end of each section there is a badge - a relatively short assessment to achieve that badge - and then you move on. The courses are not particularly long. I think the longest one is 30 minutes; the shortest one is probably three or four - it's just an intro component at the front end. And then, there is a final assessment.

Jake Ward:

Now, all of the content, all the courses, all the guides are free. We do charge for the certification assessment because, frankly, we have to pay for it, so we ask people to pay for it as well. You can take that assessment a couple of times, but it's going to take a while. It's probably an hour-long, 50-minute-long assessment. I think it's 100 questions, and the idea is to make you really earn it - that if you're going to get this certification (and it's issued by Credly; it lives on your LinkedIn; you can put it on your resume), we want to make it count.

Debra J Farber:

So, you basically reflect back the learnings in a... It's not just that you took a course; you can demonstrate that you understand it.

Jake Ward:

Yeah, I mean, it's open book. The resources and notes are there, and that's the way the platform is set up. You have access to all of that stuff, but you need to demonstrate that you have the proficiency to implement it in a very operational way. It's scenario-based, which I think is incredibly important and a much better way to evaluate somebody's comprehension and retention of information, particularly with the complexity around this. Now, you asked me earlier, "Can a lawyer take this? Can a privacy expert take this?" Absolutely. There are hands-on labs. They are more technical, but you're not going to need to go learn JavaScript in order to participate and get certified.

Debra J Farber:

Excellent, that's great to know. And then, what trends have you been seeing since it was offered? What feedback have you gotten on... not so much the course - I'm not asking whether people like it - but just overall: what are thoughts and feedback around what can be improved? What gaps of knowledge do people still have? What trends are you seeing overall?

Jake Ward:

It's actually really interesting, because we do ask people, "Do you like it?" - partly because we want to continue to iterate and improve the platform and the features and functions that we offer each user, but also because it's a matter of deciding which content to invest in moving forward.

Jake Ward:

People universally really enjoy this content. Anybody who's made it through at least two of the courses has near universal approval of the format, the function, the operationalization, and why they're there in the first place. We even ask questions like, "Is this what you thought it would be?" Some people say no, but then they're delighted with the result. The trends are actually sort of fascinating.

Jake Ward:

The vast majority of content consumed is on weekends, which tells us that these are professionals who are setting aside time out of their lives to go take these courses so that they are better prepared to do their day jobs. I think that is - I'm reading a little bit between the lines, but I think it's a really good sign for the future of privacy and for the industry writ large that people are giving of their own time in this way. This is not corporate training. This is something else. This is the ability to improve your knowledge base as well as your abilities at work. I think that's really interesting. Since we met - what, two years ago? - we have seen a huge increase in privacy engineering as a topic and in available curriculum, resources, and guides. I mean, as your Top 20 list indicates, this is coming. That's good news for everybody.

Debra J Farber:

Excellent. That's really great to hear. Just for the audience: I had met Jake maybe about two years ago, shortly after this course was made available, at The Rise of Privacy Tech's very first in-person event in Silicon Valley. It's just amazing to catch up with you now and see how impactful the work has been. It's a lot of large tech companies that you work with, which send their engineers to take these courses at Data Protocol. To what effect have they seen improvement in their ability to embed privacy into the development of products and services within their orgs?

Jake Ward:

For the companies who are sending their employees, their technical teams, through the privacy engineering courses and certifications...

Jake Ward:

We have gotten, and continue to get, really good feedback that everybody's on the same page, everybody can speak the same language now, that decisions are made more quickly and in the right direction more frequently. I don't want to name names, but some very large companies that you wouldn't necessarily have thought of as A) digital or B) particularly concerned with privacy now feel like they have their arms fully around the systems that they need to have in place to protect both their existing data and the future data that they'll be collecting. We also recently launched a DSARs curriculum and badge that are often gobbled up by people that are also taking the privacy engineering curriculum. So, if you're a partner and you send 43 of your engineers through the privacy engineering curriculum, there's a pretty good chance that at least half of them are going to opt into the DSARs courses, which are not required - I mean, they're just there, they're just available - but these folks are looking to continue that education. And again, that's a great sign for that workforce, that company, and for the industry at large.

Debra J Farber:

Absolutely. You mentioned you have the DSARs courses. Do you have other privacy-related courses, or any in the pipeline?

Jake Ward:

We do. We have a number of privacy-related courses that we put out sort of in tandem with the privacy engineering curriculum, the idea being that when people are here, they're going to want to touch other privacy-related courses. Our very first course that we launched with was a Privacy by Design Principles 101: what does it mean, how do we use it? We have, as I mentioned, the GDPR course. We have some other components around building apps for kids and the related privacy restrictions. We have several pieces of content that are directly related to designing for privacy, engineering for security, thinking about data storage, encrypting end-to-end - courses like that, that are sort of soft-skilly, right? Like thinking about it from an architectural design standpoint, rather than "use this cloud and plug it in this way and this is the tool you want to use." It's more like, "Choose the tool that fits your need; here's how it works." And that content is all product-agnostic, platform-agnostic, and delivered by subject matter experts. It's really good stuff.

Debra J Farber:

That's great. I have to check some of those out.

Jake Ward:

I'd love to get your feedback.

Debra J Farber:

Yeah, yeah. I think I'm going to do a little binging this weekend. Jake, while I have you here as a comms expert working with developers, I'd love to hear some thoughts on what are some better ways for us, generally, in organizations to support developers in being able to message up the challenges around privacy that they come across. Are they not heard? Do they not know where to go? Like, where are some of the gaps that you see in industry right now - gaps that developers have challenges surmounting - where communications or other tools could be helpful?

Jake Ward:

It makes a ton of sense, and I also think it's the right question. First and foremost, the most persistent communication-related challenge I've seen over the last 12 years is that there is a fundamental misunderstanding or misconception, I guess, that developers don't care about privacy - that they don't care about the idea, the concept of privacy - and that's just not true. The people that build, whether it be large software packages or small mobile apps or software in general, care as much about privacy as anybody else, and often more. Right? They are users just like everybody else.

Jake Ward:

The idea that anybody would want to build something that has a propensity to break, or that would eventually run into a brick wall of legal restrictions or regulatory compromise, is ill-informed. Developers care; they care a lot. What they frequently don't have is the right support while being pressured to go faster. Product teams, engineering teams, even the startup ecosystem, right? Everything's about speed to market. Can you get it done? Will this work?

Jake Ward:

And the integrity of the product, from a failure standpoint, is largely about the engineering, not the use of data. The reason this company is called Data Protocol is because it's about the rules that govern data and how everything that we do in this digital ecosystem has to come back to that. Whether it's about building products that work, or building products that are compliant, or building products that simply improve the way that systems talk to each other, there has to be an alignment around the protocol related to data. We're helping to bridge that divide every day, every chance we get, so that developers who care about privacy, quality, and speed have the information they need. But I would start by assuming that developers want to do the right thing for the right reason and go from there.

Debra J Farber:

I think that's really important to reiterate, because even I'm guilty of thinking in those terms sometimes. But it's really coming from the pressure of executives at startups that have maybe borrowed a crap ton of money from investors and now feel like they have to hit the market hard. Or, if it's not startups, it's large companies that have investors and need to meet certain metrics, and that's where the 'move fast' and the pressure to go faster come from, while you don't necessarily have all of the requirements - even for an MVP - that would make the product safe, like privacy and security and ethics generally. That makes a lot of sense. It's not the developers themselves that don't want to address privacy; a lot of it's just been...

Debra J Farber:

I've heard a lot of developers in the past - and I've been doing this a long time, so I'm not calling it from recent experience - but a lot in the past just say that it's not important enough to interrupt their flow. That was really because they didn't understand what it really meant and how large the field of privacy is. I think, again, education was super important. The field's gotten smarter about it, and it sounds like the support you're providing can really enable them to maybe still move fast, but with the right support. What I'm curious about, though - because we've talked about the importance of now making them accountable - what are the ways that companies can make their developers accountable?

Jake Ward:

I think, by making the assumption that they, too, care about privacy, and having them be part of the process from beginning to end, much like privacy professionals have asked to be part of the design process from beginning to end. The culmination of that is teams, departments, full companies that are more bought into not just the practicality of checking a box, but the potential of achieving success. Yes, it is compliant, but it also has a market differentiator of being privacy-first. Yes, it is a great product, but it also won't break. Yes, we got to market first or fastest, but we also can maintain our position as a market leader because we can market as privacy-first, secure, and here for our users. We've seen that in recent years. Apple's move into being a privacy company was a masterclass in marketing. They didn't do it differently; they just attacked the other players in the space and differentiated themselves from everybody else from a marketing perspective. That's not going away. People care about that when they're thinking about the products they're going to buy. So should the people who are building those products.

Debra J Farber:

Absolutely. You're right, it is a masterclass in marketing, because there are some things where they were collecting data just like all the other big tech companies; but they did, I think, differentiate how they architected their hardware - the way they've locked down the hardware of the phones and their computers - so in some ways they have been privacy-first to begin with, while in other ways the data they've collected has been just the same as the big tech companies. When there was a snafu in recent years, because they had built up so much trust with their amazing marketing campaign that differentiated them by saying how much they care about privacy, it didn't impact them much. The goodwill was just there. Their stock wasn't impacted by a major snafu. So, it's a great comms marketing approach.

Debra J Farber:

But you're right, this approach with the developers will then support the whole company's messaging capability to really put the company in the best privacy-focused light. So, from what you discussed with me, it sounded like you want to make them not just partially responsible; accountability is like, you own this. Who do we blame if this is wrong? And still, when something privacy-related goes wrong, people will blame a privacy person. Or, if there's a breach, maybe they'll blame not having the right security. There's not necessarily a "maybe you should have threat modeled better." How do we get the developers to actually be accountable for the privacy harms they cause? Is there a way to psychologically tie them to metrics where they'll want to make sure that they don't do certain things? They could be moving fast and want to do all the right things, but still have overlooked a few things, or maybe didn't use the right design pattern, or maybe accidentally made a mistake, and now data is exposed.

Jake Ward:

Again, the idea for me of shifting left is to align accountability with responsibility. Then, if you're pushing that left so that the developers are part of that process, what you've now done is create an internal accountability, so that the incentive to not let it break, to not let it be a mistake, to not let it do harm is already there. But what happens too often is, once developers are no longer part of that process, it goes to the lawyers, it goes to the privacy professionals, and they say, "Look, it's not my fault. I did everything that I was supposed to do and then I handed it to you and you didn't fix it. You didn't create the structure around it, the rules around it; the pieces weren't in place to guide me through that process." But if everybody is on the same page and that process is built in a collaborative way, everybody's accountable, and that's the goal.

Jake Ward:

The goal is not to put everybody in different silos, but to put them in swim lanes moving in the same direction. Developers build to achieve an end. One of those ends is privacy protection. Great - tell me what that means. Tell me what protection you need me to put in place. Tell me what the design pattern needs to look like, so that we can be assured that we can still go fast. Mistakes are made out of ignorance and desperation, not out of any sort of ill intent. If we can eliminate both of those things - the time pressure and/or the lack of support - you are aligning responsibility, and in doing so, that internal accountability takes care of itself.

Debra J Farber:

Listening to you talk, the way I would operationalize this in an organization would be to, obviously, discuss with everybody and get to: who owns what, what is the process, and what should that look like? But then document that in some sort of governance and accountability policy, and then an SOP. What I'm hearing is, I would make sure that there are privacy requirements that must be listed in every single product development set of requirements. [Jake: Absolutely.] I would make sure that the developers had testing criteria tied to those privacy requirements, and that maybe different parts of the business - like the operational CPO parts of the business, as well as, if they wanted, Privacy Counsel - could also look at it. They don't have to have their eyes on this as constantly as operations, but they would be able to see what's coming up in the next sprints and what is on the product roadmap.

Debra J Farber:

This is something that is not optional. They're getting these updates; they have to know and be accountable for reading them and understanding how the product development and software development are moving towards goals. And then document what those end goals are that we're all striving toward, so that we are in those swim lanes moving in the same direction, because you want to align all of those requirements across the business, across legal, across software development and security - and you'll want to align them across other business areas as well. It shouldn't be, "Are there privacy requirements for this?" It should literally be that anytime personal data is being used, product managers know to ask, "What are the privacy requirements for this?" - and you're working across the business so that it really is built into the process and tech stacks and coding and design requirements. All of that should be aligned to whatever those end goals are.

Jake Ward:

I think that's a hundred percent right, Debra. I would also add that those policies and processes should not be handed down from the policy team, from the privacy team, to the developers. They should be built in collaboration with the developers, so that it is more readily operational and you have buy-in from the outset.

Jake Ward:

The idea that developers build and lawyers lawyer is a problem. If you put those two people together and say, "Here's the standard we have to meet; do you have best practices from a development standpoint that you would put in place to achieve that?" - great, standardize it. Do you have best-in-class standards that you would put in place if you had time, if you had more resources? Is there a better way to do this? Great, that's your gold star. Then you're creating growth for your organization, but you're also putting privacy experts, legal experts, and engineering experts in the same room to pull out of them a best path forward for all of those organizational successes - which is, not coincidentally, what a privacy engineer does: pull all those things together and put a stamp on it.

Debra J Farber:

Absolutely. That's why I think operational people are so essential in privacy. It's not just about the expensive lawyers and engineers that do their thing. It's also about those who implement it in the business and look across business processes - which, hello, is how information flows through business processes and systems. This is all coming from a post I saw recently, so I want to put it out there: I don't think it's as successful in a business to have lawyers talk to engineers by themselves and then think they should just go implement stuff. There's a lot involved in the implementation of policies and procedures, in the business requirements, separate from engineering. In a mid-sized, large, or enterprise organization, you really need people to make this happen and own privacy in the organization. For me, that is not sitting in legal or sitting in engineering - but you don't have to have an opinion on that unless you want to.

Jake Ward:

I think that's all right.

Debra J Farber:

Yeah, I'm just going to put in a plug for PrivacyCode, which Michelle Dennedy and Kristy Edwards co-founded. I think a lot of the work that they are building out in their platform is the tooling they wish they had when they were working in privacy. Michelle Dennedy was the CPO at Cisco, among other amazing accolades. Kristy Edwards is her technical co-founder, who has worked at Splunk and been an executive-level developer for many years.

Debra J Farber:

These are the tools that they wish they had when they were in those operational roles. So, I want to put a plug in for them because a lot of what you just talked about is how you effectuate that alignment, not just the education. You've got a lot of that support for developers - how they can learn and how they can implement - but for the actual alignment in the business, PrivacyCode is working on libraries for privacy tasks and for different patterns, and on how you align the values of the organization with the privacy requirements and what that means for different areas of the business and such.

Jake Ward:

Michelle was an original Advisor to Data Protocol. We're huge fans and she and Kristy are doing an incredible job with PrivacyCode.

Debra J Farber:

Amazing. Do you have any words of wisdom to leave the audience with before we close?

Jake Ward:

I'm hopeful that many of your listeners will log on and start taking some of the privacy engineering courses. Again, they're all free. I'd love feedback from anybody about how they are being received, how useful they are, things that we can do to continuously update the content and to improve the experience.

Debra J Farber:

Excellent. Well, Jake, thank you so much for joining us today on The Shifting Privacy Left Podcast. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of the show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.

Jake's origin story, including his experience on Capitol Hill as a Press Secretary for a U.S. Senator
The Data Protocol platform, which offers educational support for developers
What 'Shifting Privacy Left' means to Jake
Data Protocol's Privacy Engineering Courses & Certification Program
What developers will learn from taking Data Protocol's Privacy Engineering Courses
The difference between Data Protocol's free Privacy Courses and paid Certification
Feedback on the course and trends observed
Whether tech companies have seen improvement in engineers' ability to embed privacy into the development of products & services
Jake discusses other available privacy-related courses, and courses that are on the roadmap and ways to leverage communications to surmount current challenges
How organizations can make their developers accountable for privacy
Debra describes how she would operationalize this accountability into an organization and Jake shares his thoughts
Debra highlights PrivacyCode.ai, a privacy tech platform that enables the operationalization of privacy accountability for developers, aligning to other areas of the business
