The Shifting Privacy Left Podcast

S2E21: Containing Big Tech, Federal Privacy Law, & Investing in Privacy Tech with Tom Kemp (Kemp Au Ventures)

July 11, 2023 Debra J. Farber / Tom Kemp Season 2 Episode 21

This week’s guest is Tom Kemp: author; entrepreneur; former Co-Founder & CEO of Centrify (now called Delinea), a leading cybersecurity cloud provider; and a Silicon Valley-based Seed Investor and Policy Advisor. Tom led the campaign marketing efforts in 2020 to pass California Proposition 24, the California Privacy Rights Act (CPRA), and is currently co-authoring the California Delete Act bill.

In this conversation, we discuss chapters within Tom’s new book, Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy; how Big Tech is using AI to feed into the attention economy; what should go into a U.S. federal privacy law and how it should be enforced; and a comprehensive look at some of Tom’s privacy tech investments.

Topics Covered:

  • Tom's new book, Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy
  • How and why Tom’s book is centered around data collection, artificial intelligence, and competition
  • U.S. state privacy legislation that Tom helped get passed & what he's working on now, including: CPRA, the California Delete Act, & the Texas data broker registry
  • Whether there will ever be a U.S. federal, omnibus privacy law; what should be included in it; and how it should be enforced
  • Tom's work as a privacy tech and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest in a startup or not
  • What inspired Tom to invest in PrivacyCode, Secuvy & Privaini
  • Why team and market size are things Tom looks for when investing
  • The importance of designing for privacy from a 'user-interface perspective' so that it’s consumer-friendly
  • How consumers looking to trust companies are driving a shift-left movement
  • Tom's advice for how companies can better shift left in their orgs & within their business networks


Resources Mentioned:

Guest Info:


Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Tom Kemp:

The AI is using all that data being collected, and it's not only automating decisions; we've now seen the use of generative AI to create text, images, etc. But one of the ways that Big Tech is using it is to keep people on their products, to make their products more addictive, to get more attention. So we're obviously in the 'attention economy,' where we're competing for eyeballs.

Debra J Farber:

Welcome everyone to Shifting Privacy Left. I'm your host and resident 'privacy guru,' Debra J Farber. Today, I'm delighted to welcome my next guest, Tom Kemp. Tom is a Silicon Valley-based entrepreneur, investor, policy advisor, and now author. Tom was the Founder and CEO of Centrify (which is now called Delinea), a leading cybersecurity cloud provider that amassed over 2,000 enterprise customers, including over 60% of the Fortune 50. For his leadership, Tom was named by EY as a finalist for 'Entrepreneur of the Year' in Northern California. He is also an active Silicon Valley angel investor with seed investments in over a dozen tech startups. In addition, Tom has served as a Technology Policy Advisor for political campaigns and advisory groups, including leading the campaign marketing efforts in 2020 to pass California Proposition 24, otherwise known as the California Privacy Rights Act (CPRA), which amended the CCPA. He's co-authoring bills such as the California Delete Act of 2023. Welcome, Tom.

Tom Kemp:

Hey, great to be here.

Debra J Farber:

Excellent, I'm so glad you're here. You're doing so many interesting things out in the privacy tech market, working, like myself, at the broader, overarching levels rather than just on one initiative. But I'd love to first kick things off by diving right into your new book. It's about to be released. It's called "Containing Big Tech: How to Protect Our Civil Rights, Economy and Democracy." What motivated you to write the book, and who did you write it for?

Tom Kemp:

Well, thank you. First of all, I wanted to create a simple and comprehensive look at issues concerning Big Tech that your Uncle Larry or your average politician could read and say, "A-ha, I get it," without having to hand them 500 articles and say, "go at it." It actually works to Big Tech's advantage that there's confusion on these issues. I wanted to just give a simple, but comprehensive, look at what issues have arisen, not only as it relates to privacy and digital surveillance, but also persuasive technology and, of course, the big kahuna, artificial intelligence, and to really dive deeply into how the largest tech players are using AI, both in positive ways but also in potentially negative ways as well. Second, I also wanted to say that there are actually some simple solutions out there to many of these problems, which I provide: solutions both for consumers, your Uncle Larry, but also for policymakers as well. Then, finally, I wanted to explore some of the latest and greatest things that are happening.

Tom Kemp:

One of the big things is the Dobbs decision; we're now in a post-abortion-rights America, and that has some significant impact on privacy. But also, we've had the rise of TikTok, with its whole addictive, persuasive technologies to keep people hooked on it. Then, finally, AI. Obviously, people are looking to draft new laws to take AI into account, but I really wanted to give the readers a deep dive on what AI is, how AI can be used for good, how AI can be biased or exploitative, and how it relates to the collection of our information as well. Those were some of the motivations behind the book.

Debra J Farber:

I love it. I have an advance copy here, a galley copy, and I really like how accessible you've made everything. So, literally, my Uncle Larry could pick this up. I don't have an Uncle Larry, but I have an equivalent who could pick this up and easily understand the challenges that you identify, without too much technical jargon getting in the way. It's one of the things I really love about this book; it makes the potential problems really easy and clear to understand.

Tom Kemp:

Well, thank you. The interesting thing is, it's incredibly timely; every day there's a new headline, either about AI or about privacy, or there are new laws, etc. People need to keep up on it, not only Uncle Larry, but privacy experts such as yourself.

Debra J Farber:

Oh yeah, definitely. We're all working at such quick speeds these days, as technology advances and there are constantly new iterations, whether of AI or other new tech, that it's really hard even for privacy experts and security experts to know everything that's going on. So this is really helpful. I like how each chapter focuses on a particular area of concern. You've got 'digital surveillance,' 'data brokers,' 'artificial intelligence,' 'persuasive technology,' 'kids' online safety,' 'extremism and disinformation,' and 'competition.' If you don't mind, I'd like to go through each of those topics with you and just have you give a short synopsis of each of the privacy, surveillance, or safety harms. Does that work?

Tom Kemp:

Yeah, absolutely. Just to summarize the themes and put those chapters into context: I fundamentally believe that we have five tech companies that are monopolies, that are some of the most powerful corporations the world has ever seen, with amazing reach. We're talking about companies like Google. They have 4 billion users. That's half the Earth's population. So, we're talking about companies with incredible reach. I mean, GM took 100 years to sell hundreds of millions of cars, and within 20 years, Google has gotten to 4 billion users. And they're very much unregulated, and they're causing serious threats to our civil rights, our society, our democracy.

Tom Kemp:

So, if you take a look at the first three chapters (digital surveillance, data brokers, data breaches), that grouping really focuses on the data collection that's happening today. I fundamentally believe that our digital exhaust, our personal information, is being way over-collected, and it's now being weaponized in a post-abortion America; and that weaponization, I think, is going to extend to trans and LGBTQ America, etc. One thing we need to take into context is that there have been monopolies in the past, like Standard Oil. Standard Oil was incredibly powerful in the 1900s and 1910s, but they didn't know everything about us, and that's the big difference. We'll talk more about data brokers, but there's been identity theft associated with data brokers, fraud, and of course the ability to use that data to discriminate against people, etc.

Tom Kemp:

The second grouping of chapters really focuses on artificial intelligence. How is Big Tech using artificial intelligence? Take into account that AI is using all that data being collected, and it's not only automating decisions; we've now seen the use of generative AI to create text, images, etc. But one of the ways that Big Tech is using it is to keep people on their products, to make their products more addictive, to get more attention. So, we're obviously in the attention economy, where we're competing for eyeballs. I talk about how the use of AI has exacerbated problems in terms of extremism and disinformation, but I also really take a look at how AI is being trained on children and the risks associated with that.

Tom Kemp:

Then, finally, the last chapter is "competition." What I really talk about there is that these companies are monopolies. They really own some large, key digital markets. The issue is that their monopoly positions actually exacerbate the problems with digital surveillance, data over-collection, and AI exploitation and bias. So you can't look at these issues in isolation and not factor in the fact that these are monopolies; you've got to look at the antitrust aspect, because you're not going to get changes unless there's some sort of pressure for them to change their business practices. My goal with these topics in the book is not simply to say here are the problems. I actually want to provide solutions, both for consumers but also for policymakers as well. So I want to say that things can be done to actually make these problems better.

Debra J Farber:

That's actually a really good point. I'm asking you to frame the problems so that people go and buy your book, but there's plenty in the book about how to address these problems, your approach, and what you would do. That's a lot of the benefit of the book itself, so I definitely think people should go out and get it. When is it available to purchase?

Tom Kemp:

It's actually available now for pre-order, and the ship date is mid-August. It will be available not only in hardcover but also as an ebook, so you've got Kindle, etc. But it's also going to be an audiobook, so you can get it on one of your favorite audiobook providers as well. It's going to be on all types of media, and it's available in mid-August.

Debra J Farber:

Excellent. Well, I'll definitely put a link to the pre-order in the show notes so that people can easily find it. Now, Tom, I know you've been working with others to rein in Big Tech with proposed legislation. There's quite a bit that you've been working on, so tell us about what legislation you've helped get passed so far and what you're working on now.

Tom Kemp:

Great, thank you. Yeah, so look, it's very difficult to change things in Washington, DC. If you really look at where the activity has been happening, it's been happening in Europe: the "Brussels effect." Obviously, the EU was first with the GDPR, and now we see coming down the pipeline the Digital Services Act and the Digital Markets Act, and it looks like the Artificial Intelligence Act is going to pass. So, clearly, Europe is setting the standards. And then, in the U.S., it's historically been California. It's called the "California effect," and for good reason. California has led the nation on consumer protection; you can go back to auto emissions, etc. Specific to privacy, which I know is really of interest to your audience, California actually put into its Constitution in the early '70s that privacy is a fundamental right. The word privacy is not in the U.S. Constitution, but it is in the California Constitution, and California has led. For example, California was the first state to have a data breach notification law. And what happened, right in the GDPR timeframe, is

Tom Kemp:

that the California Consumer Privacy Act, or CCPA, was passed by the legislature. And frankly, they were kind of forced into it, because a gentleman by the name of Alastair Mactaggart was going to put it on the ballot (and he was the person who wrote it). So, it passed, unanimously. It passed in 2018 and went into effect in 2020. But what Alastair immediately saw was that the industry was trying to water it down, so he decided he would upgrade the CCPA with the CPRA (the California Privacy Rights Act), and he knew that he could not get it through the legislature. So, he did a ballot initiative, which became Prop 24 in the 2020 campaign, and that's where I hooked up with Alastair and worked on the campaign full-time as a volunteer for six months. I was basically the Chief Marketing Officer, and I'm very proud of the fact that it passed. It got over 9 million votes. I think that tells the audience that there's a significant appetite among consumers for more privacy. And I think one of the most significant things that was added was that California now has a dedicated privacy agency, the California Privacy Protection Agency, which, when it gets fully staffed, will have more people in it than the FTC, the Federal Trade Commission, has for privacy.

Tom Kemp:

And then, lately, I've been working on the California Delete Act. I actually proposed this bill to my local State Senator, Josh Becker, and I co-drafted it and have been actively working on it. It involves data brokers. Just stepping back: for companies we directly interface with that collect our information, you can call that "first-party data." So, you can contact Walgreens or Walmart and say, "Hey, I'm a customer," and exercise your CPRA or CCPA data deletion rights, etc. But the problem is that there are entities called data brokers that we don't have a direct relationship with, and we don't know who they are. I'm going to call that third-party data, and they collect a lot of sensitive information, like our precise geolocation or medical conditions, etc. Now, California and Vermont have a registry, but the onus is on the consumer to go to each and every one and say "please delete me," and that could take hundreds of hours. So I thought it would be cool to have a single website where you can say, "Hey, go and delete my data." It's kind of the equivalent of the FTC Do Not Call Registry

Tom Kemp:

that, last year, Senator Ossoff (who's a Democrat) and Senator Cassidy (who's a Republican) proposed at the federal level as well. So, we kind of took that as the basis, and in the backs of our heads we thought: the FTC Do Not Call Registry has over 240 million Americans using it. It's incredibly popular. And there's some federal precedent. So, the current status is that the California Delete Act, Senate Bill 362, passed the California Senate, and it's now in the Assembly and needs to work its way through, etc. So, it's going to be a long and tough road.

Tom Kemp:

And then, the final thing is, I actually branched out beyond California. I worked on a Texas data broker registry bill that was signed like a week ago or so (that's 2105), and that actually makes Texas the third state to have a data broker registry law. Obviously, with the California Delete Act, we're trying to go well beyond just having data brokers register, and provide the portal to allow consumers to do a global delete of their information as well. So, yeah, very active; and really, where things are happening is at the state level. And I know your listeners know that last year we had five states. We're now up to 10, with Texas being the 10th. Oregon is probably going to sign. I think Delaware is out there. So, you know, it's the full employment act for privacy people, with 12 different state laws potentially in play by the end of the year.

Debra J Farber:

Yeah, that is quite remarkable. It kind of reminds me of the slow trickle of the data breach notification acts in each of the states. We couldn't get one federal data breach law on how you respond to a data breach. Right? If we can't get something as simple as a data breach law, and how to respond to a breach, passed as a federal law,

Debra J Farber:

do you think we will ever see a comprehensive U.S. privacy law? I'm really pessimistic that we'll ever see one, mostly because the reason we don't have one now has nothing to do with privacy. It has to do with: Who's covered by the law? Are you exempting the federal government from it? Do you have a right to sue? These are some of the issues that Republicans and Democrats can't seem to agree on in order to pass a federal law that would apply to all the states and render the state laws no longer viable. Then, the other thing is that state laws usually have the right to add more protections than a federal law and go beyond it, but it's possible for a federal law to prevent that as well. These are things that are being argued to death, and I just don't see it ever happening. What are your thoughts?

Tom Kemp:

Yeah, well, you did a brilliant job of describing the path of the data breach notification laws. California passed theirs, and it took like 17 years before the 50th state (I think it was Alabama) added it to their laws. And you're right: if you look at the 50 different data breach notification laws, they have different requirements, et cetera, and it's a complete patchwork. During the 2020 campaign, I was part of a policy group with one of the political campaigns, and in this working group I said, "Hey, let's add to the platform that we would have a national data breach notification law," and people were like, "I'm not so sure about that." And so that kind of didn't make it, et cetera. So, it's difficult, right? I mean, the last major piece of privacy legislation was what? HIPAA or Gramm-Leach-Bliley? And that was more sectoral, and that happened in the '90s.

Debra J Farber:

Yeah, that was before I even started in privacy. That was like forever ago.

Tom Kemp:

It was before Meta / Facebook formed. It was before the iPhone and the whole mobile revolution. We've had the complete growth of Big Tech occur in a completely unregulated manner. At the same time, over the last 30 to 40 years, we've had a loosening of antitrust enforcement and laws. These tech companies have been able to make 600-plus acquisitions without being questioned or stopped, et cetera.

Tom Kemp:

And so now we've woken up to a situation where people can collect our information in an unfettered manner. I think the best example of that is health care information. Yes, we have HIPAA, but that only applies to covered entities. But what if I decide to start a startup that's a health care app? Right? I can collect the information; as long as I put in my privacy notice that I will turn around and sell it, I can do anything with it. And that's a fundamental problem. But I am hoping that eventually it will get to the point where we hit 15 or 20 states that have privacy laws, and people will eventually have to say, "Hey, we finally need to get this across the goal line," et cetera.

Tom Kemp:

But there are a couple of key sticking points that you brought up, the first of which is the 'private right of action.' Right? That's always been a sticking point. Republicans don't want people to be able to sue, while Democrats are interested in that. I think you can actually find compromise; you can maybe limit it to identity theft or something. The other issue, of course, is 'preemption': Republicans want a federal law to basically represent the ceiling, while Democrats typically want it to be the floor and allow states to innovate. It will be more difficult to find a compromise there, although I have a few ideas about that; but nonetheless, that's kind of where we're at right now.

Debra J Farber:

Yeah. So, OK. If there were a comprehensive U.S. privacy law, you talk in the book about some of the ingredients of what should go into it. Do you mind sharing?

Tom Kemp:

Yeah, sure. Well, first and foremost, consumers should have privacy rights. Obviously, they're not spelled out in the U.S. Constitution, so we need to actually document them in legislation. I think the GDPR has always represented the 'gold standard,' and they do a very nice job. And the CCPA, I think, had about a 60% match to the GDPR's consumer privacy rights.

Tom Kemp:

And then the CPRA added a few; for example, the 'right to correct.' And the fundamental thing, at least in California, is that consumers should have the 'right to know' (i.e., what information is being collected), and they should have the 'right to say no to the sale of their information.' California even introduced the concept of 'sensitive personal information' as well. But I really think where this needs to go is AI. The Biden administration proposed an AI Bill of Rights. And if you look at both the GDPR and even the CPRA, the focus really has to do with the right to reject automated decision-making: that you can basically say, "I don't like that decision, because it was all done by algorithms, et cetera; you created a profile of me and you inferred information and you made the decision." Now, that's actually in the GDPR; in the CPRA, that needs to be designed as part of the regulation process, and the California Privacy Protection Agency has not yet written the actual regulations in that specific area.

Tom Kemp:

But the question is: is the GDPR, or even the CPRA with its focus on automated decision-making, out of date already, because we've now seen AI be used in a generative sense to create images and text? What rights do you have if people use your image, your text, your content, et cetera? So, I talk about some of the rights that people should have as it relates to AI, beyond the traditional privacy rights. They're kind of merging together. And I think, to use a Wayne Gretzky term, we need to "pass the puck to where the player is going to be" versus where the player is right now. We need to factor in AI and maybe broaden the definition of the 'right to reject automated decision-making' to include more of the generative uses of AI that could be used in a discriminatory manner as well. So that's the stuff on consumer privacy rights that should definitely be in a U.S. privacy law.

Debra J Farber:

Yeah, thanks for that. That's a pretty comprehensive list that really resembles the GDPR, as much as you can match it at the U.S. level. But what's always interesting is that we talk about things in terms of 'what is a right.' We have new rights, but what that does for organizations is create new obligations for the business. So, can you address what companies would need to focus on to comply with a comprehensive U.S. privacy law?

Tom Kemp:

I think the nature of your question is that it's great that people have consumer privacy rights, but there needs to be a set of corresponding business obligations: obligations such as data minimization and prohibiting discriminatory uses of data. Obviously, businesses need to respond to rights requests, and they should implement appropriate security measures. You're very familiar with the fact that in Europe they have to have Data Protection Officers, right? I think that would actually be a good thing in the U.S.; maybe make it for bigger organizations, or organizations that have more monthly users. And then, potentially, for highly sensitive uses of personal information, there probably should be data protection impact assessments.

Tom Kemp:

But there are two in particular that I want to focus on, because I think they're becoming more pressing and hitting the headlines. The first is targeted advertising, the behavioral advertising, into which digital surveillance pumps data via the whole advertising ecosystem. If you look at Europe, the next wave of regulations they're doing is the Digital Services Act and the Digital Markets Act, and they explicitly ban the use of sensitive data for targeted advertising, and they explicitly ban the use of targeted advertising for kids. I think we really need to look at that, either as part of a child safety law or a revamp of a law like HIPAA, so that your sensitive data cannot be used. Because, today being June 30th when we're recording this, an article came out from The Markup reporting that pharmacies are actually sending over-the-counter medication and medical purchases and searches to Meta via the Meta Pixel. So now there's a link between IP addresses and, say, HIV tests that people have searched for or are purchasing, or Plan B: things of that nature that are available over the counter but super sensitive. Right?

Tom Kemp:

And so the fundamental question is: shouldn't HIPAA protections be extended beyond 'covered entities'? And there's also the focus on kids. Should there be a version two of COPPA? Or 'COPA'? I don't know how people pronounce it.

Debra J Farber:

Yeah, whatever you want it to be.

Tom Kemp:

But the version two that Senator Markey is proposing has gotten traction; it actually does, I believe, ban targeted advertising to kids as well. So that's one area of particular interest. The second area, which I would definitely like a federal privacy law to mandate, is support for Global Privacy Control (GPC), because the fundamental problem that we have (and this is a problem in Europe as well) is that it's death by cookies. Right? Every time you go to a website, it's "accept cookies." Yes. But if you say "No," then you get this page that pops up: do you want analytics, do you want marketing, blah, blah, blah, and it basically becomes a big old dark pattern. You're like, "I just want to see who won the football game today! I don't want to spend five minutes on each website accepting cookies and doing all these things," etc. So there really needs to be an opt-out signal. The fundamental issue that we have is that privacy is too hard for consumers; they have to constantly tell organizations "don't collect my information," accept the cookies, and blah, blah, blah. They should be able, through their browser or on their mobile phone, to just say, "No, I don't want my data sold or shared," etc. That should just be a setting that you have in your browser, and businesses should respect it.

Tom Kemp:

And the other way to make privacy incredibly simple is for the third-party entities, the data brokers, to have what I described in the California Delete Act: the ability to go to a single page and say, "Delete me and no longer track me." Just imagine if everyone had to support Global Privacy Control. Then you just set something in your browser or on your phone, you take 30 seconds to do that, and you cover all the first-party data. And then, if you go to a website, put in your email address and some other identifiable information about you as it relates to data brokers, and hit "submit," that covers you for all the third-party data. 30 seconds on your phone and your browser, 30 seconds going to a website: that would fundamentally address the pain and suffering that consumers have to go through with this whole cookie whack-a-mole thing, or trying to worry about who's really collecting and selling their data.

Tom Kemp:

Because the reality is, data brokers are entities that we don't even know about, because we don't directly interact with them. So, I think fundamentally we need to look at it the way tech companies design their products: from a user-interface perspective. They start with "what's the best user interface?" We tend to design privacy backwards, from the back end, and it's painful for the consumer. We should redesign privacy products around how we can make this as friendly as possible to consumers. And frankly, a lot of the Big Tech companies don't want to do that, because their business model is collecting as much data as possible. But at some point we've got to put our foot down, because even though we may technically have the right, like in California, people are not taking advantage of it, and we need to make it possible for them to take advantage of it.

Debra J Farber:

Yeah, you've definitely said a lot of interesting things there. The first one that comes to mind is with GPC, Global Privacy Control. One of the downsides I see is that, while it's called Global Privacy Control, if you're opting out of something via the web, it doesn't opt you out from mobile or from other ways that you're interacting with that organization. So a person might come away with the false sense that they opted out of a particular company completely, when it's really just through one channel. I would love to see some improvement around the GPC methodology - maybe there's a way, on the back end, for companies to link your accounts so that you could delete them all.

Tom Kemp:

I 100% agree with you. GPC right now is kind of a browser plugin, and obviously in a mobile world you use mobile apps. But at the end of the day, you really need the opt-out signals to be hard-coded in the law, and then you can set the standard and say, "Okay, you need to start supporting that in 2028." And then the technology will catch up from there. But the key thing is that there's no consistency across the current set of state privacy laws on whether you actually have to honor the Global Privacy Control. I would like to see that as a standard at the federal level, as a business obligation, and then that will drive solutions to the technical challenges of mobile versus browser, etc.
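For readers curious about the mechanics behind Tom's point: under the Global Privacy Control specification, a participating browser or extension attaches a single HTTP request header, `Sec-GPC: 1`, to each request. Here is a minimal, illustrative sketch of how a site's backend might detect that signal; the function and variable names are hypothetical, not from any particular framework.

```python
# Hypothetical sketch of honoring the Global Privacy Control (GPC) signal.
# Per the GPC specification, participating browsers send the HTTP request
# header "Sec-GPC: 1". Everything else here is illustrative.

def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries a GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalize the keys.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    # The spec defines "1" as the value meaning "opt out of sale/sharing".
    return normalized.get("sec-gpc") == "1"

# A request from a GPC-enabled browser vs. one without the signal.
print(gpc_opt_out_requested({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # True
print(gpc_opt_out_requested({"User-Agent": "ExampleBrowser"}))                  # False
```

A real deployment would route this check into whatever mechanism records the user's opt-out of sale or sharing, which is exactly the piece current state laws treat inconsistently.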

Debra J Farber:

Absolutely. I think it's an excellent first step - a major step, in fact - because that's what's going to get the attention of Big Tech resources, people who'll start thinking about tech solutions, much like with the death of third-party cookies and Google coming up with new ways to look across market segments, like FLoC and others. Even if there have been some missteps there, they're at least putting a lot of minds together to try to tackle the problem because of what's in the law, like GDPR. I definitely think it's going to move the needle. So, in your dream world, if we had all of these rights and business obligations set up in a federal, comprehensive U.S. privacy law, how should it be enforced?

Tom Kemp:

I think, look, we need a dedicated supervisory authority. What we now see is judge shopping in the U.S.: people all running to one judge and filing lawsuits in Texas because that judge is the most friendly. And a common complaint in Europe is, "Oh well, these companies are based in Ireland," and there's a lot of political pressure because Ireland's a small country and Meta and some of these other businesses are very large employers, et cetera. So now you actually start seeing, even with the DSA and DMA, more centralization, while GDPR made it more disparate across the individual countries, et cetera. So I do think we need a strong central authority, and that central authority, that agency - could it be part of the FTC? Sure. Could it be separate?

Tom Kemp:

I think there's a lot of synergy with having it right there with the FTC, and it should have the authority, as opposed to having the authority distributed across Attorneys General in 50 states, et cetera. So I think that should be the case, just like there's one Attorney General of the United States. And, of course, you could have 50 State Attorneys General, and you could have statewide supervisory authorities, but there should be a strong central one. Obviously, there should be penalties, and then there's the private right of action. I think you could probably split the difference and say, "Okay, let's just have a private right of action like we have in California." That one is more narrowly focused: it applies when stolen data enables someone to hack or breach you from an identity theft perspective. So maybe you could have the private right of action apply to medical information or hacking or whatever, as opposed to being broad.

Debra J Farber:

And just to interrupt for anyone who doesn't know, a "private right of action" means that an individual has a right to sue the company for malfeasance, as opposed to their only recourse being to go to a regulator, report it, and hope that the regulator prosecutes.

Tom Kemp:

Exactly. For example, in California, if I go to Meta and say, "Please delete my information," and they blow me off, I can't sue Meta because they did not allow me to exercise my right of deletion. But the Attorney General or - starting actually tomorrow, July 1st - the California Privacy Protection Agency, if I raise a stink with either entity and say, "Hey, Meta's not letting me delete my information," they could go after them. So that's the bugaboo with the private right of action: it's an individual suing, as opposed to waiting, as you said, for a regulator. But the bigger issue with a federal law in the area of enforcement has to do with preemption. I'm actually of the mind that Washington moves at a glacial speed; it's been decades since we've had anything significant privacy-related, with HIPAA and Gramm-Leach-Bliley. Even if we passed a federal privacy law, it would be so hard to amend it, et cetera. So I actually prefer that the states be the laboratories of democracy - quoting Justice Brandeis, who said that 100 years ago - and that states be able to use the federal law as a floor, especially because technology moves so fast.

Tom Kemp:

But I understand that's a bone of contention. Maybe there's a way you can do it: pass the federal privacy law and say that, for the first few years, it preempts state law, but after three or four years states could build on top of it.

Tom Kemp:

That might be a good enough compromise, as opposed to the federal government basically saying, "This is it, take it, and that's it, and you can't add on to it."

Tom Kemp:

So maybe there is a way - I haven't really thought this through - to compromise as it relates to preemption, or maybe give California a carve-out, because California historically has been the most aggressive and has set the standards, et cetera. But I do fundamentally believe, especially when it comes to consumer protection, that when push comes to shove, the federal government should not preempt states on privacy, given the fast pace of technology and the slow speed at which the federal government acts. So I think we always need to have those laboratories of democracy, as Brandeis said. The private right of action and preemption have always been the bones of contention, and maybe in the end we're going to need one side or the other to have a decisive victory in Washington to finally shove a federal privacy law through.

Debra J Farber:

Yeah. Well, I'll keep my fingers crossed, but I won't dedicate too much brain space to following that, just because I think it's going to take forever. But I think the approach you're taking is worthwhile, even if it doesn't get passed, because it raises the issues by comparing it to state legislation. I really do hope that we get a federal privacy law. I think everybody says that; the challenge is what goes into that federal privacy law to get it passed. But we'll see. These are interesting times. Okay, so let's switch gears from the book and talk a little bit about your work as a Seed Investor with Kemp Au Ventures. What do you look for in a privacy tech or security company before investing?

Tom Kemp:

So, I've been very fortunate that I had good success with some of the companies that I started. And, being here in Silicon Valley, it's very common that you have friends or people in your network who are starting new ventures, and you want to give them some Angel money - just like, maybe, if one of your listeners wanted to start an ice cream store, they might get some money from their uncle Larry, who's a dentist, who puts some money into it. I'm not a venture capitalist. I'm just a guy who works with a business partner, Adam Au, and together we invest some money in startups. So far, we've done 15 investments that are active right now, and obviously, given my background, it's very much security and privacy tech related. What I look for in a company is, first of all, a team, because starting a company is a team sport. You need great co-founders. Oftentimes someone comes to me and says, "I've got a great idea," and I'm like, "Well, who else is working on it?" And it's like, "Well, I'm the CEO, and if I get a bunch of money, I'll hire a bunch of people." But no, you really need to have a team. You need to have some Co-Founders working with you, because if that one person, God forbid, gets hit by a bus, then everything goes away. But if you have a team to bounce ideas off of, one that can sustain the business if one of the founders steps away, then it's more sustainable.

Tom Kemp:

The second thing is you really need to pick a product that solves a real pain point, and you need to make sure that whatever company is forming, its solution is a painkiller, not an aspirin. Oftentimes there are just too many nice-to-have products; you really need something where customers are in a great deal of pain and willing to bet their time and money on a small startup, to go out on a limb and buy that product because it will fundamentally help their business. So, no 'nice-to-haves' - you've got to have a 'must-have.' I spend a lot of time trying to figure out: is this a real pain that this product solves? Obviously, you also want a large market, not only to help you raise money, but because it gives you room to make mistakes and still succeed.

Tom Kemp:

But oftentimes, I think people end up having a little niche product and then they say, "Well, it's part of this huge privacy market." You need to be realistic: are you really just a tiny, tiny niche, or is there a sizable market where, if you multiply the number of people who would buy your product by the average revenue, you could actually build, say, a $100 million business in five, six, seven years? And then the last thing is probably that it can't be too crowded a market. I have people come to me and say, "Oh, I'm going to build a better product than these other four or five startups." But those other four or five startups have raised $50 million, you have $0 in the bank, and you just have a slightly faster widget, et cetera. It's just like, boy, that's going to be really difficult to succeed.

Tom Kemp:

So, you want to find a market that's going to be big enough, at the right time, and not too competitive. Those are some of the things that I look for: Is there a team? Is it a must-have? Is it a large market? And is the market segment they've initially picked not too crowded?

Debra J Farber:

Yeah, that's great - really insightful. And I'm curious what check size you normally write. I know these aren't just $10,000 or $25,000 Angel-level checks; you're doing seed investment. So, of course, you're going to care how much a company can scale, because you're putting some significant money down.

Tom Kemp:

Yeah. For some companies that need more money, I will bring some other business colleagues and friends to the party. So typically, I'm going to call it a consortium.

Tom Kemp:

But if they need more money, then typically they should probably get an early-stage VC, which has institutional money in it; and there are a bunch of seed VCs out there with $30 to $40 million funds. Those seed VCs may put in, say, $1 million or $1.5 million, and then we would put in a couple hundred thousand. So we would be kind of the second or third leg of the stool. It really depends on how early-stage the company is. We've written some bigger checks collectively, but sometimes I've written $25,000 checks just to be part of a consortium, et cetera. So, it really depends on what the company's needs are and how excited I am. But it's fun, because here I am in Silicon Valley - this is where people make things happen - and it's great to be on the ground floor, working with entrepreneurs to get their dreams and visions funded.

Debra J Farber:

I think that's great. We're cut from the same cloth; I just don't have the money you do to invest, so I invest with sweat equity. So, I want to talk about some of the privacy tech companies that you invested in, like Secuvy, Privaini, and Privacy Code, and what inspired you to invest in them. But first, I want to disclose to the audience that I'm on the Advisory Boards for Secuvy and Privaini, and I am an Angel Investor in Privacy Code, so we both have vested interests in these companies.

Debra J Farber:

But I am curious: two of those companies, Privaini and Privacy Code, are kind of creating their own category in privacy tech. Right? You were talking before about how you don't want to invest in companies in a crowded market, but it does seem like you invest where there is a brand new category, and that has its own set of challenges - being heard in the marketplace and showing where exactly you fit into customers' needs. A lot of what I help them with as an Advisor is go-to-market fit: how do you stand out and get heard amongst the other privacy tech and security companies out there? I guess this is to say, I understand the draw of working with eager startups with really great ideas, but I'm still curious what inspired you to invest in them and in companies like them.

Tom Kemp:

Oftentimes you look for analogous markets or segments that pre-exist and are popular, and then you ask: as you go to a new regime or new industry, would this be applicable? So, Privacy Code really focuses on Privacy Engineering - baking privacy into the product development process, etc. In the security world, there was the whole DevSecOps movement and Security Engineering teams; people 15 or 20 years ago realized, "Wait a minute, when we build software code, we need to build security into it." For me, when the founders of Privacy Code presented, a light bulb went off. What motivated and drove the early security market was PCI-DSS, etc. Then there were a bunch of issues with code not being secure, and engineers had to become more security conscious and build it in. I felt that now we have Gramm-Leach-Bliley - I mean, sorry - we have GDPR. We have CPRA. We're going to have the DSA and DMA. We're going to have kids' design codes like the California AADC, the Age-Appropriate Design Code. But software engineers have no clue what they need to build and how they should handle information, and so Privacy Code uniquely provides a library that helps engineers build it in. That's really your main concept: 'shift left' - we need to shift privacy into the building of products.

Tom Kemp:

Similarly, in security, there have been security risk analysis scorecards, etc.: Is this vendor secure? Is this product secure? What's the code? What are the rankings? And Privaini, with their CEO, Sanjay, does the same thing for privacy, looking at public privacy policies. They look at historical breaches that have occurred, which allows, for example, people in the purchasing department or the Chief Privacy Officer, when they deal with third parties, to do an assessment. I thought that was just brilliant. I saw the success that companies have had with this in security, and it needs to apply to privacy as well, because how do you actually quantify how good a company is as it relates to privacy? You eventually need to apply a score, and they do that.

Tom Kemp:

And then the other one - great minds think alike in working with these companies - is Secuvy. I really like the fact that they focus a lot on the use of artificial intelligence and on unstructured data. I really felt that intelligent use of AI could add a lot to the mix if you build it in from scratch, as opposed to trying to bolt AI on after the fact. Build it from Day 1 with the intelligence, the machine learning, etc., and then apply it to finding PII in unstructured information - and the majority of data is actually unstructured. I thought that was just a cool idea as well. So, a lot of the reason why I invest is: "Hey, this has been a great idea in, say, security or some other space; is it applicable in this new world of privacy, driven by all these laws and regulations and consumer awareness?" If yes, then there's a great opportunity there.

Debra J Farber:

Yeah, that makes a lot of sense. I'm equally inspired - obviously, this is called The Shifting Privacy Left Podcast. I believe that we need to shift left into engineering and DevOps, and these tools really help with that and with managing information earlier on in the organization, so that makes a lot of sense. Thanks for sharing your insights there. I guess the last question I have for you is: do you have any advice for how companies can better shift left in their organizations and within their business networks?

Tom Kemp:

Look, the privacy market is incredibly dynamic and there's so much activity. If you took a six-month leave of absence and then came back to work, you'd be like, "Oh my gosh, there are like five more states," right? So I highly recommend that if you're going to be a privacy professional, you can't rest on your laurels. People need to listen to people like yourself, with your podcast, to keep up-to-date. They need to read the daily IAPP newsletter, participate in the forums on LinkedIn, etc., and carve out time every day or every week to make sure they're keeping abreast of where things are going. Because, unfortunately, privacy is very fragmented. Right? There is no national privacy law. There are 10 state laws. There are 50 data breach notification laws. Europe has GDPR, but the DSA (Digital Services Act) and the Digital Markets Act are going to apply to some companies as well.

Tom Kemp:

So, from my perspective, organizations should have dedicated people focused on privacy, and, to your point, privacy should not be an after-the-fact thing; they need to start baking it into the building of their products and how they interact with consumers, etc. Because what's happening now is that consumer expectations are sky high. Consumers have been burned too many times - with data breaches, with Cambridge Analytica, etc. - and so they have high expectations.

Tom Kemp:

And again, it goes back to what I said before: think about it from the consumer's perspective. Too often we build from our own perspective, based on what our needs are, and you need to flip it around. So my suggestions are: have dedicated people, and carve out time to keep up-to-date and abreast of what's happening, because it's incredibly dynamic. If you don't like that, then this is probably not the field for you. Sorry. You've got to keep up with what's going on at the state level, because you may have a couple thousand customers in Texas, and HB 4 just passed, the governor just signed it, and it becomes effective next year. So, guess what? If you want to avoid a lawsuit from some of your consumers in Texas, you've got to follow Texas privacy law.

Debra J Farber:

And just to hang a lantern on that and underscore it - pick your metaphor - I think that applies to the profession generally, beyond even just new laws. There are constantly new standards, new frameworks, and new technologies arising and causing new problems, creating new points in time where you need to rethink your approaches. Should you scrap what you've already built because it doesn't comport at all - you can't make it legal, or, even beyond legal, customers won't find it trustworthy anymore? Should you buy something elsewhere that just makes your privacy posture better? Or do you need to deploy a new architecture or governance? I mean, it's constant learning. I can't imagine someone successfully being in privacy - whether an engineer, a lawyer, an operations person, an architect, what have you - without constantly learning and constantly reassessing what their approach should be. For me, it's part of why I'm drawn to this space: I'm never bored. There's so much going on. But I do have to say it can be overwhelming. If you're looking broadly, and not within a smaller domain within privacy like I do, you can get overwhelmed by the deluge of information, because there's so much happening in the space.

Debra J Farber:

As you said, customers - people - are moving the markets these days. It's not, "Oh, another fine for a data breach, another fine for a data breach." Humans were just getting used to those; it was sad, and it wasn't causing any change. But, as you said, with the realization that we are stuck in a surveillance capitalism, Big Tech spiral - how do we get out? It is, I think, consumers who are looking to feel trust: they want to trust companies, and they won't give data to, or buy from, companies they don't trust. And that, I feel, is driving a 'shift left' movement, at least in creating products that consumers will trust more. Hopefully this will continue to be a trend and there will be more transparency, but it's going to take laws to keep companies in line, too.

Tom Kemp:

It's a dynamic market and it can be challenging, but the good news is, if you can keep up with it, then you become even more valuable and strategic to the company and in your career.

Debra J Farber:

Absolutely. 100%. Thank you for adding that. Well, Tom, any last words to our audience before we close?

Tom Kemp:

No, I think we hit on a lot of stuff, and I really appreciate you interviewing me. This has been amazing - a great podcast episode. If people want more information about the book, they can go to tomkemp.ai; or, if you just want to go ahead and pre-order, go to containingbigtech.com.

Debra J Farber:

Excellent. Well, Tom, thank you for joining us today on Shifting Privacy Left to discuss your new book, what a comprehensive U.S. federal privacy law should look like, and some of your privacy tech investments. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest.

Tom describes U.S. state privacy legislation he helped get passed & what he's working on now, including: CPRA, the California Delete Act, & Texas Data Broker Registry
Tom and Debra give their thoughts on whether there will ever be a U.S. federal, omnibus privacy law
Tom explains what should be included in a comprehensive U.S. federal privacy law
Tom explains why a federal U.S. privacy law should require the use of Global Privacy Control (GPC)
Tom explains how a federal privacy law should be appropriately enforced
Tom describes his work as a privacy and security tech Seed Investor with Kemp Au Ventures and what inspires him to invest or not
Tom describes what prompted him to invest in privacy tech companies: Privacy Code, Secuvy, & Privaini
Tom's advice for how companies can better shift left in their orgs & within their business networks
