The Shifting Privacy Left Podcast

S3E8: 'Recent FTC Enforcement: What Privacy Engineers Need to Know' with Heidi Saas (H.T. Saas)

March 26, 2024 | Debra J. Farber / Heidi Saas | Season 3, Episode 8

In this week's episode, I am joined by Heidi Saas, a privacy lawyer with a reputation for advocating for products and services built with privacy by design and against the abuse of personal data. In our conversation, she dives into recent FTC enforcement actions, analyzing five FTC actions and some enforcement sweeps by Colorado & Connecticut.

Heidi shares her insights on the effect of the FTC enforcement actions and what privacy engineers need to know, emphasizing the need for data management practices to be transparent, accountable, and based on affirmative consent. We cover the role of privacy engineers in ensuring compliance with data privacy laws; why 'browsing data' is 'sensitive data;' the challenges companies face regarding data deletion; and the need for clear consent mechanisms, especially with the collection and use of location data. We also discuss the need to audit the privacy posture of products and services - which includes a requirement to document who made certain decisions - and how to prioritize risk analysis to proactively address risks to privacy.

Topics Covered

  • Heidi’s journey into privacy law and advocacy for privacy by design and default
  • How the FTC brings enforcement actions, the effect of their settlements, and why privacy engineers should pay closer attention
  • Case 1: FTC v. InMarket Media - Heidi explains the implication of the decision: that data linked to a mobile advertising identifier (MAID) or an individual's home is not considered de-identified
  • Case 2: FTC v. X-Mode Social / OutLogic - Heidi explains the implications of the decision, focused on: affirmative express consent for location data collection; the definition of a 'data product'; assessment and audit programs; and data retention & deletion requirements
  • Case 3: FTC v. Avast - Heidi explains the implication of the decision: 'browsing data' is considered 'sensitive data'
  • Case 4: The People (CA) v. DoorDash - Heidi explains the implications of the decision, based on CalOPPA: that companies sharing personal data with one another as part of a 'marketing cooperative' are, in fact, selling data
  • Heidi discusses recent state enforcement sweeps for privacy, specifically in Colorado and Connecticut, and clarity around breach reporting timelines
  • The need to prioritize independent third-party audits for privacy
  • Case 5: FTC v. Kroger - Heidi explains why the FTC's blocking of Kroger's merger with Albertsons was based on antitrust and privacy harms, given the sheer amount of personal data they process
  • Tools and resources for keeping up with FTC cases and connecting with your privacy community 


Heidi Saas:

They want to know what data segmentation controls you have in place to make sure that the information collected is only used for that purpose. So they've got the purpose limitations built in here, as well as a data retention limit set. They did not say how long; previously, in the location data case, they said five years was too long - you can read between the lines on that. But on data retention, you've got to have at least one limit, and then tag your data with it, because you're required to have the technical means for achieving deletion, and you've got to be able to demonstrate that. That is where privacy engineers have longevity of work, right there, all day. We need people to come in and know how to do this.

Debra J Farber:

Hello, I am Debra J Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Welcome everyone to The Shifting Privacy Left Podcast. I'm your host and resident privacy guru, Debra J Farber. Today, I'm delighted to welcome my next guest: Heidi Saas, Founder of H.T. Saas, LLC, where she serves as an attorney focusing on privacy and technology.

Debra J Farber:

Heidi is not afraid to publicly point out when companies are behaving badly, and she makes public calls for them to do better. She states on her LinkedIn profile: "I rip tech tools apart, sometimes to make them better and sometimes to light asses on fire. I work with tool makers, startups, VCs, small business owners, research groups and some nice people I don't talk about. I understand people and systems. I am not cheap and I will not do box ticking duties nor marketing or sales for your business." I love it. She's like a hero of mine.

Debra J Farber:

Her advocacy has been so helpful to the community, and she's pretty much always right about everything she says. She comes with receipts. Today, we're going to talk about recent Federal Trade Commission (FTC) enforcement actions, as well as some recent updates and information about enforcement sweeps from a couple of U.S. states, specifically Colorado and Connecticut. Heidi recently did a LinkedIn Live event with privacy and security communications expert Melanie Ensign - also a good friend - and I know her to be just a tremendous powerhouse in the communications space. Melanie's the CEO at Discernible, and their LinkedIn Live event goes over the same enforcement actions we're going to talk about today, with the added benefit of Melanie's wisdom around communications. After finding that presentation so informative, I just knew I had to bring Heidi on to The Shifting Privacy Left Podcast to talk to our community of privacy engineers, especially since these actions will likely affect your companies and how they architect for privacy going forward. So, Heidi, welcome. I'm so excited to chat with you today.

Heidi Saas:

Thank you so much for having me. I'm very excited to address your community of people. I am not an engineer, just to start with that. I'm just a lawyer, just another suburban mom on social media.

Debra J Farber:

Oh, I think you're selling yourself short. But yes, you come from the legal background, as do I, but your focus has absolutely shifted to privacy. In fact, you didn't start out in privacy, so why don't you kick off the conversation by telling us a little bit about your journey into privacy law and then why advocating for privacy by design and default has been so important to you? And, you might also throw in why you hate data brokers so much.

Heidi Saas:

Well, thank you for the introduction and for the kind words. I really appreciate that. I also have a lot of respect for what you're doing in the community and bringing people together and educating us. We learn so much when we listen to each other. I'm honored to be on your show.

Heidi Saas:

My journey started with - I'm the first in my family to go to college. I grew up in a blue-collar community, and I saw there are several different components to power: it's money and the law, primarily. So I wanted to find out how that works. I went to undergrad in DC and worked at a lobbying law firm, and quickly figured out how that works. Then I went to law school and started working in consumer rights. At that time, it was the financial crisis: you could just buy spreadsheets of accounts, and people were collecting on them and using the courts to garnish wages - it was a disaster. We ended up lobbying for Dodd-Frank and the CARD Act, and for reform in the state legislatures on wage garnishment and bank levies. It was a mess. At that time, the ability-to-repay algorithms being used in finance were not really based on science, because there weren't any regulations that said, "Here's what ground truth looks like," or construct validity, when you're dealing with algorithmic bias and discrimination. So that had to be handled. Dodd-Frank came in; the CFPB was established. They started to standardize things and get some rules in place, and things started to get better.

Heidi Saas:

But after working in consumer rights, and seeing the things that I saw - oh, the many shady areas I had to traverse - I decided that I had to start working more on data. I found privacy because I said, "That's the only place where I see we can have agency over the information being used against us in all these life-critical areas." So I did, oddly enough, a Google Analytics course - they used to have them posted for free - on funneling and lookalikes and how you set up targeted advertising, and I was appalled.

Heidi Saas:

I was shocked - "how can this be? Do other people know this is happening? I was so shocked I couldn't believe this was happening. So I said "I've got to learn more. Not because it was just Google that inspired me, it was just the level of harm that was about to become apparent because it was so hidden behind complexity, people didn't even know why things were not happening for them. There was no explainability. There are certain things that are required under the law, like adverse action notices' when information is used against you in a hiring decision and it tells you what data was used against you so that you can have a right to dispute it and correct it on your credit report. But nobody's doing that, because they're using data they bought from data brokers and feeding it into snake oil prediction tools and calling everything LLM- powered. Now whether it's actually attached to the LLM in the background or not, right.

Heidi Saas:

So, yeah, we are data. All we are is data. I learned as much as I could about privacy. I took the data privacy exam in 2019 and then started consulting at the beginning of COVID, which was kind of odd - but for me it worked, in the odd kind of way where everyone in the world was forced into my parameters. I was working from home with no childcare, and then suddenly everybody else was too.

Heidi Saas:

So, I said "yYou know what? I want to meet as many people as I can find. I want to find the researchers that are writing these papers that fascinate me. I want to find the people that are doing ML engineering and I want to learn from them. All of these things like reinforcement learning. That was integral. I thought that was so important to figure out right after I had just experienced parenting. So these are just some of the things that got me into the path that I'm in now with privacy and ethical AI. It was all based on consumer rights, because that's the only thing we have now to enforce the rights we have in the information being used against us.

Debra J Farber:

That's right. We don't have fundamental rights of privacy, except for Fourth Amendment stuff. It's more based on consumers' rights, and only where there's a harm.

Heidi Saas:

Yeah, we don't really have those Fourth Amendment rights either. For right now, you don't have an expectation of privacy in information that you share with a third party, so they're pretty free to sell it. They're trying to address that issue, but Congress is falling short because it's obviously an election year. And every election cycle, they need the same targeted advertising systems to generate campaign dollars, so, yes, they cannot effectively legislate on this issue without impacting their own standing - their own seat in Congress.

Debra J Farber:

That definitely makes it tough. So, today we're going to unpack six recent matters - enforcement actions, mostly brought by the FTC (the Federal Trade Commission) - and I'll list them as an agenda for the audience right now. The first is FTC v. InMarket, where we'll talk about sensitive data. The next is FTC v. X-Mode, now Outlogic, where we'll talk about location data. The third is FTC v. Avast, which is relatively new, and that's around browsing data.

Debra J Farber:

The fourth is The People v. DoorDash - that's really the State of California suing DoorDash - where there's a mismatch between the promises in the privacy notice and the technical capabilities they had. The fifth is the Connecticut and Colorado state reports and some of the enforcement sweeps they've done so far. And last but not least, we'll talk a little bit about FTC v. Kroger, which has to do with stopping the merger between Kroger and Albertsons. Before we dive deep into those specific cases, can you give us a high-level overview of FTC enforcement for the audience? How does the FTC bring enforcement actions against companies, what's the effect of their settlements, and why do privacy engineers need to pay closer attention to these cases?

Heidi Saas:

So, I want to start with: I am a data privacy and technology attorney, but these are my personal opinions, not legal advice. If you have legal questions, I would ask that you take notes and take them to your counsel, based on the information I'm sharing with you today for educational purposes. Now, how does the FTC start? Sometimes they'll get a complaint asking for an investigation from an outside third party, like EPIC or another consumer group, saying, "Hey, something's going on over here. This is what we think is happening. We're asking you to investigate." So that's one way. Another is that consumers bring information and make complaints through the FTC's portal, where the agency has an overall view, and it shares that through the Consumer Sentinel Network with other agencies, so they can get an above-ground look at what's happening between businesses and consumers out there. Sometimes they get a mass of complaints about a certain thing - funeral homes, for example, generated a lot of complaints recently - and they say, "This is worth investing our assets in doing some investigations." At that point, once they've identified something they need to look into, they send a 'civil investigative demand' - a CID letter. The CID letter shows up and says, "Hey, we want to talk to you about these particular issues; we want to know these types of things." And you need to set up a meeting with them pretty soon - a meet and confer, within two weeks - so that you can talk to them and set out a timeline for how you're going to hand over information to answer their questions.

Heidi Saas:

Now, at this point, they may be delighted with what you're doing - everything seems fine, and "we're going to use this as the industry standard when we go after other people, because what you're doing covers all the bases" - and that's great. More likely than not, though, that's not what they're going to find. They're going to find some issues. They're going to look at your agreements. What are you telling the public? They're going to look at your technology stack, because they have technologists who work for the government now, and have for a couple of years.

Heidi Saas:

But at this point - because enforcement does take a while, and rulemaking takes a while as well - you're starting to see the fruit from these efforts. They can look at what you're telling people on the front end versus what the back end of your system is actually doing, and if there's a difference between the two, they're of the opinion that that is unfair and deceptive. Just flat out. So there's really no need to go into why it looks this way; it is this way, you have an obligation to know it is this way, and if you don't, that's no excuse. That's why I think it's important for people working with data to understand that accountability is the second thing after transparency, and accountability is the reason we want transparency: because we want to know who made this decision, and who we need to come for when we see something we don't like.

Debra J Farber:

That is really helpful. And then, the way that I've seen this play out - and I think it's important to say up front - is that the effect of these enforcement actions is almost that you've got another set of laws on the books, right? Some people didn't know this to any specificity, and now we're hearing holdings that pretty much feel like case law, even though it's not technically a case adjudicated by a judge.

Debra J Farber:

In the legal system, it's the FTC's enforcers. So it's interesting in its own right that the agency almost has a rulemaking capability with these holdings. And that's something a lot of people have fought against: why should the FTC, with its unfair and deceptive trade practices mandate, go after things companies don't necessarily know about? The claim is that they don't know ahead of time what might be fair, unfair, or deceptive. I think that's bullshit. I think it's kind of clear what's unfair and deceptive if you're lying to people in your policies about what's going on, right? And then the other thing is that we're not just talking about "great, a case was settled and some fines were paid." We're talking about the fact that any FTC action involves the company, once it settles, submitting to 20 years of audits.

Heidi Saas:

Yeah, that's exactly right. They're also creating a deterrent effect and, to answer your earlier question, they do this on purpose, to drive home clarifications that they've made in the business guidance. So if you follow the guidance - that's a prescriptive measure.

Debra J Farber:

If you follow that, you should be in good shape. And I'm sure we could talk for hours and hours. So let's stay focused on these cases and start with the first one: FTC v. InMarket. The holding states that data linked to a mobile advertising identifier or an individual's home is not considered de-identified. Tell us more about what organizations should take away from this holding, and why are we talking about de-identification instead of, like, anonymization?

Heidi Saas:

Yeah. Well, they said in the agreements, "Our data is anonymized and aggregated; everything's cool, cool, cool." And then when the FTC looked at it, they said, "You know, it's not. It's totally not." And in Avast in particular, they went a little further to explain all the different agreements and what was allowed through linkages. It's basically: "It's not me that's re-identifying it later and violating people's privacy; it's my business partner, two steps down the way" - but you're still using the same ID so that it can be linked together later for the purposes of targeted advertising. So this is kind of "tell me targeted advertising is not okay in these circumstances without telling me it's not okay in these circumstances."

Heidi Saas:

The FTC Act is very, very old, and at some point we needed cases like this to tell us that using cameras for peeping-Tom purposes, or for salacious publishing in the newspapers, was not okay. These kinds of cases had to come forward. So at this point, with InMarket, they're saying the sensitive information matters because of the inferences you get from someone being at a certain place - it's not okay for you to go sell that to all your friends for targeted advertising, because the risk of harm to a human is greater than any potential business case you have for trying to keep this under wraps. They have obligations for what de-identified means, and this is not the first time we've seen this particular definition from the FTC, but it is the first time we've seen that extra nugget at the top: data linked to a mobile advertising ID - a MAID - or an individual's home is not de-identified.

Heidi Saas:

CTV is the next holy grail of advertising. They're going to try to watch you watch TV and figure out what advertising to serve you, and they're doing all of that using the home IP address. So I think this is a shot across the bow from the FTC, just letting ad tech know: "I see you." Alternative identifiers, join keys, those sorts of workaround systems - it's as if consumers are saying, "Please don't do this," and then the industry says, "But how about if we do it this way?" And it's the same thing: you're not hearing us right now. So there has to be an economic cost to ignoring what needs to happen, and that's what these cases are about.

Debra J Farber:

And so, the de-identification approach - is that something under the IAB, the advertising lobby? The IAB's consent framework just got blown up. Let's talk about some of the ramifications of it getting blown up, so to speak - that the IAB consent framework isn't considered legit here by the FTC, but also in Europe, right? We don't have to go into the complexities of the European side. But, for instance, we have the alternative identifiers - we can't use them - if you don't mind discussing that a little bit. And then cookie deprecation, how that's affected, and some of the technical downstream impacts this holding might have.

Heidi Saas:

Sure. If we look at the definition they gave us in this particular decision and order - the definitions section is where it's at; it's written by lawyers, for lawyers, and I love it - it has a particular meaning in here, and it explains a little more about why other regulators see the same problem. De-identified information means that it cannot be linked, directly or indirectly, to a particular consumer or their device, and you have to have technical safeguards to prevent re-identification. Now, if you think about ad tech - you put all that first-party data together, send it off for lookalikes, then send it over to LiveRamp for identification, and then this and that - that's not preventing re-identification. On its face, that is not what you're doing there. It also is no longer first-party data when you give it to somebody else; that alters the data.

Heidi Saas:

The second part is that you have to have business processes that specifically prohibit re-identification. So you have to have it in your contracts, saying to your third parties: you cannot re-identify this data. And then you have to have the processes to make sure they don't have inadvertent releases of the information. You've got to audit these third parties every now and then to make sure they're following your contractual clauses, because if they're not, the liability runs up the chain. So: deleting the data is hard; de-identifying the data, not as hard. But the exact technological method you use to do de-identification is going to be under review by the FTC. They want to know: How did you do the de-identification? Who did it? Did you test it to make sure it couldn't be re-identified? Did you test your third parties for the same reason? Not until you've jumped through all those hoops can you say that the data has been de-identified.
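To make that concrete, here is a minimal sketch - in Python, with hypothetical field names - of one such technical safeguard: a release gate that refuses to export records still carrying a MAID, device ID, or home location, the identifiers the order says disqualify data from being called de-identified.

    # Hypothetical pre-release gate: refuse to export any record that still
    # carries a linkable identifier. Per the order's definition, data tied to
    # a MAID, device ID, or home location is not "de-identified."
    LINKABLE_FIELDS = {"maid", "idfa", "gaid", "ip_address", "home_lat", "home_lon"}

    def is_releasable(record: dict) -> bool:
        """True only if no linkable identifier survives in the record."""
        return not any(record.get(f) is not None for f in LINKABLE_FIELDS)

    def release(dataset: list[dict]) -> list[dict]:
        blocked = sum(1 for r in dataset if not is_releasable(r))
        if blocked:
            raise ValueError(f"{blocked} records still carry linkable identifiers")
        return dataset

    rows = [{"maid": "38400000-8cf0-11bd-b23e-10b96e40000d", "segment": "coffee_shops"}]
    try:
        release(rows)
    except ValueError as err:
        print("export blocked:", err)  # the MAID disqualifies the whole export

A gate like this only covers the technical-safeguard leg; the contractual prohibitions and third-party audits Heidi describes still have to exist around it.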

Debra J Farber:

Yeah, that's definitely a change in what's expected of companies. I would think that almost all the companies out there are, for the most part, not complying, or don't have processes that answer all of those questions. So it definitely seems that privacy posture management is something companies should be focusing on this year - similar to the EU's accountability principle - so that you can demonstrate that if you say you're de-identifying data, you actually are. These assurances, these attestations, what you're promising in your privacy notice, have to be accurate to what you're doing. So let's move on to the next case: FTC v. X-Mode, now Outlogic, where the holding focuses on affirmative express consent, the definition of a 'data product,' the assessment and audit programs that are now required, and data retention and deletion requirements. There's a lot to unpack there. How would you explain the importance of this holding?

Heidi Saas:

I'll start with giving credit where credit's due. Joseph Cox, who was at Vice when he first started looking into X-Mode and is now at 404 Media, has been on this issue with this particular data broker for a while. Byron Tau has also been looking into data brokers, including X-Mode; he has a new book out - I think it's called Means of Control.

Heidi Saas:

X-Mode was one of those friendly little apps - it was called Drunk Mode or something; it was trying to help people get home safely at night - and they ended up not giving notice of what they were doing. Even if you did give notice in one of those long privacy notices for an app like that, people aren't really reading it, I'm pretty sure. So when this case came up, I believe the FTC wanted to make it known that you just can't track people around simply because you have their location information - they've got to have some sort of privacy in their personal being, and where they are, and where they happen to be with their devices. So the FTC said: if you are going to collect this type of data, then you have to have affirmative express consent. Affirmative express consent requires clear and conspicuous disclosure that cannot be inside your privacy notice, terms of service, or other similar document. It has to be separate. This is a separate form of consent, so people know: "I'm tracking your location and telling everybody else about it."

Debra J Farber:

And the concept of affirmative express consent has been around for a long time. A lot of European laws have it - that's pretty much their default - and there are other U.S. laws that require it. Affirmative express consent has always meant that it cannot be embedded with other terms. It has to be a separate line item for each thing you're consenting to, and you have to actually take an action - like going and checking a box. It can't be something like "check this box if you want us to not use your data." It has to be the easiest way for someone to reflect that they've taken an action to indicate they're giving consent.
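As an illustration of what that might look like in code - a hypothetical sketch, not anything prescribed by the order - here is a consent record that is per-purpose, defaults to "no," is tied to a standalone dialog rather than the ToS, and captures the explicit user action.

    # Hypothetical consent record: one record per purpose, defaulting to "no,"
    # tied to a standalone dialog (never the ToS), and flipped only by an
    # explicit user action. Field and dialog names are invented.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class ConsentRecord:
        user_id: str
        purpose: str                      # e.g. "precise_location_sharing"
        granted: bool = False             # default is always no consent
        ui_surface: str = ""              # the standalone dialog shown, not the ToS
        decided_at: datetime | None = None

        def grant(self, ui_surface: str) -> None:
            # Called only when the user checks an unchecked box themselves;
            # a bundled "I accept the terms" click never reaches this method.
            self.granted = True
            self.ui_surface = ui_surface
            self.decided_at = datetime.now(timezone.utc)

        def withdraw(self) -> None:
            # Withdrawal should be as easy to reach as the grant flow.
            self.granted = False
            self.decided_at = datetime.now(timezone.utc)

    rec = ConsentRecord(user_id="u123", purpose="precise_location_sharing")
    rec.grant(ui_surface="standalone_location_dialog_v2")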

Heidi Saas:

I'm glad that you mentioned that part right there. Oh yeah, absolutely. So, what affirmative express consent is not: using an interface that has the effect of subverting or impairing user autonomy, decision-making, or choice. Those are dark patterns, without calling them dark patterns - you know what that is. What they found when they looked into these issues were consent buttons that didn't link to anything, or disclosures so heavily buried and barely lit, with the design driving traffic to click "OK" - those sorts of things.

Heidi Saas:

So this is the first time the FTC said: no, clicking "I accept" is not "I accept everything." You've got to be clear and conspicuous about this type of consent. That's new for ad tech, because location data is where you get the push notifications: "I see you just pulled into the Y parking lot. Would you like to also go get a coffee next door?" - those kinds of things. It just creeped and creeped into our lives, and this is the regulator saying, "Back off with that," because it was beyond the expectations consumers had when they downloaded your app. And the definition of a 'data product' I wanted to highlight - engineers will especially appreciate this - is that it's any model, algorithm, or derived data. That includes inferences, manual or automated predictions, audience segments - it's all in there. I ripped this language right out of their decision.
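A hypothetical sketch of how an engineering team might operationalize that definition: register every model, audience segment, and piece of derived data against the sources it was built from, so an order to delete a source flags every downstream 'data product' too. The names here are illustrative.

    # Hypothetical registry of "data products" - models, algorithms, derived
    # data, inferences, audience segments - keyed to the sources they were
    # built from, so deleting a source flags everything derived from it.
    DATA_PRODUCTS = [
        {"name": "lookalike_segment_q3", "kind": "audience_segment",
         "sources": ["location_feed"]},
        {"name": "store_visit_model", "kind": "model",
         "sources": ["location_feed", "pos_data"]},
        {"name": "churn_scores", "kind": "derived_data", "sources": ["crm"]},
    ]

    def in_scope_for_disposal(deleted_source: str) -> list[str]:
        """Everything derived from the deleted source is in scope, too."""
        return [p["name"] for p in DATA_PRODUCTS if deleted_source in p["sources"]]

    print(in_scope_for_disposal("location_feed"))
    # -> ['lookalike_segment_q3', 'store_visit_model']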

Heidi Saas:

This is new for people, and yes, the trade organizations and ad tech are freaking out about it, but this is one of those things where you've seen it coming everywhere else in the world. Like you said, all the other regulators are coming at them for the same thing. This is just the last stand: you have to accept that this is where we are now, this is the world you live in. And there are downstream notices required, going three years back. People have already gotten these letters from X-Mode and Outlogic, saying, "Hey, the FTC popped us and we had to get rid of a bunch of data. So if you bought some audiences in the last three years, you should probably get rid of that." That's a whole other conversation you might want to have with counsel if you've got that kind of letter, because, honestly - knowing what I know about what warehouses look like, especially in this industry - how are you even going to find that data?

Debra J Farber:

To get rid of it. And what is the requirement here? That if you can't remove it from your training set, you scrap the model and start over? I don't see companies actually reading it that way, or being willing to read it that way, but to comply really would mean that you have to find a way to delete it from your systems, from your models. So I'm sure we're going to see a lot more on this.

Heidi Saas:

Machine learning is like so hard.

Debra J Farber:

But it is a question as to what extent deletion needs to happen, and where. And how do we handle that flow of data when you're sharing with third parties? How do you control that flow? Obviously, there are ways to do it, and there are completely new architectures, but it would take years and years for self-sovereign identity or other decentralized identity architectures to enable this. That's not where we really are today, right? So what is the expectation of companies that hold this data on our behalf and feed it into models?

Heidi Saas:

This is going to be backed up by policies and checklists. A supplier assessment program - those are PIAs - a privacy program, and a sensitive data program were mandated, including assessments, which are audits. That's the cost of forgiveness for this company. Now other businesses are looking at it and going, "Well, we don't have to do all that," because that's the cost of forgiveness for this one company - this company has to do it for 20 years - but also, they didn't even have a data retention schedule.

Heidi Saas:

And what is your deletion mechanism? "Well, we don't know." That's a problem, because if you told people that you're getting rid of their data once you no longer need it, then you need to be able to demonstrate that you can do that. There needs to be validity and validation. The buttons for people to give consent need to be as easy to find as the ones to withdraw consent - that's something they're borrowing from our friends in the EU as well. A lot of these systems work in a multinational environment, and you've held out for a long time just having the American Wild West version. It's getting closer to the time where you need to afford most people, technologically, more...

Debra J Farber:

GDPR rights. Absolutely, I totally agree. For this case, we're definitely going to need privacy engineers to implement technical measures, right? Attorneys, analysts, and privacy consultants can't do it with paper.

Heidi Saas:

Yeah, they're required to bring in people. "You have to go hire experts" is what they said. If you don't know how to figure this out, you've got to bring in experts from the outside, and you have to be willing to work with them and not claim privilege or confidentiality to keep them away from the parts you don't want them to see.

Debra J Farber:

Oh, my God. In my experience, that has been such hell - being obstructed by the business itself so that you can't complete your mandate, because legal makes everything privileged and confidential anytime the word 'privacy' comes up.

Debra J Farber:

Now we have lawyers and technologists working together on behalf of the people. Agreed. And so I think we're going to see, especially as the economy improves and tech jobs start to be on the rise again, more privacy engineers getting hired because of this very risk that companies have, right? There are attorneys reading FTC consent decrees - and a lot of them do that - but engineers aren't following this as closely, right? Companies are going to want to bring in those experts, and you can get them different ways. You could hire them full time, but you could also hire part-time contractors, because that's just another method to scale your team. So just consider the possibilities; you definitely are going to need to bring on privacy engineers to demonstrate this compliance. And my dog makes an appearance!

Heidi Saas:

Yes, right. And data scientists, ML engineers - all of these awesome people exist, and I hope this will be driving work for a lot of them. I know these cases are starting to make people nervous. I hope that doesn't mean more business for enterprise counsel, who haven't really done anything to get people in a better position - because they've been watching the same things we've been watching all this time, yet they haven't really done anything to change the basic way their companies deal with consumers. They're just saying, "We respect your privacy," but they haven't changed the way they do anything to show that they're respecting people's privacy.

Debra J Farber:

You mean, like funding the privacy office and its mandate? Yeah, right. Consumers are starting to call bull on that, which I'm really glad to see. We need our privacy counsel - look, you and I are both attorneys.

Debra J Farber:

But I will say it again - I've said it often on this show - I think the fact that privacy counsel has owned privacy in organizations since the beginning has held back the shifting left of privacy into earlier and earlier stages, before products and services are developed, because they didn't know enough to make the case that "this is your problem too," or that "we could address this earlier." Instead it's "we'll just hire outside counsel," or "we'll just bring in an audit team." There's just a little less understanding of the technical needs, right, and even of the operational aspects that need to get done. But the good news is that we're building up the expertise for privacy engineering right now, and I think holdings like this are going to continue to make counsel, especially enterprise counsel, start engaging with actual engineers, designers, data scientists, and strategists like you and me. Now, this next one's a really interesting one, I think.

Debra J Farber:

Let's turn to FTC v. Avast, where we're focusing on independent third-party audits for every product or service that processes sensitive data. This one really establishes that 'browsing data' is 'sensitive data.' What should we know about this action's holding?

Heidi Saas:

Well, they punked them pretty good. "Hey, we're here to protect your privacy and security" - while actually selling your information to everybody - is what they were doing, and the FTC was like, "We're about to punk you hard," because they deserved it. But also, I believe they chose this case as the face for a flex on extraterritorial jurisdiction. This is a UK company; the FTC can reach you. That is exactly what they did, and I believe that's why: to make this particular point, but also because it was a slam dunk. Basically, they were selling something and then doing the exact opposite for profit. That's called fraud. But they didn't get shut down. The FTC said, "You can't do this anymore, and we're going to have to audit you." This is definitely a good example for everyone else: what you say in your representations to the public needs to actually, factually be true in your tech stack as well.

Debra J Farber:

Otherwise, it's a deceptive trade practice, which the FTC has authority to go after any company for under Section 5 of the FTC Act.

Heidi Saas:

They also wanted to make a point here about the browsing information: that's sensitive information, and they made a very big point of putting that statement out. Collecting this and using this is presumptively prohibited without consent. 'Presumptively prohibited practice' - those are the words that Lina Khan used when she discussed the recent cases as she opened up the PrivacyCon presentation of papers last week. That was an excellent presentation of papers, by the way, but those are the words she used: presumptively prohibited practice.

Heidi Saas:

Yeah: you've been doing it this way, but you should not have been, and here's why - the potential harms - and here is what you have to do, moving forward, if you want to do this. I wanted to pull this out to highlight what the audits mean, what the assessments are, because I've been drafting audit criteria for algorithmic bias and drift and other issues, pursuant to different laws we have here and in the EU, for a couple of years now. When people talk to me about audits, they go, "Oh, that sounds dull as dirt." But it's not, and I wanted to bring this out so that you can have a better idea of what it looks like to have to do one of these audits. A third party is going to need to do it for you, because it's an independent third-party assessment. However, you need to know what that report says when it comes back, because then you're going to need to know how to remediate your system to get in line with compliance, or whether you need to scrap certain things, or whatever your issues are. Why pay for a report you don't know how to read?

Heidi Saas:

The audits here require documentation for each product and service when you decide to collect, use, share, disclose, or maintain that browsing information. And the documentation they want - they want names. They want the name of the person who decided to collect the browsing information. They want that person's name. They want the names of the other people in the group that made the decision.

Heidi Saas:

If it was a group, they want to know what your purpose was when you decided that. They want to know what data segmentation controls you have in place to make sure that the information collected is only used for that purpose. So they've got the purpose limitations built in here, as well as a data retention limit set. They did not say how long; previously, in the location data case, they said five years was too long - you can read between the lines on that. But on data retention, you've got to have at least one limit, and then tag your data with it, because you're required to have the technical means for achieving deletion, and you've got to be able to demonstrate that. That is where privacy engineers have longevity of work, right there, all day. We need people to come in and know how to do this. That is not coming from your legal counsel's office, okay?
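Here is a rough sketch, with invented field names, of the documentation-plus-enforcement pairing Heidi is describing: a per-product record of who decided to collect what and why, a concrete retention limit, and a deletion routine that can demonstrate the limit is actually enforced.

    # A rough sketch, with invented names, pairing the documentation the order
    # asks for (named decision-makers, purpose, segmentation, retention limit)
    # with a deletion routine that demonstrably enforces the limit.
    from datetime import datetime, timedelta, timezone

    COLLECTION_DECISION = {
        "product": "browser_extension",
        "data_category": "browsing_data",
        "decided_by": ["A. Example (VP, Data)", "privacy eng + legal working group"],
        "purpose": "security threat detection only",
        "segmentation_control": "purpose-scoped tables; no ad-tech access",
        "retention_days": 90,  # the order requires *a* limit; 90 is illustrative
    }

    def purge_expired(rows, decision, now=None):
        """Demonstrable deletion: drop rows older than the documented limit."""
        now = now or datetime.now(timezone.utc)
        cutoff = now - timedelta(days=decision["retention_days"])
        kept = [r for r in rows if r["collected_at"] >= cutoff]
        print(f"purged {len(rows) - len(kept)} rows past the retention limit")
        return kept

    rows = [
        {"collected_at": datetime.now(timezone.utc) - timedelta(days=200)},
        {"collected_at": datetime.now(timezone.utc) - timedelta(days=10)},
    ]
    rows = purge_expired(rows, COLLECTION_DECISION)  # purges the 200-day-old row

Tagging every record with its retention metadata at write time is what makes the purge provable later; bolting it on after the fact is the hard version of the same job.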

Debra J Farber:

Absolutely, that's really fascinating. I think this will supercharge and expand the need for privacy engineers, and as it is, we're not producing enough of them quickly enough. Job listings have receded in the past year, given the tech cycles of layoffs, but the need is still there. So, whether it's a consultant, a contractor, or someone you bring on full time, you're going to need teams of people working on this. I wonder, Heidi - this also sounds like maybe a job not just for an outside auditing company, but for a DPO's office.

Debra J Farber:

You know, the EU requirement to have a DPO says it doesn't have to be one person. It could be a group of people with various backgrounds who help support auditing the decisions that have been made around privacy in an organization. I look at this and think: if the person making the decisions on the means and purposes of processing personal data has to list their name now, and it has to be public, it's going to make them go, "Wait, do I really want my name attached to this product or service that's doing X, Y, or Z?" I think that's going to be a good bellwether as to whether you have an icky product - is this just an icky product that makes you feel, you know...

Heidi Saas:

Oh, it's going to give rise to pause.

Debra J Farber:

For sure, it gives you pause, right? Like, do you want your name actually on this?

Heidi Saas:

If you don't want your name on it, then what are you doing? Because we used to ask people: if you wouldn't want this happening with your own data, then don't do it. But in your professional capacity it's different, because of everybody else's demands on what needs to happen for the project. Now, though, your name has to go on it and the regulators need to know - because, remember, these are the assessments that are done and handed in to the FTC to say, "We did our homework, we're totally checking behind the mess we made, we're cleaning it up, and these are the people working on it."

Heidi Saas:

So if you're not willing to put your name on what you're doing and tell the government, "I'm the one over here cleaning this up," then, ouch - maybe you really need to get into another line of work, or the business you're working with needs to hire other people who are willing to put their name on it, for whatever reason they have. But that's the accountability thing: they can come back at any moment and say, "We looked at your audit reports and we think we smell fish, so we're going to come in and check some things out, and then we're going to come for your people who said they were in charge of this."

Debra J Farber:

You think there'll be individual liability eventually?

Heidi Saas:

Everybody can have as many legal problems as Elon Musk has with the FTC if they keep screwing up over 20 years. Look at Zuck's problems - you can have this many problems, and then another 20 years, and another 20 years. How many generations of Zucks are we going to have under FTC orders here? Eventually, one of these things is going to stick and we're going to make some progress. But when it comes to what privacy people need to do: the arguments they've been making to the C-suite are fear and risk mitigation, those sorts of arguments. This needs to be put in terms of economic opportunity. This is our chance to get ahead of the curve. Please let us invest now in making the changes to the infrastructure that we need, so that we operate in a more human-centric way.

Heidi Saas:

Once we have accomplished that and can demonstrate it to the public, to the regulators, to everybody, you're going to be in a better position than everybody else who is just waiting to see what happens. For the early adopters, for those who can see around the curve, for the ones who have the stomach - the appetite - for making transformative change, now is the time to get in on doing that. And so, yes, bring in people who can help you with strategy in a global sense, so that you don't build a perfect regulatory system for one country that doesn't interoperate with everything else and all of your other lines of business. There's some nuance to that, but this, I think, is the best moment of opportunity for people to use the guidance they're getting from these cases to say: now we have to look at going about things in a different way. There's no longer an excuse to put your head in the sand and say we don't really have anything to guide us on the litigation or regulatory landscape - because, yes, you do.

Debra J Farber:

Yeah, you can't use that complexity as cover anymore. And could you also talk about the operation of third-party software here, and how that's going to be affected?

Heidi Saas:

Well, the business is now going to be responsible for everything in its tech stack, including the SDKs, whether you know about them or not. You've got to know, when you give the documentation for each product and service in the audit - remember, I said you've got to give that documentation naming names - that it includes any third-party software within your product or service. So it's every product and service, plus every SDK that lies beneath the wrapper in your service. You've got to do some extensive auditing all the way through your code to find out who else is doing what in there. And honestly, businesses use too many tools.

Heidi Saas:

This is another opportunity, I think, for privacy engineers to come forward in a trusted position and say: why don't we reduce risk by reducing the number of tools we're using in the infrastructure, and try to streamline things that way? Because we have these tools here, and they have similar capabilities to these other tools, but HR likes using this one and marketing likes using that one. If you retrain your people and get them all lined up on the same suite of tools, you'll have fewer of those random startup tools that were shiny one day and somebody decided to go ahead and integrate - and those are the ones with the SDKs in them. Get rid of that stuff if you really don't need it around. You can start by cutting off access to those sorts of things: if you're reviewing this, just go quietly cut off access, and if people don't complain, they never needed it.
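As a starting point for that kind of audit, here is a minimal, hypothetical sketch that inventories third-party packages from a requirements-style manifest and flags anything not on an approved list. Real SDK discovery - especially in mobile apps - would need much more than this.

    # Minimal, hypothetical dependency inventory: list third-party packages
    # from a requirements-style manifest and flag anything not pre-approved.
    APPROVED = {"requests", "flask"}  # illustrative allow-list

    def inventory(manifest_path: str) -> dict[str, list[str]]:
        found, unapproved = [], []
        with open(manifest_path) as f:
            for line in f:
                line = line.split("#")[0].strip()  # drop comments and blanks
                if not line:
                    continue
                name = line.split("==")[0].split(">=")[0].strip().lower()
                found.append(name)
                if name not in APPROVED:
                    unapproved.append(name)
        return {"found": found, "unapproved": unapproved}

    # e.g. inventory("requirements.txt") -> flags every SDK nobody signed off on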

Debra J Farber:

That's true. That can be harsh in some environments, but yeah, that is the age-old thing, you know: turn it off and see who screams about it.

Heidi Saas:

Yeah. Well, I mean, they have a way to come and tell you, "I really need it," and then they can argue with you about why. But you and I both know, in dealing with the tools that are in the pipeline, there are just too many tools. There really are too many tools, and now people are starting to see the problem of having all these different tools in there - because if you're on the hook for each and every one of those tools, and you don't even know they're in there, you should probably start taking a look. It's time to do the checkup - the "you just turned 45" checkup, where all of a sudden you've got all kinds of extra doctors you've got to go see, because you got to that age, right?

Debra J Farber:

I am actually 45, so it's hilarious that you just picked that number. And you're right.

Heidi Saas:

That's the number where, all of a sudden, all of these screenings and everything kick in. Right, yeah - and whatever it is, they've got tools for it. But this is the time when businesses need to go through this kind of process with their tech stack if they want to have confidence moving forward, because knee-jerk reactions every time there's another state privacy law with its own little flavor of this or that really aren't going to benefit you in the long run. You need to build for where you're going, not where you are. And I think you've got to have privacy engineers who are able to work with the other people on your cross-functional team, so those people understand who to come to when they have a question they think involves an ethical issue, or something like that, about how they code things: "Do you want me to use this library? Do you think it will cause a problem?" Those kinds of questions.

Heidi Saas:

They need to have somebody to go to with those questions, but they also need somewhere to put their opinions, to say: "I know I'm just doing the background work on this sprint, but I see that if we do this a certain way, we may increase efficiency," or this or that - because they see things too. Sometimes you can go to work and just do your job, or you can go to work and amaze people at your job. Tell them: "This is a safe space for you to come forward. I know privacy engineering is new, but your opinions matter and your advice is valuable, so speak up and be a part of making this better for everyone." You've got your skills, but you also have your life training, and that gives everybody a different point of view. So I think this really is a great opportunity to do so much with technology now, instead of hiding under the bed.

Debra J Farber:

I agree, you're right. And this gives ammunition to privacy engineers who want to accomplish more, get bigger budgets, and get the buy-in of legal and the C-suite. You're right, it is a huge opportunity, and we could get away from the technical debt we have if we tackle some of this up front with a change of strategy to be more human-centered.

Heidi Saas:

I did a talk at a banking-heavy industry event recently, and yeah, tech debt is something we really had to talk about - something they'd never heard of before. I was like, "But you've had all these preemptions for so long; that's why you have tech debt." And I said, "You know what? I'll tell you this: when's the last time you went to the bank and looked at the screen the teller is looking at? Have you seen what they're looking at? It looks like DOS. They're hunting and pecking at the function keys. That's where they are with technology. They're still using four-digit PINs for the ATM. Come on - these are signs of tech debt in our face everywhere." But yeah, there are better things we can do. We've known for 20 years there are better things we can do, but they haven't had any sort of economic incentive to do them, because it's cheaper to do nothing unless you have to.

Debra J Farber:

Thank you for that, I appreciate it. Let's turn to the next case: The People v. DoorDash - the State of California bringing a complaint against DoorDash. This case deals with CalOPPA, a California state law passed in 2004 - I was in law school at the time - the very law that requires companies doing business in California to have a privacy notice, which the law calls a 'privacy policy,' just to confuse everyone, even though it's a notice on their website. That link at the bottom of every website that says 'privacy policy' or 'California privacy policy' - we've had that since way before GDPR, 20 years ago. So what happened in this case? Why are we talking about CalOPPA today, and what happened with DoorDash?

Heidi Saas:

Well, this was another one of those cases where they wanted to make a point. The issue they had with DoorDash and its marketing cooperative is that they needed to clear up what counts as a sale, because everybody's like, "Well, if I have all these extra friends and we're just sharing stuff, none of it's a sale." That was a giant loophole, and the regulators said: we see you. So here you have this decision that says the involvement in the marketing co-op was done in exchange for the benefit of advertising to potential new customers - not even that they got to do the advertising, but that they had an opportunity to benefit from potential new customers - and that participation in the marketing co-op alone was therefore a sale under the CCPA. There's no getting around the loophole anymore with the friend-of-a-friend kind of thing. Your involvement in this co-op conferred a benefit on you; whether you used the benefit or not is irrelevant. It conferred a benefit, and so that's what they call a sale, which means it falls under the CCPA. And when you go to sue somebody, you sue them for every cause of action you can, because maybe not all of them make it. So they included CalOPPA, and that one made it all the way to the end - it may even have surprised them - but if one of the arguments hadn't succeeded, they had multiple lines of attack here. The CalOPPA part was important because 20 years ago they decided you have to have a privacy notice with certain pieces of information in it, and in 20 years you still haven't gotten that right. That was also making a point: you have no excuse for this. This isn't about a new law; this is an old law that you're not following.

Heidi Saas:

The document said one thing, the technology did another, and they didn't do any auditing of their third parties. They said to their third parties, "Now, don't you use this data for this and that," and the third parties said, "Okay," and then did whatever they wanted. And the inferences and the other data had already moved on, so far removed.

Heidi Saas:

There was no possibility that DoorDash could go back and afford what's called 'cure' under the law - meaning you put people back in the position they would have been in but for the harm you caused. They could not cure this, because there was no way to get downstream to everywhere the data had gone, like ink into water. There's no longer a cure period in California, so that's no longer an option for other companies. But it's important to note that when they first started this investigation, DoorDash immediately said, "We'll stop. We'll stop doing everything we're doing over here, to get in compliance with the law. We hear you; we'll stop." But they got them under this complaint because they couldn't cure it. Try as they might, they could not locate all the data once they had released it into the ether.

Debra J Farber:

So what I'm hearing is that this is a data governance problem: because governance of how the data flowed through the organization wasn't deployed, or wasn't possible, you couldn't then go find where that data was in order to delete it. Is that a good summary?

Heidi Saas:

So it wasn't just personal data - it's the inferencing on personal data as well. Derivative data, which becomes personal data. But yes, that's right.
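A toy sketch of the data-governance piece that was missing: if every hand-off - including derivative data like inferences - is recorded in a lineage graph, a deletion request can be walked to everything downstream. The dataset names here are made up.

    # Toy lineage graph: record every hand-off, including derivative data like
    # inferences, so a deletion request can be walked to everything downstream.
    from collections import defaultdict

    lineage = defaultdict(list)  # dataset -> downstream recipients/derivatives

    def record_flow(source: str, destination: str) -> None:
        lineage[source].append(destination)

    def deletion_targets(source: str) -> list[str]:
        """Every place a deletion request must reach, transitively."""
        targets, stack, seen = [], [source], {source}
        while stack:
            for child in lineage.get(stack.pop(), []):
                if child not in seen:
                    seen.add(child)
                    targets.append(child)
                    stack.append(child)
        return targets

    record_flow("orders_db", "marketing_coop_export")
    record_flow("marketing_coop_export", "partner_inferences")  # derivative data
    print(deletion_targets("orders_db"))  # -> both downstream copies

Without a record like this, there is no 'cure': once the data has gone out like ink into water, there's nothing left to walk.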

Debra J Farber:

Fascinating. All right. So it's really important to underscore here that whatever public notices you have, you want a technical implementation you can point to that backs up the statements you're making to the public. Otherwise, you could put yourself at risk of enforcement actions or, in some cases, class action lawsuits - which brings us to the next section, the state reports and enforcement sweeps of Colorado and Connecticut. First, tell us about some clarity around breach notification timelines in Connecticut.

Heidi Saas:

Yeah. So, in general, the notifications need to go out when legal counsel says, "This is a breach." And you and I both know a lot of homework, evaluation, and determination goes into saying "this is a breach," because you don't want it to be a breach - as soon as it's a breach, you've got a lot of obligations. If it's an incident, you don't really have those obligations; you just need to fix the issue in-house.

Debra J Farber:

And I just want to point out - because I think this will be helpful to people who haven't looked at breach issues as much as we have - that there's a difference between a breach of a system, which may not involve a privacy problem, and what we call a privacy breach, which is the trigger for all of this reporting and the set of obligations you have. If there's an incident that doesn't involve personal data, it's purely a security matter. That's the difference between a breach of a system and a breach of the data.

Heidi Saas:

Yes, what they're worried about is unauthorized access, whether by a hacker or by someone in a different department of the same company who shouldn't be looking through the HR files or whatever. It's unauthorized access where personal information is available for them to exfiltrate, to view, to use in any way that can cause harm. That's what they're trying to protect against with these rules about what's a breach, what's an incident, and when notice obligations kick in. Now, in this case, when you do have a breach, and personal information is leaked, and there's potential for harm to consumers beyond the breaching of their data all over the place, then you need to notify Connecticut as soon as you've discovered the issue. Not once you've taken your time, done the homework, talked to counsel, and 15 emails later finally agreed, okay, we've all decided it's a breach, now we need to tell people. The reason they've changed the trigger from determination to discovery is the amount of time that was being wasted in between, maybe even weeks at a time, and all that time the data is leaked out on the dark web, being bought and sold, bad things are happening, and people have no way to know about it. And why?

Heidi Saas:

Because counsel wanted some extra time to decide whether it really was a breach or not. And so, we're sorry, you don't get that extra time. We know you need to do more investigation, but you've got to tell us right away, and then you can go make your further determinations. This notice requirement is: as soon as you know there's a problem, you need to tell us. It falls in line with other notice requirements. DFS in New York is 72 hours, and a lot of those businesses, cryptocurrencies and everyone working in finance, report much, much sooner, as soon as they see something suspicious, because there's a good information-sharing network there. As soon as one of them sees something fishy, everybody else sees it, and they shut it down immediately. They don't wait 71 hours and then make the notice to the regulator; that's not what's happening. That would work against the common benefit of having these systems.
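
For engineers wiring up incident tooling, the practical upshot is that the notification clock should key off the discovery timestamp, never off counsel's later determination. A toy sketch, using the 72-hour DFS figure mentioned above as a stand-in (actual windows vary by statute, and none of this is legal advice):

```python
# Sketch: compute a notification deadline keyed to *discovery*, not to
# counsel's later *determination*. The 72-hour window is a placeholder.
from datetime import datetime, timedelta, timezone

NOTIFY_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at: datetime) -> datetime:
    """Clock starts when the incident is discovered."""
    return discovered_at + NOTIFY_WINDOW

discovered = datetime(2024, 3, 1, 9, 30, tzinfo=timezone.utc)   # incident found
determined = datetime(2024, 3, 8, 17, 0, tzinfo=timezone.utc)   # counsel's call

print("Notify regulator by:", notification_deadline(discovered).isoformat())
# If you keyed the clock to determination instead, you'd think you had
# until March 11: a week of extra exposure for consumers.
print("Determination-based (wrong) deadline:",
      (determined + NOTIFY_WINDOW).isoformat())
```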

Debra J Farber:

Right, and what you're describing exists because they've got FS-ISAC and all these different ISACs, where the industry groups come together to share threat information, threat intel. So it's really set up for security.

Debra J Farber:

And then there are even sometimes government-private cooperatives coming together to share this information, right, so that we can tackle the problems, or the security incidents that turn into privacy incidents.

Debra J Farber:

That's what we care about today on this call, but I think it's a result of the security mechanism: looking for those threats ends up meaning, great, the privacy piece gets bolted onto these already robust fusion centers and threat communications across an industry like financial services, banking specifically. We don't really see that anywhere else. There might be something for government services, but I've not seen anything as robust as the financial services area. So it'll be interesting to see over time how lessons learned from that approach spread, along with having all of your breach-related communication documents set up before a breach has ever happened, so that you're prepared when something does happen, not if but when. Then you can hit the ground running much faster on this obligation to report to the state when an incident that may or may not be a breach, but certainly affects personal data, has been discovered, not when you've made a determination.

Heidi Saas:

Give consumers a chance here to do something, like call the credit bureaus and put a freeze on. Give them a chance to do something, instead of trying to paper over and CYA as fast as you can. Right, change passwords even. Yeah, exactly. And you've got insurance for this. It's not news when somebody gets breached, because everybody's getting breached in the news all the time. You might have a short impact on share price, but you'll move on and rebound, depending on how you deal with consumers and your messaging. That's what Melanie was saying about communications: if you're sketchy about it and don't give a lot of information, or if you give information and it's wrong in the course of trying to do crisis and incident response, that's going to increase your liability.

Debra J Farber:

Absolutely. Speaking of that, what about companies that believe that they're exempt from certain privacy laws altogether?

Heidi Saas:

Well, best of luck to you, right? It kind of depends on where you are, what you're doing, what kind of data you're collecting, and what you're doing with it. I mean, there are a lot of exemptions in there, and businesses do fit into them, but there's also a lot of gymnastics being done to make sure they fit into them. And all I can say about that is: not every profitable business model is guaranteed a right to exist.

Debra J Farber:

Oh, amen to that. I know that in one of the slides from your past conversation with Melanie, you wrote that the CTDPA, the Connecticut Data Privacy Act, received 30 complaints in the first six months of its existence, and about a third of those complaints involved exempt entities.

Heidi Saas:

I would be willing to bet the majority of that third didn't fall into the enforcement category because they were exempt. I'd be willing to bet most of those were exempted by Gramm-Leach.

Debra J Farber:

Gramm-Leach-Bliley. Oh, okay, so you think financial services?

Heidi Saas:

I most certainly do. The largest area of consumer complaints is financial services. And so now people are like, oh, now I can complain about my bank or the shady insurance company or this banking company or whatever, and I can totally complain about them because of this new law in Connecticut. And the regulators had to go, oh, you know what, we're sorry. Yeah, that exemption was bought and paid for a long, long time ago.

Debra J Farber:

Okay, so just a different regulator.

Heidi Saas:

They're basically saying they're exempt because it's the wrong regulator. Exemptions are not there because the people drafting legislation are nice and trying to hook up their friends. No, they're negotiated; they're bought and paid for by the people whose interests they serve. They have to argue why they need the exemption. And for Gramm-Leach, they say: we don't need the administrative burden, because it will raise the price of access to loans, it will raise the fees we have to charge people, and they will have reduced access to credit.

Heidi Saas:

All of these horrible, horrible things are going to happen if we have to honor people's rights. You've got to let us keep doing what we're doing exactly this way, so you have to give us an exemption. We can't take on any further administrative burden, and the justification is that we've already got all of these security measures in place. We've got it covered, so we don't need to do this; it would be like putting a hat on a hat. You're going to cost us money to do something we've already got covered, so don't do it, just give us an exemption. Meanwhile, every other day in the news you see the same institutions with sloppy InfoSec all over the place, all the time. I call bull on the Gramm-Leach exemptions, just because that's how I feel about it.

Debra J Farber:

Well, one of the things I love about you, Heidi, is that we always know how you feel; nobody's ever left wondering, I wonder how Heidi feels about something. I think many would say the same thing about me, and I really appreciate that in you as well. We say what we mean, and we mean what we say.

Heidi Saas:

I've been dealing with these industries for so long; I see these problems, and they have been the same problems for so long. I saw when the data broker industry started. I was working on Capitol Hill, and then I went off to law school, and then FACTA was passed in 2003, which amended the Fair Credit Reporting Act and put in place the reseller exception that created the data broker industry. That enabled all of the crazy things that are happening today, and we're just now getting to the point where other people are starting to pay attention. For the last five years or so, I've felt like the lady in the tinfoil hat, crying that the storm is coming, but now people understand. It's almost like everything I've done in my life has gotten me to the point where I can now use all of my skills to their fullest.

Heidi Saas:

I've got an undergraduate degree from a lobbying school in international relations and government affairs. I've got a law degree. I did the privacy certifications. I've done all the work in AI. Now it's at the point where I'm starting to use all of those different skills, privacy, law, technology, ethics, to say: here's the path forward we need to take. And it's the first time in the last five years or so that I feel like people are starting to listen. Not because, gosh, that's an interesting idea and it sounds right; it's always sounded right to them. They're listening because they don't really have an option not to. So I don't know how I feel about that.

Debra J Farber:

I don't mind some forced Heidi sass on me.

Heidi Saas:

I kind of enjoy it a lot, so that's where I'm at. I've had it with trying to be nice with people and trying to get them to trust me. I'm over that. You're either going to heed my word or regret that you didn't.

Debra J Farber:

Same. I mean, this is why I don't work inside a company anymore and I work outside. I feel like I effect more change by educating the audience, educating the industry.

Debra J Farber:

Versus being in a company that really just wants to get in the way of getting things done, because they just want me in the position but they don't really want me to make any change. I'm done with that; I don't care if it comes with a high salary. I can't just sit there and accept it. I want to actually make the change, and then they stand in the way of it. So it's been frustrating, but it's also, even for us, an opportunity. The winds are changing, and we're right here at the beginning of it, not the precipice but the beginning of that change, where you can't just do status quo anymore as a company. You really need to get your personal data processing in order.

Heidi Saas:

Yeah, like Colorado and their enforcement. They told you what they're looking for, kind of like the French a couple of years ago.

Debra J Farber:

Yeah, talk about the sweeps, the enforcement sweeps.

Heidi Saas:

Yeah, the sweeps. When Colorado started talking about the sweeps, it reminded me of when the French did their sweeps a couple of years ago with the cookie banners. Sometimes a credible threat of enforcement is enough to get businesses to do the right thing. What they're looking for here are opt-out mechanisms, and do they work? You've got to have clear disclosures, especially regarding sensitive data and children's data. And what are you doing in targeted marketing? Because data brokers, analytics, and the identity verification services are under the microscope right now, for obvious reasons. That's what the regulators are looking at in the different states; Colorado just took the extra step to put it out front and say, this is what we're looking at. And that brings us up to the blocked merger, which was all about data science. This was not about groceries; this is all about data science.
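
On "opt-out mechanisms, and do they work": one cheap regression test a privacy engineer can run is to confirm the site doesn't set known tracking cookies when a Global Privacy Control signal is present. The `Sec-GPC: 1` header is the real GPC signal; the URL and cookie names below are placeholders, and a server-side check like this is only a smoke test, not a full audit of client-side tag behavior.

```python
# Rough smoke test for an opt-out signal: does the server set known
# ad/tracking cookies even when the GPC header is sent?
import requests

TRACKING_COOKIES = {"_ga", "_fbp", "ad_id"}   # illustrative list, adjust to your stack
URL = "https://example.com"                    # your own property, not a third party's

def cookies_set(url: str, gpc: bool) -> set[str]:
    """Fetch the page, optionally with the Global Privacy Control header."""
    headers = {"Sec-GPC": "1"} if gpc else {}
    resp = requests.get(url, headers=headers, timeout=10)
    return {c.name for c in resp.cookies}

with_gpc = cookies_set(URL, gpc=True)
leaked = with_gpc & TRACKING_COOKIES
if leaked:
    print(f"Opt-out may not work: tracking cookies set despite GPC: {leaked}")
else:
    print("No known tracking cookies set when GPC is present.")
```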

Debra J Farber:

Yeah, so this is about Kroger's attempted merger with Albertsons. Why are we even discussing a case about M&A and competition law? Help us understand the connection between the use of personal data and antitrust.

Heidi Saas:

So think about your browsing history and how personal that information is, what sites you go to and what's in your shopping cart and those sorts of things. Now look at your bank book and figure out where you spend most of your money. Most of it goes to the grocery store, because we have to eat. Either you eat a lot at restaurants, and bless you if you can afford that, but a lot of people spend the majority of their food budget at the grocery store, and what you buy tells a huge story. They're also monitoring you in the store. You're in there once a week, maybe every other week, but you're in there pretty often. You have a pretty solid relationship with your supermarket that they don't have anywhere else; this is the last bastion of big box that you have to go to, because you can order your groceries online, but you probably don't want to do that every single week. If you're driving past the grocery store and need to grab a few things, you're going to swing in. And if you have ever tried to buy anything at the grocery store without your little discount card, it costs a lot more. So they decided a while ago that your information was worth something. You get the little discount card, give them your email address or phone number, and you get the discounts in the store, but they get to track all of your information across all of their grocery stores.

Heidi Saas:

So this one may say Kroger, but it's also this, that, and the other grocery store brand, all owned by Kroger. These were the number one and number two companies in food sales, and they wanted to merge. In that instance, you've got to ask the FTC for permission, because that's too much market control right there, and the executives themselves admitted that, yes, this would be a huge monopoly. The reason they're stopping it is that it's anti-competitive. You need more grocery stores competing with each other so that you have more options to buy food and farmers have more places to sell food. In addition, workers would be harmed by having fewer places to work, and consumers would have fewer choices. So you're looking at harm to the economy as a whole, harm to workers, and harm to consumers, and those are the reasons why this merger was deemed anti-competitive and bad for our overall food economy, and they put a stop to it.

Debra J Farber:

Makes sense, and so why is this important to privacy?

Heidi Saas:

They have all of our data. They know everything about us. If you want them off of your trail, and you don't want them to know where you are as a woman, you'd need to buy tampons, diapers, and Depends every week; otherwise they're going to know. They're also going to know if you've got thinning hair. They're going to know if you buy arthritis cream. They're going to know if you buy dog food or cat food. All of these different things say something about who you are. They're going to know if you're always eating processed food or if you spend your time on the outer ring buying just raw ingredients, and they're going to put you into marketing segments based on your behavior inside the store, in addition to the food that you buy. So if you look at everything in your pantry and ask, what does this can of beans say about me? It says something to the grocery store. It's like the digital exhaust on the web, so to speak, only you're not thinking as much about it.

Heidi Saas:

That is what they're doing. It's data science, and they've had a very strong data science game for a long time. The Markup did a report on this, 18 months ago I think, and it was mind-blowing how much money they have invested in this business. I believe Kroger's data science company is making an obscene amount of money, maybe even rivaling what they make selling food. They don't just collect this data; they use it and sell it to everybody else who wants to know who has arthritis and likes to eat beans and has a cat, because that person would be perfect for my advertising list for the new cat condo I'm selling, or something, right? That's why they're collecting all this data about you. Your transactions say everything about you as a consumer.
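
The mechanics Heidi is describing reduce to something like the toy rules below: baskets in, marketing segments out. The rules and segment names are invented for illustration; real retail data science uses far richer models, which is exactly the concern.

```python
# Toy illustration of how purchase history becomes marketing segments.
SEGMENT_RULES = {
    "pet_owner":        {"cat food", "dog food"},
    "arthritis":        {"arthritis cream"},
    "new_parent":       {"diapers", "baby formula"},
    "health_conscious": {"kale", "raw ingredients"},
}

def infer_segments(basket: set[str]) -> set[str]:
    """A segment fires if any of its trigger items appears in the basket."""
    return {segment for segment, triggers in SEGMENT_RULES.items()
            if basket & triggers}

basket = {"cat food", "arthritis cream", "beans"}
print(infer_segments(basket))  # {'pet_owner', 'arthritis'}
```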

Debra J Farber:

Right. Well, that really sums up, I think, the six cases we were going over today.

Heidi Saas:

That was exhausting.

Debra J Farber:

Absolutely, and I expect a lot more coming too right. I mean, the FTC is just warming up.

Heidi Saas:

Well, they have two new commissioners. It's been three Democrats, and there's been some scuttlebutt about that, but that's how the makeup has been. There were two open seats, and the nominees finally got confirmed, so there are two Republican commissioners joining now, and they're getting set up in the office and getting to know everybody. And I'm excited to see, because privacy is a bipartisan issue, it really is, what projects they find already in progress that they want to work with, promote, or do something with once they get settled in.

Heidi Saas:

It'll be interesting to see once all five commissioners are there at the FTC. It's not that big a shop; it's like the size of a law firm in a big city, like 1,000, 1,200, maybe 1,500 people. They've got personalities and goals and agendas and those sorts of things. I do anticipate seeing more coming from them, but I also know this first big sprint was to make a statement, and hopefully they'll keep the ball rolling. Right now, though, I think it's beneficial for the community at large to take a moment, calm down, and decide: what does this mean for what I'm doing? And then, like we've been saying, take whatever opportunity you see to make improvements going forward, because this is not going to stop. But we need a few more minutes to marinate in what this means now before we start making any more decisions, right?

Debra J Farber:

Absolutely, absolutely. In fact, are there any resources you recommend to our listeners that will help them keep up to date with future FTC enforcement actions?

Heidi Saas:

You know, LinkedIn is where I have my collection of smart people, and you know I write. I've gone and found people that do research and create content and those sorts of things, so I have trusted sources of information. You guys create my newsfeed. So start working in the privacy community and sharing with other people that create content, because as these things come up, they'll share their opinions and post other things to give you guidance, to find out what everybody is thinking about these issues. And yeah, there are some law firms and privacy companies that put out newsletters and things like that to let you know, here's what this means.

Heidi Saas:

But I think this is one of those things where, if you have a mentor, you can discuss these things with your mentor. It's also an opportunity to network with other people and seek mentors in new areas where you maybe have not thought about finding one, because your mentor doesn't have to do exactly what you do to teach you the craft like a Jedi. You can have a mentor that works in an ancillary field, like they work in cyber and you work in privacy; they can be a mentor as well. You can take a new decision from a regulator, discuss what it means to you from your engineering and privacy perspective, and then talk to somebody in cybersecurity and ask, what does this mean to you? Because we're all working on the same system, right? So I think this is another great opportunity to invest in each other and learn as we all work through this together. Is there a book to read? No, because by the time a book gets published, it's already out of date and has bad information.

Debra J Farber:

Right. I mean, I've been thinking about writing one myself, and then I'm like, but it's never-ending; it would constantly be out of date. That's exactly why I don't want to write a book. I write enough.

Heidi Saas:

I think I share enough of my information on LinkedIn, but I also have trusted sources of information that I like to banter with. When these decisions came down, my Signal was blowing up. I was on fire for days and days, because we were having these hash-it-out conversations: well, what does this mean? Did you see that, and what does that mean? Those behind-the-scenes conversations are how we figured out, this is what this means to us, and then we can go out together and advise the businesses and say, here's what this needs to mean to you so that you can do something differently.

Heidi Saas:

Or you can document: I was here and told you what to do, and you didn't do it. That's called prior notice. Right, right, absolutely. And that's the problem: people aren't going to bring in a consultant who says, you need to do this, you need to address these issues, because otherwise you've got a report sitting around waiting to get you in trouble. I've heard enterprise counsel discourage businesses from bringing consultants in to do audits for algorithmic bias or pipeline reviews of their data for this very reason: they don't want to create a prior-notice document. I'm thinking that could not be more behind the curve than anything else I've ever heard.

Debra J Farber:

That's ludicrous, considering the problem that's going to face them in the future, right? When their business has been ruled illegal, or when disgorgement makes them get rid of the model they trained on that data.

Heidi Saas:

Precisely. Disgorgement is the new favorite remedy. I love it too. But businesses need to think: if you're getting this kind of advice and you're not really feeling confident about it, ask yourself, how many lawyers do you see going to jail for giving bad business advice? Not the ones that work for Trump, but other than that, how many lawyers do you see going to jail for giving bad business advice? I don't really see many, do you?

Debra J Farber:

I mean, I can't even think of one, and you know they're out there doing it all day. Yeah, so businesses need to think about that.

Heidi Saas:

If that's the advice you're getting and you don't feel like you're being heard, you absolutely should get advice from someone else, and see if you feel comfortable making a decision after you've gotten more than just one opinion on the matter, because you can't know everything in this field. We all have to work together on this. That advice should be suspect to businesses, I think: why should I not go through and find out what all is in my tech stack? Because not knowing puts me in a better position? It doesn't.

Debra J Farber:

No, there's no other area that I can think of that does risk analysis and addresses risk that way.

Heidi Saas:

Just buy more insurance, they said. Yeah, you know what? We've mitigated the risk by buying some more insurance, and we've squeezed this language into the terms of use. It's cool.

Debra J Farber:

Yeah, but insurance companies want you to demonstrate to them, before they give you insurance these days, that you have controls in place for privacy.

Heidi Saas:

I'm psyched to see insurance companies hiring technologists. They've had data scientists for a while, but they're bringing in technologists so they can look more into what's going on on the other side. They've been working on technology in-house for obvious reasons, but now they're starting to look at technology outside and say, you know what, I looked at what you have in your tech stack, and we're not going to insure you, for these reasons, whether you know about them or not. That shouldn't have to be the insurance company's job, but I'm excited to see some of those smart people doing it.

Debra J Farber:

They're mitigating their risk.

Heidi Saas:

Yeah, I'm starting to see some of those smart people go work for the insurance companies, and I'm not really all that mad at them for it. I get it; somebody has got to be able to do that. It's encouraging to me to see the new fields that are open to people with engineering skills. There is privacy; you don't have to just go and build video games, which can be fun, but that doesn't have to be the only future you see for yourself. If you know how to code, there are so many other things you can do, like working in privacy engineering in particular. I feel like you have a role in working in civil rights, because all of these systems were built to process human data without any respect for human dignity, and where the culpability of the law meets the code that's causing the harm, we've got places to make changes now. We need the privacy engineers to be there to help us make the right changes, so that we don't make the system worse.

Debra J Farber:

Absolutely. What's the best way for folks to reach out to you?

Heidi Saas:

I only have one social media presence, because I just don't have time to do anything else. But yeah, I'm on LinkedIn, and I love meeting with people; I think that's great. But this is not legal advice, and if you have legal questions, you can still ask me, and if I'm not licensed in your jurisdiction, I may be able to refer you to another attorney who is and who can help you with your particular issue. I am licensed in New York, Connecticut, Maryland, and any of the states with reciprocity for those three.

Debra J Farber:

Awesome. So DC is one of those?

Heidi Saas:

Well, I had to take three exams, because none of those were by reciprocity.

Heidi Saas:

So I took three bar exams to do this, but it opened up, I think, 36 different states I can get into through reciprocity if I need to. I'm not a litigator, so it's not that important that I be able to run into every courtroom, because sometimes it's legal advice and sometimes it's just consulting advice on privacy and strategy and things like that. So I'm on LinkedIn; that's my corner, that's where I'm going to be.

Debra J Farber:

Excellent. And so, before we close, do you have any additional insights you'd like to leave the audience with?

Heidi Saas:

No, that was a lot; it was a whole journey, you know what I mean. We talked a lot about the FTC today. We can talk another time about the CFPB.

Debra J Farber:

That would be great. There are many regulators for privacy in the United States, especially at the federal level. Yeah, let's revisit that and maybe unpack some of those in the next episode. Heidi, thank you so much for joining us today on the Shifting Privacy Left Podcast.

Heidi Saas:

Thanks for having me. I love what you're doing with this podcast, really.

Debra J Farber:

Yeah, I mean, it's fun. I'm having great conversations like this. Until next Tuesday, everyone, when we'll be back with engaging content and another great guest or guests. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And if you're an engineer who cares passionately about privacy, check out Privado, the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.
