The Shifting Privacy Left Podcast

S2E20: Location Privacy, Data Brokers & Privacy Datasets with Jeff Jockisch

July 05, 2023 | Debra J Farber / Jeff Jockisch | Season 2, Episode 20

This week’s guest is Jeff Jockisch, Partner at Avantis Privacy and co-host of the weekly LinkedIn Live event, Your Bytes = Your Rights, a town hall-style discussion around ownership, digital rights, and privacy. Jeff is currently a data privacy researcher at PrivacyPlan, where he focuses specifically on privacy data sets. 

In this conversation, we delve into current risks to location privacy; how precise location data really is; how humans can have more control over their data; and what organizations can do to protect humans’ data privacy. 

For access to a dataset of data resources and privacy podcasts, check out Jeff’s robust database — the Shifting Privacy Left podcast was recently added.


Topics Covered:

  • Jeff’s approach to creating privacy data sets and what “gaining insight into the privacy landscape” means.
  • How law enforcement can be a threat actor to someone’s privacy, using the example of Texas' abortion law
  • Whether data brokers are getting exact location information or are inferring someone’s location.
  • Why geolocation brokers had not considered themselves data brokers.
  • Why anonymization is insufficient for location privacy. 
  • How 'consent theater' coupled with location leakage is an existential threat to our privacy.
  • How people can protect themselves from having data collected and sold by data and location brokers.
  • Why apps permissions should be more specific when notifying users about personal data collection and use. 
  • How Apple and Android devices treat Mobile Ad ID (MAID) differently and how that affects your historical location data.
  • How companies can protect data by using broader geolocation information instead of precise geolocation information. 
  • More information about Jeff's LinkedIn Live show, Your Bytes = Your Rights.


Resources Mentioned:

Guest Info:




Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.

Shifting Privacy Left Media
Where privacy engineers gather, share, & learn

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Copyright © 2022 - 2024 Principled LLC. All rights reserved.

Transcript

Debra Farber:

Hello, I am Debra J. Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems. Today I'm delighted to welcome my next guest, Jeff Jockisch, Partner at Avantis Privacy, Data Privacy Researcher at PrivacyPlan, and co-host of the weekly LinkedIn Live event - or, I guess we call it a podcast as well - called "Your Bytes = Your Rights," which focuses on town hall-style discussions around ownership, digital rights and privacy. Welcome, Jeff.

Jeff Jockisch:

Thanks, Debra. Great to be here. Wonderful to chat.

Debra Farber:

Absolutely. It's been too long since we've caught up, so why not do it publicly on this podcast, right? I know you did not start out working in privacy. And, at PrivacyPlan, where you've been for a while, you've been focusing on data sets - specifically, 'privacy data sets' - privacy consulting, and privacy training. And your website states that you "research and create data sets about data privacy to gain insight into the privacy landscape." Can you give us some context as to what you mean by 'privacy data sets,' maybe your approach to creating them, and what you mean by 'gaining insight into the privacy landscape'?

Jeff Jockisch:

Yeah, absolutely. I guess that's actually a few questions.

Debra Farber:

It kind of is. I'm sorry.

Jeff Jockisch:

Well, I guess it sort of gets to the core of who I am. You know, I sort of grew up in technology and marketing technology, sort of the other side of the equation from privacy. I actually worked in a search engine for a long time before I got into privacy, and that was my introduction to it, doing some work in CAN-SPAM and COPPA as part of an SMS-based search engine. I won't really go into that because it's a little bit deeper, but I really built a lot of data sets when I was doing that, essentially building out a knowledge graph on the back end of a search engine. That's really where I cut my teeth on data science and building data sets. I really loved doing that.

Jeff Jockisch:

So, when I got into privacy, I really loved the privacy world, but I didn't really want to do what everybody else in privacy did.

Jeff Jockisch:

I didn't really want to work on compliance.

Jeff Jockisch:

I realized as I was studying for my CIPP/US that the way I was studying for it was building data sets of privacy laws to study them and building data sets of privacy podcasts that I was listening to, to learn.

Jeff Jockisch:

It occurred to me that I like to build structured datasets and that I could do that in the privacy world. I just really fixated on that, and I started building massive datasets around different things. I've got a huge dataset of privacy laws - all the privacy laws in the United States - and not just a list of them, but all the different attributes and things like that. There are probably law firms and organizations that do that, but I built a lot of other things too, like datasets of privacy podcasts, datasets of all the different privacy regulators across the world - I think I've got maybe one of the largest ones of those - and datasets of data brokers, and all aspects of data breach laws in the United States and all the different things that happen there. When we start analyzing those datasets and looking at all the different attributes, you really learn a lot, and that's where you get those insights.

Debra Farber:

That makes a lot of sense and it definitely explains - you're a "data guy." That's how you got into doing privacy datasets and it shows that people get into privacy from all different angles, whether it's being a developer and coming in and starting to be a privacy engineer writing code, or privacy architects coming over from other spaces to work on privacy, or compliance folks, GRC folks, lawyers. You are absolutely the first person that I have ever met that has worked on datasets, which really drew me to you a few years back when I saw the work that you were making public. Now I know you recently started working for a company called Avantis Privacy to work on location privacy and deletion; and, I think we're going to center some of our conversation today mostly around 'location privacy.' I'd love to hear you elaborate on the work that you're doing there.

Jeff Jockisch:

Sure, that was actually a little bit of an outgrowth of what I was doing at PrivacyPlan - I'm sort of still doing it at PrivacyPlan. As part of my research there, I built a dataset of data brokers. I found that nobody in the world, it appears to me, was really tracking them. There are actually a few organizations and, frankly, there are a couple of state laws that require data brokers to register - one in California, one in Vermont, and a couple other states are trying to pass some of those laws now; but only a little less than 1,000, probably close to around 800, data brokers have actually registered. But we know, way back in 2014, the FTC estimated there were about 4,000 data brokers, and there are probably a lot more now. We've actually got a database of 3,000 - actually, 3,200 - data brokers right now.

Debra Farber:

Wow.

Jeff Jockisch:

Well, painstakingly, I built that dataset over three years now from a variety of different sources, starting with that 800 and building it out in a variety of different ways. Part of that's proprietary methodology, but a lot of it is just scouring the internet for datasets where somebody says this is a data broker, that's a data broker, and combining every dataset of data brokers I can find and growing it from there. Then a lot of people saw the lists and said, "Hey, I'm a data broker, and these are my data sources." If you just keep adding all of that up painstakingly over time, you end up with a lot. Sometimes I go into individual markets and say, "Okay, well, who are the healthcare data brokers? Who are the political data brokers? Who are the location data brokers?" and have to individually, in different market segments, try to find who the brokers are. It's just a lot of work.

Debra Farber:

Absolutely sounds like a lot of work. That's an astounding number. I'm not surprised that there are that many data brokers, but I am in awe of the fact that you've been able to identify and kind of tag that many. So, with Avantis Privacy, you've been working on helping to delete geolocation data. What is your approach there?

Jeff Jockisch:

Yeah, so what happened was, as I was building this data broker dataset, one of the guys from Avantis - one of the original two founders there - approached me and wanted to chat. I actually did some consulting for them. I ended up coming on board with that organization. Now I've become a full partner there.

Jeff Jockisch:

Avantis' approach originally was to be a data deletion service, sort of like Incogni, Optery, Privacy Bee, DeleteMe. One of the things we realized early on is that nobody was deleting location data. Part of that is because the location data brokers, until the FTC sued Kochava last year, really were not letting people delete their location data. It was not really an option. They all claimed that that data was anonymous and therefore not personal data and not necessarily deletable. Once the FTC sued Kochava, they very quickly started putting changes into their privacy policies and throwing up delete pages where you could enter your MAID, maybe an email address, and delete that information. Now it's possible to delete your location data, but people don't know who those brokers are. They don't know how to find their MAID, their mobile advertising ID. They don't even know it's a thing.

Debra Farber:

Before this conversation - and I've been in privacy; I talk about location data - I didn't even know, since I'm not an expert in brokers, that location brokers were kind of a separate category from data brokers. They are. Break down the difference there, because that is definitely something I think people would be interested in learning more about.

Jeff Jockisch:

Yeah, we've got 150 location data brokers identified that are storing your mobile advertising ID, as well as a series of lat-longs - places where you've been: your home, your office, your shopping locations, your church, your gym, every place you've been, and all the routes you used to get there. They've got all this information and they're selling it, frankly, to just about anybody with a credit card. It's not necessarily quite that easy to buy it, but it's not hard either.

Debra Farber:

Yeah, let's explore that. What are some of the current risks to our location privacy? I know there's a lot to expound on here. I'll let you answer first and maybe we'll go deeper.

Jeff Jockisch:

Well, there are a lot of risks to your location and a lot of ways that your location leaks.

Jeff Jockisch:

We're actually trying to put together a little bit of a categorization system of what those different location risks are and where your location ends up. Frankly, your MAID is really only one of those vectors, but it's a very large one and one that people don't really understand. That's one of the reasons we're focusing on it. You can delete your profile information from data brokers, and that'll get rid of things like how easily your address is exposed on people-search websites. That's something that the other deletion services can do reasonably well - maybe not perfectly, but decently well. But they're not getting at this other MAID information. Then there are other ways you're exposing your location information, too, like social media posts. Your car, when you drive it around, exposes your information, sometimes through your mobile ID information, also through automated license-plate readers and some other ways through your car. Plus, your phone can also leak data through cell tower information in a variety of other ways, though those are a little bit harder for non-law enforcement organizations to get at.

Debra Farber:

What are the restrictions there? Actually, before I get to why it's hard for non-law enforcement organizations to get at some of this data, I did want to expound upon the risks to our location privacy. What are some of the privacy harms that could occur? Obviously, there's tracking.

Jeff Jockisch:

Sure, actually, I was talking about the types of data rather than the risks.

Debra Farber:

Yeah, that's fine, you could switch gears.

Jeff Jockisch:

Yeah. Well, the risks depend upon who you are. If people have your location and you're an at-risk target, they can do a lot. First of all, it's a physical safety risk. If people know where you are, they can come and get you. If you have enemies, if you have somebody who's pissed off at you because you said something on social media, they can find you. They can come after you. They can SWAT you.

Jeff Jockisch:

We know this is a growing problem. They can send people after you, which is bad. If you've got a stalker, they can come to your house. They can find out where you work and go there. They can intercept you on your way places. That's a really huge problem for certain types of people: celebrities or athletes or CEOs, people who have stalkers, domestic abuse victims, things like that. Even if you're just a regular person, you really don't want your location to be out there, because identity thieves can use that information to commit crimes in your name. If they know where you live and they know lots of things about you, it's much easier to impersonate you. Those are a couple of the biggest risks. I guess I'll leave it there. There are other ones as well.

Debra Farber:

So this might be one you're thinking of and holding back, but I do want to bring up a risk in the post-Roe v. Wade world. In a state like Texas, with Roe being overturned, pretty much anybody can accuse someone of having an abortion or seeking an abortion, which is now against the law there. Governments can use law enforcement, or can obtain this data, to find someone and prosecute them; or, maybe through a health app or health data set that they get their hands on, they can tell that you've been near, or at, an abortion clinic, or that you went across state lines to an abortion clinic, or something along those lines. Right? So, law enforcement can even be a threat actor here to one's privacy, which is kind of scary.

Jeff Jockisch:

Yeah, law enforcement, or just your neighbor who doesn't like you, right? It's pretty scary. And, to be clear, it's not just about women that have abortions in those circumstances. If you get pregnant, there are just as many circumstances where you don't actually carry the baby to birth and it's not an abortion - where it's potentially a miscarriage or some other circumstance - but somebody looking at that from the outside might think it was an abortion when it wasn't. Right? And they can accuse you of having an abortion and put you through all kinds of circumstance and stress based upon some of these new laws that are put in place. And that's pretty horrendous.

Debra Farber:

I agree. I think there's definitely some fundamental harms that can happen as a result of getting access to that geolocation data. So, I guess to that point, I wanted to understand from you whether or not these location data brokers are always getting direct, precise information, or are they also inferring someone's location based on behavior or some other elements?

Jeff Jockisch:

So most of the data that we're looking at is actual location data that is based upon your GPS.

Jeff Jockisch:

Some of it can be Bluetooth, some of it can be Wi-Fi, some of it can be cellular, but generally it's translated and then connected to your mobile advertising ID number. They can do inferences and stuff like that as well, and a lot of them do, but oftentimes those inferences, I think, are more like trying to impute, based upon your path and stops, whether you were at one particular place or another. I don't know if you've ever seen this in Google Maps, but sometimes you park in a parking lot and it doesn't know if you went to the grocery store or the Dunkin' Donuts right next to it. Those are the kinds of inferences these folks might make. Did you go there, or did you go to the mosque in the corner of the shopping mall? So if you go to a mall that has a mosque in the corner, they might all of a sudden think that you're going there instead of the Dunkin' Donuts next to it.

Debra Farber:

Yeah, that makes a lot of sense. And I could see how that can be misused as well. You previously said that the geolocation brokers in the past didn't really consider themselves data brokers, and I think you hinted that that's because they had anonymized data.

Jeff Jockisch:

Yes.

Debra Farber:

And so, can you speak to that? Why is anonymization not enough? Why does that not take them out of being covered by data protection laws or privacy laws, or by the ruling, the Kochava ruling? Just speak to us about geolocation and anonymization and some of the challenges.

Jeff Jockisch:

Sure, sure. Anonymization has always been a way to essentially avoid privacy laws, and the problem is that for a while now we have known that anonymized data can often be deanonymized, and location data is particularly vulnerable to this kind of re-identification. If you take three precise geolocations - especially if there's a time element attached, which all of this stuff has - take three of those data points, or even four, you can identify Americans with 95% to 97% accuracy. So, if I have four data points on you, just randomly from some set of location data, Debra, I can identify you, let's say 95% of the time, as exactly you.

Debra Farber:

Yeah, that's crazy.

Jeff Jockisch:

Because probably one of those data points is your house, one's your office, and one's on your way to work on the highway, and I can tell just from looking at those three or four points of data that it's you, because it can't be anybody else.

Debra Farber:

Right, because it's almost like a behavioral pattern that's being recognized.

Jeff Jockisch:

Yeah, that's what it is. Right? And this is proven, statistically proven. There are a couple of articles - there's one from two years ago in Nature that proves this - so it's impossible for these people to say that it can't be re-identified. But we've actually known this for a long time. I think it was back in 2011, 2012. Latanya Sweeney - I forget where she's from, maybe MIT - has done this research, so we've known about it for a decade.
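[Editor's note: the re-identification point above can be illustrated with a toy sketch. Everything below is made up for illustration - the device IDs, the places, the data shape - and real broker datasets and matching methods are far larger and messier; the sketch only shows why a handful of "anonymous" pings usually narrows to a single device.]

```python
# Toy sketch of trajectory re-identification: a few (place, hour) pings
# are usually enough to single out one device in a location dataset.
# All IDs and places here are invented for illustration.

pings = {
    "maid-001": {("home-A", 8), ("office-X", 9), ("gym-Q", 18)},
    "maid-002": {("home-B", 8), ("office-X", 9), ("cafe-R", 13)},
    "maid-003": {("home-A", 8), ("office-Y", 9), ("gym-Q", 18)},
}

def reidentify(observed):
    """Return the MAIDs whose trajectories contain every observed ping."""
    return [maid for maid, traj in pings.items() if observed <= traj]

# A single ping is ambiguous, but three "anonymous" pings match one device:
print(reidentify({("office-X", 9)}))                               # two candidates
print(reidentify({("home-A", 8), ("office-X", 9), ("gym-Q", 18)}))  # -> ['maid-001']
```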

Debra Farber:

Right. And yet, because there weren't any regulations saying otherwise, so many ad brokers' response was just to anonymize. And I know you wrote about privacy theater. On LinkedIn I recently saw your post that reads, quote: "Location leakage coupled with 'consent theater' is an existential threat to our privacy. For many, it's a threat to their physical safety and well-being." Can you elaborate on this existential threat and what you mean by consent theater? I mean, I have my ideas, but I obviously want to hear from you.

Jeff Jockisch:

Yeah, well, the way this information gets collected is primarily from your smartphone, from apps. Right? You are going to go and load up your smartphone and you're going to turn on a weather app, because you want weather. You need to know if it's going to rain tomorrow or if it's going to rain in the next 15 minutes. I've got a weather app that I use that tells me, when I look at it, if I need to bring out the umbrella in the next 15 minutes, and it's highly accurate. It's got great data. But it also sends my location to a data broker, and that location is connected to a mobile advertising ID, and it's going to get sold to Venntel and Kochava and Gravy Analytics and a whole bunch of other folks like that. What people don't realize is that that's happening - that because they put this weather application on their phone and said, "Yeah, I want this application," they're essentially agreeing to give their location information to that application; and that application is going to monetize it by selling it to these location data brokers. They don't even know that this ecosystem exists or that the information is going to get sold, and then that information is going to go to anybody that wants to buy it, including law enforcement, Homeland Security, the FBI. Anybody can essentially get that data. And when the government's getting it, they're essentially circumventing the Fourth Amendment, because they can get it without a warrant.

Jeff Jockisch:

Identity thieves can find easy ways to essentially buy that information. If I've got a stalker, that stalker can buy - well, they can't necessarily buy just my data. They can't walk in to a data broker and say, "Give me Jeff Jockisch's location." But if they happen to know what state I'm in, for instance, they could probably buy a swath of data in my state or my city, something like that, and then, based upon that blob of data, they could probably figure out who I am - if they know a little bit about me - and figure out more about me. So if they knew, for instance, where I worked and my name, they could probably figure out who I am. That makes sense, right?

Debra Farber:

Yeah, no, that definitely does make sense. And then there are all the consents, the check-the-box that everyone gives to the privacy policy just to move forward with installing an app on their phone, or gaining access to a website, or creating an account. I'm assuming that's the consent theater part: we can't possibly manage all of these relationships and understand and remember everyone we've consented to using our data, and for what purposes.

Jeff Jockisch:

Right. For me, the consent theater is this: I don't mind giving my location to the weather app; but what they're not telling me, when I say "OK" to that, is what happens to my location information downstream. It's completely non-obvious to me, or to anybody else, that that location information is going to be sold and is going to end up with Homeland Security, with potentially a threat actor, with potentially an identity thief. How is that possible?

Debra Farber:

Yeah, I mean, it shouldn't be. I hate data brokers, personally. I know that's something that Heidi Saas, who's a member of the Avantis team as well, is always saying: "I hate data brokers." I'm not afraid to say it here. For the most part, they're trading on unsuspecting people who may have technically given consent without really knowing what they're consenting to, and using their data, sometimes selling it to the detriment of those people. How can people protect themselves from having their data collected and sold by data brokers and location brokers?

Jeff Jockisch:

Sure. Well, I mean, there are really two phases to this. Right? We can help delete the stuff that you've already leaked - Avantis can help you delete that historical information - but you've also got to stop leaking the information. You've got to stop giving consent to these apps, and that can be a harder thing. We're also working on something that may be able to help with that, but that's for a later time on a later show.

Debra Farber:

Oh, excellent, because I was going to ask whether you think it's incumbent upon the phone manufacturers, who control - whether it's firmware or software - the pop-ups that tell you what a particular app is going to use your data for. I find that those topics, 'use it for marketing,' 'use it for customization,' the reasons that developers can select, are very broad, so as not to really tell you the real purpose behind the data use downstream, but more of a category that can be selected just so that they can move forward and start the data collection. I feel like it needs to be more granular, and/or apps should be able to notify individuals about a more granular use of personal data.

Jeff Jockisch:

Yeah, I definitely agree. It needs to be more granular. It would be great if there was something that said, "Hey, we're going to use your data, but we're never going to sell it," too. So, like, "I'm going to use it for my marketing, but I'm not going to sell it to anybody who can sell it onward." It would be awesome if I could give my location to my weather app without having to give it to the whole world - like that checkbox.

Debra Farber:

Yeah, wouldn't we all? It seems like that should be the way things are. Right? Optimizing for the benefit of humans and not necessarily for corporations to exploit them for the purposes of just making money off their data that they didn't consent to. So I'm with you on that.

Jeff Jockisch:

Yeah, I actually think, Debra, that some of this is changing. I think some of this consent theater, as I've referred to it, is going to change a bit. I think the consent boxes are probably going to change in the next couple of years, and there are companies like Apple that are taking somewhat of a leadership role there. In terms of MAIDs, because of the ATT framework that Apple rolled out, there are a lot of people that have opted out. So there are fewer people on Apple's operating system - iOS - that have their mobile advertising IDs turned on now than on Android, for instance. That's good. Right?

Jeff Jockisch:

However, a lot of those people still have a lot of historical location data out there. So, even if you're running an iPhone and you're like, "Oh yeah, well, I turned off my mobile advertising ID" - well, that's awesome, right? Except, even if you turned it off, say, six months ago, all the data that you let those companies collect for years is still sitting out there. Now, some of it might be older and maybe less valuable, but if you still live in the same house, go to the same office, take the same routes, all that data is still sitting out there for somebody to use against you.

Debra Farber:

So, that's a really great point. All it does is stop the data collection, but not necessarily purge it. So, how can individuals gain more control over their own data, especially since, at least in the United States, we don't have an equivalent of the GDPR; we just have some states, like California with the CCPA, and a few other states that have kind of followed suit with a privacy law?

Jeff Jockisch:

Well, like this: Avantis can help you try to purge that data with our location purge because, with Apple, you can actually go back in, turn your MAID on, grab that MAID number, send it to us, and we can purge that information for you. It's actually harder on Android. On Android devices, only about 3% of people have turned off their mobile advertising ID, which is sad. However, there's a problem on Android, and it's this: if you have turned off your advertising ID, there's no way to know what it used to be. So all of that historical information is now no longer reachable. You can't turn your Android ID back on, figure out what it was, and then send that MAID to us so we can delete that past history - unless you can figure out some other way to learn what your ID was. It's like unreachable historical data. You know what I'm saying?

Debra Farber:

Yeah, well, security-wise, that's potentially a data availability problem now. I mean, arguably, could that be a good thing - it's no longer identifiable, potentially - or is it still going to get bought and sold?

Jeff Jockisch:

Still going to get bought and sold, and somebody, anybody who grabs more than three or four points of data can re-identify it and figure out it's you.

Debra Farber:

Right. So if somebody deletes their ID off of Android, are you basically saying that Google will still track that ID even though that person is no longer using it? It's not as if it's considered, "Let's purge that data; it's not good anymore"?

Jeff Jockisch:

It's not that Google's tracking you, right? Because at that point, Google's no longer sending new information based upon any mobile advertising ID. But they've already sent it out, and those brokers have collected it, and it's sitting in their data sets. So if you re-enable your mobile advertising ID, it generates a new one, so then they can start tracking you again, but it's based upon a new mobile advertising ID.

Debra Farber:

I see, I see. So, I think we've been talking about how Avantis can help people protect themselves from having their data bought and sold, and how to request purges. But what about organizations? What can they do better to protect people's privacy when collecting and using geolocation data?

Jeff Jockisch:

So if you're talking about a commercial interest, I think that's pretty interesting. We're actually talking with some organizations now about how they need to start thinking about this as a threat mitigation problem, especially if you're talking about people who may be targets for attack in your organization, whether that's the C-suite, your IT personnel, or other vulnerable individuals. Attackers are going to come after them based upon information they can find that's publicly available, right? So some organizations are going to these data deletion companies and trying to delete all the information on their C-suite employees and their IT employees. Frankly, they should probably be doing it for all their employees, but at least for the ones that are most vulnerable. But right now, they're ignoring all the location data, which is a big vulnerability, so they should be thinking about deleting that location information as well. It's a big hole.

Debra Farber:

Yeah, definitely. That sounds like a real challenge for corporate security, and I do appreciate the answer. But what I was trying to get at is: should companies, for instance, not collect geolocation data in the first place? Or is there a way to do this in a more manageable way? If they do collect it, is there a privacy-enhancing technology or architecture that can be used to better protect people's privacy?

Jeff Jockisch:

You know, I'm not sure I have a great answer to that. I think maybe the best way to be careful with that data is not to collect precise geolocation, but to collect coarser geolocation information, because generally you don't actually need that data to be precise.

Debra Farber:

Is it more that people would want to know what state you're in, and that's the level? Or could it go deeper than that, down to county, and still be okay?

Jeff Jockisch:

Yeah, I mean, that's not bad, right? I mean, if you can pinpoint it down to the house level, that's clearly a problem; I'm not even sure what the exact fine-grained level is that becomes problematic. But if you were to blur that data out so it was within miles instead of feet, you probably wouldn't have an issue with it, right? Say within a mile radius, whether you're in urban or rural areas. It's not really going to change my weather report, and it's probably not going to change a lot of other things. Maybe it changes things for the people trying to figure out what store you're going to, which could be problematic for certain applications. But for a lot of use cases you don't need precise geolocation, and they're storing it anyway.

Debra Farber:

Yeah, that makes sense; that's helpful. I do wonder, for people listening, whether there's an opportunity to specifically focus on location data. I know Privado, the sponsor of this podcast, actually does quite a bit of this: you can scan your code to make sure that location data is behaving as intended and that, as an organization, you're not collecting sensitive data you didn't want to, which can put you at risk or harm privacy.

Debra Farber:

But it seems to me that there is definitely an opportunity out there to educate companies on how to protect people and not harm them, especially when it comes to precise geolocation data. Perhaps it's as you say: you collect less precise geolocation data, and you work with a statistician or somebody who can make sure the data is de-identified and not re-identifiable, make sure the granularity is not going to harm individuals, and do that threat identification and threat modeling for the product before you ever ship. That will do wonders. For people listening, you can refer to the previous episode on threat modeling with Kim Wuyts to learn a little more about threat modeling approaches and when to apply them. I do think that having the right experts on board to assess whether this meets the privacy bar, not just compliance, but actually protecting the people behind the data, is pretty essential.

Jeff Jockisch:

Yeah, I definitely agree. I think if you change the granularity, you could vastly decrease the reidentifiability of the location data.

Debra Farber:

Excellent.

Jeff Jockisch:

And it might not be that hard to do, actually. I mean, if you have lat-longs, you could just lop off the last few decimal points, potentially, and you'd be there.
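A minimal sketch of that idea: rounding latitude and longitude to two decimal places coarsens a fix from meter scale to roughly kilometer scale, since one decimal degree is about 111 km at the equator. The function name and the choice of two decimals are illustrative, not something from the conversation:

```python
def blur_coords(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    """Coarsen a coordinate pair by dropping decimal precision.

    One decimal degree is ~111 km at the equator, so keeping 2
    decimals preserves roughly kilometer-scale accuracy: enough
    for a weather report, far too coarse to pick out a house.
    """
    return (round(lat, decimals), round(lon, decimals))

# A precise fix near a specific address...
precise = (40.712776, -74.005974)
# ...becomes a ~1 km cell after blurring.
print(blur_coords(*precise))  # (40.71, -74.01)
```

Snapping every point to the same coarse grid like this also shrinks the number of distinct (location, time) combinations, which directly reduces the re-identifiability Jeff mentioned earlier.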

Debra Farber:

Fascinating. Well, it's good to know you don't think it's too complex. Hopefully this is some food for thought for the privacy technologists in the audience. Jeff, is there anything else, any other resources you would point people to if they're concerned about data brokers or location privacy? You could also plug your website for learning more about privacy data sets.

Jeff Jockisch:

Yeah, well, I think there's a lot of stuff on PrivacyPlan about data sets. I'd also say follow me on LinkedIn; I do a lot of posts there, and I promote a lot of other people there too. And check out my privacy podcasts data set. I've been a little bit negligent, but I maintain a huge data set of privacy podcasts, and frankly, Shifting Privacy Left is finally in the data set, so that'll be awesome. And yeah, I'm going to be doing a Top 10 list here in the next couple of weeks. Debbie Reynolds has been on me to redo it because I haven't released one for a while, and I think there are something like 200 privacy-focused podcasts in my data set.

Debra Farber:

That's just so wonderful. I really look forward to that when it comes out. I've seen the list, and I have looked at the database, and I'm really glad that you finally added Shifting Privacy Left. We're relevant!

Jeff Jockisch:

You're not just relevant; you're actually high up there now too. I haven't actually pushed the top lists yet, but you're in there, so that's awesome.

Debra Farber:

Oh, that's excellent; it's really great to hear. Like I said, I'm really looking forward to seeing that. And what else? Before we go, let me have you discuss your podcast, Your Bytes, Your Rights, a little. Do you mind giving us a quick overview of the types of things you discuss, when it takes place, and how people can participate?

Jeff Jockisch:

Yeah, we've actually been on hiatus for a bit, but Christian Kameir and I are going to be ramping it back up. We actually had an episode a couple of weeks ago, and hopefully we're going to get back to a weekly rhythm. Your Bytes, Your Rights is a weekly interdisciplinary discussion on data rights. We focus a lot on privacy and data ownership and everything that goes along with that. We try to bring in people not only from privacy but from all kinds of related disciplines to talk about how we can better own and demand our rights to our data.

Debra Farber:

Yeah, and I've participated in several of those discussions; they've been really enlightening, and it's great engaging with other experts in the field, so I encourage others to attend once you get them going again. No worries about taking a hiatus; I took a few weeks off this summer myself. There's just so much going on in the field that we have to remember we're human, we need to rest at times, and you can only get done what you can get done. But I look forward to participating again.

Jeff Jockisch:

Yeah, absolutely, you were great when you were on. We just finished an episode with Tom Kemp about the DELETE Act in California, which may finally give us a regulation with some teeth in it on data brokers.

Debra Farber:

Yes, that's wonderful. He's going to be on the show in a week and a half, maybe two weeks from this recording, yeah.

Jeff Jockisch:

Awesome. Well, that'll be great. Well, then you'll have all the details on the DELETE Act.

Debra Farber:

On the DELETE Act, as well as his new book, "Privacy and Big Tech," which is definitely an interesting book. We'll talk a little more about that when he's on the show. He's also a seed investor in a whole bunch of privacy tech companies; he's a very interesting individual, and I really look forward to having that conversation. I love that we're talking about similar things, you and I, but taking different tacks. You have this engaging, town hall-style show where anyone can join the LinkedIn Live event and ask questions within that format. It's all complementary, and I really enjoy the value you bring to the field, especially your enormous focus on data sets. Until you came on the scene, I hadn't seen anybody else doing the hard work of pulling it all together and making it publicly accessible. Thanks for your work.

Jeff Jockisch:

I appreciate that. Hopefully we'll put out some new ones and some big ones soon.

Debra Farber:

Great, Jeff. Do you have any other words of advice or anything you want to leave our listeners with before we close?

Jeff Jockisch:

I think just some kudos to you for Shifting Left; I'm loving the new podcast. Thanks for all you do.

Debra Farber:

I really appreciate that very much. It's a lot of work, but I get a lot of personal value and pleasure out of having these conversations. Thank you for being on the show; I'm sure I'll have you back in the future. There's just so much going on in this space. Good luck with Avantis and the new consumer location privacy and deletion tool; I'll be paying attention. If people want to reach out to you to collaborate or ask questions, is LinkedIn the best place, or is there somewhere else they can go?

Jeff Jockisch:

LinkedIn is probably the easiest way to reach me.

Debra Farber:

Excellent. Well, I'll put a link to that in the show notes. Have a great day. All right, take care. Thank you for joining us today on Shifting Privacy Left to discuss privacy data sets, location privacy, and data brokers. Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website, shiftingprivacyleft.com, where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of this show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.
