The Shifting Privacy Left Podcast

S2E3: Fixing Consent & Transparency on the Web with Mark Lizar (Digital Transparency Lab)

January 24, 2023 | Season 2, Episode 3 | Debra J. Farber / Mark Lizar

To kick off Data Privacy Week 2023, I'm joined by Mark Lizar, CEO of the Digital Transparency Lab and Founder of 0PN: Open Privacy Network.

Mark is also the Vice Chair of the IEEE Cybersecurity for Next-Generation Connectivity Systems' Human Control & Flow Sub-Committee and Editor & Lead Author of the ANCR Notice Record Specification and Framework at the Kantara Initiative. 

In our conversation, we unpack the current standards and specifications for transparency and data control in the digital space. Mark shares some of the innovative solutions he and his colleagues are working on to bridge the gap in web consent. 


---------
Thank you to our sponsor, Privado, the developer-friendly privacy platform
---------


Mark unpacks his interpretation of the open transparency standards, laws, and tech required for privacy to scale digitally. One of the major use cases he’s working on at 0PN is called ‘Do Track,’ which is a response to the shortcomings of the current ‘Do Not Track’ mechanism that we have in place today. The Controller Credential Standard allows users to specify or direct consent, and he shares some exciting examples of how users can use ‘Do Track’ to take back control over their own data. 

Mark breaks down the four levels of privacy assurance achieved by the Controller Credential Framework and explains what's needed to gain market traction for this privacy-enabling tech standard. He also gives us a peek into what else they're working on over at the Digital Transparency Lab and how to get involved with the organization and their efforts.

---------
Listen to the episode on Apple Podcasts, Spotify, iHeartRadio, or on your favorite podcast platform.
---------

Topics Covered:

  • A simple way to understand online consents vs. system permissions 
  • Why it’s important to see who's controlling our data 
  • How the new Controller Credential gives people autonomy over their own data
  • International privacy instruments that can be scaled for local use 
  • A new digital model for representing physical privacy 

Resources Mentioned:

  • ANCR Notice Record Specification & Framework (Kantara Initiative)
  • ISO/IEC 29100 privacy framework
  • W3C Solid and Data Privacy Vocabulary (DPV) workgroups
  • Council of Europe Convention 108+
  • Global Privacy Control (GPC)

Guest Info:

  • Digital Transparency Lab: transparencylab.ca
  • Email: Mark@open0PN.org



Copyright © 2022 - 2024 Principled LLC. All rights reserved.

Transcript:

Debra Farber  0:00 
Hello, I am Debra J. Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week, we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models, and ecosystems. Today, I'm delighted to welcome my next guest, Mark Lizar: CEO of The Digital Transparency Lab; Founder of The Open Privacy Network; Vice Chair of the IEEE Cybersecurity for Next Generation Connected Systems' Human Control and Flow Subcommittee; and Editor and Lead Author of the ANCR Notice Record Specification and Framework at the Kantara Initiative. Mark is a pioneering researcher and engineer in trust for identity, championing notice, consent, and trust framework standards that govern digital identity and its management. Welcome, Mark.

Mark Lizar  1:09 
Hi, thank you for having me.

Debra Farber  1:11 
I'm really glad that you were able to join us this week. It's a special week, since it is Data Privacy Week, so I'm glad to have you on this special episode for Data Privacy Week 2023. I know we have a really focused discussion ahead of us. You've been working in the weeds on a lot of the standards and specifications for transparency and control of data on the web, and I'm looking forward to unpacking that with you. So, before we get further into the weeds, could you maybe give us some of your definitions to illustrate how you're approaching digital identity, transparency, and consent on the web? Let's start with the first one: you often state that online interactions today are more about "permissions," as opposed to "consent," even though it's being framed as consent. Can you unpack that for us? What do you mean by that?

Mark Lizar  2:07 
Well, yeah, that's an incredibly loaded question, and it's something I've been looking at, working on, and trying to figure out for a large part of my career. Now it seems really clear, and it's simple: systems ask for permissions from people, and people provide consent for a purpose, not for read-and-write control on a database or a field. So there are two different things going on there. Permissions inside of a system, for technical things, get bundled into a "consent," and that's different from what people understand as consent, or what has legally been defined as consent. Technically, we've sort of been convinced that a permission is a consent, when really, if you don't know who you're giving consent to, right away it's not legal. The identity of the service provider would have to be there. And then there are terms and conditions, which are a different legal justification than consent. That's a big topic, too.

Debra Farber  3:17 
Yeah, go ahead with that; tell me one more bit. How are Terms of Service a different model of consent versus the permissions model of a system that you were just referencing?

Mark Lizar  3:27 
I mean, there's a lot of history there, but the short version is that recently the European Data Protection Board gave Meta / Facebook a very large fine, saying that you can't put consent into a contract; it's not part of the terms and conditions. So when you click "I agree" to these terms and conditions, which you've done with Facebook all this time, it's not consent, and...

Debra Farber  3:54 
It has to be separately disclosed consent for the use of your personal data, not buried in the terms and conditions. That's the consent we're talking about here, right?

Mark Lizar  4:03 
Right. Well, it's even slightly more nuanced than that. The terms and conditions are really a contract, so that's a separate legal justification than consent. So under a contract...

Debra Farber  4:17 
You're talking about GDPR-style rules, where you have a legal basis to process personal data because it's part of a contract.

Mark Lizar  4:26 
That's right. And, you know, International Privacy Day comes from the Council of Europe's Convention 108, now 108+, which has an enforceable legal framework that's going through a lot of process to be ratified in 2023. This international law also defines the use of the same six legal justifications: consent, contract, legitimate interest, the vital interests of the person, legal obligation, and the public interest. Those six legal justifications authorize and provide the authority to process your personal data. Whatever law it is in the world, it can be mapped to those six things, and those six things have rights and obligations associated with them.
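
[Editor's note: to make that mapping concrete, here is a minimal TypeScript sketch of those six justifications as a type. The type name and comments are illustrative aids, not any standard's schema.]

    // The six legal justifications for processing personal data, as
    // enumerated in GDPR Article 6 and Convention 108+.
    type LegalJustification =
      | "consent"              // freely given, for a specific purpose
      | "contract"             // necessary to perform a contract with the person
      | "legitimate-interest"  // the controller's legitimate interests
      | "vital-interest"       // protecting the person's vital interests
      | "legal-obligation"     // processing required by law
      | "public-interest";     // tasks carried out in the public interest

    // Each justification carries its own rights, obligations, and enforcement
    // route; e.g., a regulator complaint for consent vs. a civil suit for contract.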

Debra Farber  5:20 
That makes a lot of sense.

Mark Lizar  5:21 
So the big thing about contract is that it's basically a different type of law, so you enforce it differently. You have to take a civil action; it's like a tort, where you're suing for breach of contract. That's a different mechanism and format for dealing with a problem, right? So it's...

Debra Farber  5:43 
...dealing with harm that someone suffers.

Mark Lizar  5:45 
Yeah, that's right. And children can't really defend themselves, and there are big power imbalances: all these good things and bad things about contract. Privacy law is not the same. You have a regulator who enforces it, and there are basic rules, a code-of-conduct culture, if you will, about how to respect people's data. That's really what a privacy law is about. In that way, you can just go make a complaint, right? It's another mechanism for enforcement, and it's really something that's intended to be safe for children, if you will. Privacy law would be something that children could use or access, and parents would give consent for the use of children's data. With contract, on the other hand, they can change it every week: "We've updated our terms and conditions, click this tick box," and it's different. Privacy law is really consistent; people can learn it and trust it, and you can have a common, standardized way to use it, whereas contract can be changed constantly. So there are different sorts of flexibilities with the different types of law.

Debra Farber  6:53 
Got it. And so, in the conversation we're about to have today, we're going to be talking about true consent: consent that enables you to manage your own personal data, with true accounting of it and control over it, versus you consenting to a contract, or thinking you consented to a contract. Even if in the EU it's no longer considered consent to put the use of your personal data inside a contract, there's still the element of getting stuck in somebody else's client-server relationship, right? You're in their system, and they're giving you permissions based on parameters. So the permissions there are really operated by the companies' systems, as opposed to a true model of consent. Did I get that right?

Mark Lizar  7:42 
Yeah, well, I've come up with a simple way to explain this to people. When you come across the word "consent" online, replace it with "permissioning," and it will probably make a lot more sense. Where consent is actually being required, if you replace the word "consent" with "authority," that sentence will make a lot more sense. So if you see the word "consent," try the word "authority" or "permission" on it, and you'll see what they're really asking. What's behind it is who's controlling the data and who's setting the rules. You can consent to the purpose of a service: like, I'm going to use this browser to surf the web, and I consent for it to take my data to enable me to do that. But when you get to a website and they're taking your data or putting a cookie on your device without your permission, without a notice, then it's definitely not consent; you can't consent after you're already under surveillance. That's really where I think there's a major issue with the CCPA: it assumes that you're "opting out" of surveillance into privacy, when that's not really possible. So there's that ethical point of thinking about it, but ultimately consent has to be to a purpose, and "freely given," as they say. So if you controlled your consent records independent of all the services, for example, if you had a record of your own consent instead of cookies, where everyone else holds the record of your consent, then you could, with one button in one context, press "withdraw consent," and it would set all the permissions in all the systems. That's what we're talking about when we're talking about consent standards: standards to set permissions across systems.
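
[Editor's note: here is a minimal TypeScript sketch of that "one button sets permissions across systems" idea. The record shape and the notify callback are illustrative assumptions, not the Kantara/ANCR specification.]

    // A user-held consent record: the reverse of a cookie, kept by the person.
    interface ConsentRecord {
      controllerId: string;                        // who the consent was given to
      purpose: string;                             // what the consent is for
      status: "granted" | "paused" | "withdrawn";
      grantedAt: Date;
    }

    // One action by the person propagates a permission change to every
    // system that participates in the relationship.
    async function withdrawAll(
      records: ConsentRecord[],
      notify: (controllerId: string, status: ConsentRecord["status"]) => Promise<void>
    ): Promise<void> {
      for (const record of records) {
        record.status = "withdrawn";
        await notify(record.controllerId, record.status); // signal each controller
      }
    }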

Debra Farber  9:34 
I love it. I think it's great, because all of what you've been working on is open source standards and specifications. It's really adding to the toolbox for the web, as opposed to creating, I don't know, another privacy tech company that sells access to a set of tools. Not that there's anything wrong with that; I just like how ubiquitous it is, and that anybody will be able to take these tools and put them to work.

Debra Farber  9:59 
Okay, so there's so much to talk about. We are going to get into some of the specifics here, but first I want to get through some other statements I've seen you make. You've stated that "if we can't see who controls our data, we are not free to consent or trust the services we use." I think you just touched on it a little, but do you have anything else to add? What does free consent mean to you, and how do you go about trusting the services we use?

Mark Lizar  10:25 
So this was a quote that actually ended up in an OECD paper on the identity of self; I'd have to look up the title. It was quite a while ago, back around 2012, and it was really this: if you can't see who's watching you, if there are a lot of third parties getting data about you, then you can't trust any of the signals you're getting. You don't know who's cheating you or not, right? It's the same as in physical space, and that's why I think the U.S. was really leading back in the day, when they said there can be no secret information taken about you, which was part of the privacy practices until they updated that. It was about secret surveillance. So when we talk about consent in digital space, we're talking about consenting to surveillance: you can see people watching you, and you're doing it because you want them to, or because you're part of a society where those are the rules everybody has to abide by. And I think that goes to the root.

Mark Lizar  11:32 
The Canadian Charter of Rights and the American Constitution, you know, are about that: that agency.

Debra Farber  11:39 
I just love talking about this issue, because I haven't really heard people talk about the difference between free consent and trusting services, and how humans trust other humans or other things. I think I've heard you mention before that a system being secure is different from trusting that no harm will come to you; that's above and beyond, and a different type of trust in an organization. I might trust the system that a company puts out there without trusting the company generally to be ethical, to be good, to use my data in an appropriate way, or whatnot. It definitely underscores why seeing who's controlling our data is important.

Mark Lizar  12:24 
This goes back to the Stasi network in East Germany, where the neighbors were spying on everyone. If you don't know who's watching, there's really low freedom; there was no social freedom at the time. It was a really tense period, and I think there's that type of oppression in the whole Big Brother narrative and the control of information. So if you don't have any control and the systems have all the control, and there are all these third parties; APIs, for example, are notorious for leaking your data. So when you use digital services, or even privacy services, it can make you really insecure and open you up to a lot of risks, because you're using digital identifiers that aren't standard, that are used by third parties you don't know. They could be a foreign state actor. So the privacy infrastructure itself is not secure, in that respect. Right?

Debra Farber  13:22 
Well, the web infrastructure wasn't secure to begin with; we had to add that on. And privacy protections weren't really there either; arguably, that's still in the works, especially with the work you're doing. So it's no surprise privacy is about 10 to 15 years behind security. Okay, so Mark, you've also said, "For privacy to scale digitally, it requires open transparency standards, laws and technology." What do you mean by open transparency, and why is it so important for standards, laws, and tech?

Mark Lizar  13:53 
Oh man, that's such a great question, and I think it really explains why it's taken so long to bring something forward in this space: a lot of these services and everything we use online are closed. Privacy policies are closed. Terms and conditions are closed. The controls over your data are closed, and a lot of it is industry-driven. Transparency for people has to be a public utility, digitally. So it's taken a really long time for the open standards to develop, and a lot of people say that ISO/IEC standards are not open. So what we've done...and so I've worked with...

Debra Farber  14:36 
Why? Tell us why.

Mark Lizar  14:37 
Well, you have to pay to access what the standard is and what's in it, right? So you can imagine the exclusivity: in many ways, even learning the system of how to contribute to a standard, let alone the big industry that might control it. Though with ISO it's not really big industry; it's big nations, right? They have a lot of power and a lot of people to put behind standards, and that's really defining the battleground in the standards space internationally. So we're talking about security over privacy, which is like the Chinese approach, and the terms-and-conditions governance framework, which has really been the American approach to the net. So there's this international space in between governments where standards have been worked on for a long time. January 28th is the anniversary of the Council of Europe's Convention 108, which has been updated to 108+ and is being ratified by a lot of countries; I think 55 countries are in 108 and 108+, which is more than the EU. That's enforceable privacy law in international, and I would say Internet-capable, space. Of all the ISO standards, the one we work with is ISO/IEC 29100, the privacy framework, which is actually free to access. That's what has enabled us to do this work.

Debra Farber  16:12 
Excellent. I think we've whetted the appetite on the actual standards and specifications you're working on, but first I want to find out a little about your journey, and why consent and transparency are so important for you to get right. I know that for many years you've been working on web standards to give humans control, for free, on the web, over how their information is collected and used. Tell us a little bit about your journey. How did you get here, working on these standards and committing a large chunk of your career to this particular area of web privacy and transparency?

Mark Lizar  16:55 
Well, back when I was leaving university, I got really excited about the Internet and the web. This was around 1999, before the dot-com boom, and at that time I thought everybody could control their own data; that was just my automatic assumption. I got a job working in identity management, and I quickly learned it had nothing to do with what I'd learned in university about identity. So I found myself, about eight or nine years later, in England doing a master's in social research on surveillance, doing a lot of work on surveillance and identity, and trying to separate our human identity from our digital identity online.

Mark Lizar  17:34 
Back then, we started a campaign called The Biggest Lie on the Internet, and it was actually quite successful. There's a documentary, Terms and Conditions May Apply, and a South Park episode where you sign Apple's terms and conditions; it was a really big issue at the time. You were clicking on something you'd never read, and then they got to do whatever they wanted with your data. We rallied against that back in the day, and in 2012 we had a call to action at the W3C Do Not Track and Beyond conference, where we called for standards for open notice, which today we're calling "digital privacy."

Debra Farber  18:18 
This was about 2012. Was it 2012?

Mark Lizar  18:21 
Yeah, there was a conference in Berkeley where we presented a call to action for creating standards for digital transparency, which back then we called "Open Notice."

Debra Farber  18:37 
So tell us a little bit about that. When you were working with other technologists at W3C to bring a Do Not Track signal specification to market, what happened? I know it was a massive failure, but I do want you to summarize it for the conversation. Why don't we have a robust Do Not Track mechanism today, one that most browsers have incorporated and respect, that has teeth and is respected by the ad chain, so to speak, or the corporate powers that be? What went wrong, and how have you regrouped to approach this topic?

Mark Lizar  19:13 
Well, trying to work with the advertising industry on a signal was probably one of the issues there, because there are entrenched reasons for people not to be able to choose who's watching them and choose which ads they get through consent: it takes away the entire business model. I think we were really naive early on; we thought everybody wanted to solve this problem, and in truth, the industry didn't want to solve the problem. That's effectively how we ended up, in the U.S., with the CCPA's default of surveillance, where you opt out of surveillance online; that default really shaped the Do Not Track standards collaboration at W3C. So in 2012 it was like, "Okay, this failed; what do we do now?" And this was before the GDPR, so there were only a few efforts...

Debra Farber  20:08 
I'm sorry to interrupt; I just want to throw out a little tidbit: the Do Not Track signal is a standard, but it's not adopted by most browsers. Under California law, there's a requirement that you specify whether or not you respect Do Not Track signals, but you don't actually have to respect them; you just have to say in your privacy notice whether you do. So that's where we are; it's almost laughable. Now, please proceed. You've got this new mechanism you're calling DO TRACK. Tell us more.

Mark Lizar  20:41 
Right. So we have this use case, DO TRACK. Our work evolved into something called a "Consent Receipt," which is effectively the reverse concept of a cookie: you actually have a record of who you're consenting to, your digital identity relationship, that you can use as your own metadata, for your own benefit, and own it. Right now online, nobody controls their own records; all the services hold the records and profiles. So that's sort of the state of Do Not Track. DO TRACK is where you can specify or direct the consent: you say, "You can track me for my shoe size and tell me whenever my shoes are on sale in my favorite color. I'll give you access to this data, and I can see you accessing it and giving me an ad." That's the type of transparency, digital privacy transparency, we've been working on: private advertising, where you don't need Google or Facebook or anyone, and you can have...

Debra Farber  21:47 
...or a benevolent intermediary, not one that's doing the tracking. But it sounds like this is just a web standard; it wouldn't be a separate standalone product you'd need to buy, plug in, and pay for.

Mark Lizar  22:03 
That's right. Yeah. So behind all of that, and this is really the big reveal and what we're podcasting about, is something we've been working on for a long time called a "Controller Notice Credential," or "Controller Credential." It's effectively all the legally required privacy information: Who is it that you're consenting to? How do you contact them to withdraw consent? So you control the withdraw-consent button, instead of having to go to a website or service provider, log in, change your permissions, and delete your account...

Debra Farber  22:37 
Not scalable.

Mark Lizar  22:38 
Yeah, that's permission, right? You're going in and undoing permissions. But if you control the consent record, then you can withdraw consent by pressing one button, or you can pause consent, for example. So if you walk into a physical space with 20 services, Bluetooth, Wi-Fi, video surveillance, the standard would be: "Look, I'm going to withdraw consent," and all the services could go, "Oh, okay, this identifier means no." That's what we're talking about when we think about consent versus permission.

Debra Farber  23:12 
Okay, so what do we need to gain market traction for this privacy-enabling tech standard?

Mark Lizar  23:19 
Yeah, so market traction. A lot of the stuff so far has been built just for industry; everything in the commercial space, of course, seems to be for them. So it's been difficult to work on public infrastructure, to say the least, and that's why openness is such a challenge. What we're working on is open standards that people can use to make companies transparent, whether they like it or not, effectively. It's basically enforceable privacy law and standards at an international level, which we're now achieving in 2023; finally, in the slowest way possible. These standards and things I've been working on for 20 years (in some cases, 40 years) are now mature, they've become enforceable privacy law, and they've been harmonized. So what we've been working on at the Kantara Initiative is "Transparency Performance Indicators," which are digital privacy metrics. People can automatically see services collecting their data before they know who they are, or after: that is, after they've been given a notice and have consented.

Mark Lizar  24:31 
So if your data is being collected before you know who the person or service collecting it is, it's obviously not compliant with a lot of laws, but it's also not really trustworthy. When the law talks about transparency, it talks about "just-in-time" notice, "opt-in," and "opt-out." A just-in-time notice comes just before they take your data; after the fact is telling people, "we've taken your data, and this is what we have." For people to have a choice, you need to know before your data is collected, and have a choice for your data not to be collected, or a choice to say, "Hey, collect the data I choose you to collect." That's really the big thing that opens up the marketplace: people being able to choose their data sources. At the W3C, there's a workgroup called Solid, which Tim Berners-Lee champions, where people have their own records and their own data storage. There's a consent community group where the DO TRACK proposal has been presented. There's also the Data Privacy Vocabulary and Controls workgroup, which standardizes the vocabulary for notices and notifications so that it can no longer be gamed with what we call "dark patterns"; it eliminates dark patterns. Put together, these provide digital privacy transparency for people. Our aim is to solve this by providing an open, free (free for freedom, not free for profit) controller credential with these standards, and then enable services on top of that.

Debra Farber  26:14 
Got it. It's a lot. It's a mouthful, but it's...

Mark Lizar  26:18 
I got to come up with a better sales pitch than that.

Debra Farber  26:20 
So what exactly is a "consent receipt," and how does the "controller credential" give people control over their own data? Why should we be excited, basically?

Mark Lizar  26:32 
Well, the fun thing about the controller credential is that it's the pieces of data you're legally required to provide: who's processing the data. The law typically takes an analog, brick-and-mortar approach and says you have to provide an address; with the new standards and laws, we basically have to provide the digital equivalent of that. So: what's your digital identity? What's your digital identifier? That's a big leap, and bridging that gap to digital transparency in a standard way gives people a lot they can use. What we've been doing is standardizing the things in those gaps, to bridge the gap between the physical and the digital, so that people's physical privacy is represented digitally.

Debra Farber  27:26 
And it's represented as a token? Is that understanding correct?

Mark Lizar  27:31 
Right. So those identifiers, which are really attributes of a service provider, are the company's legal entity and the data controller, or the Data Protection Officer for the data controller, who is the authoritative person. In some laws, for example in Quebec, Canada, they've just made it so that the owner of the company, the authoritative person, is automatically the Data Protection Officer, and they can delegate that to a third person. So there are two entities combined to create a controller credential, and with that, also the current digital privacy contact point. When you put those things together in a standardized format, you've basically created a controller credential, and that credential can be linked and attached to any notice or notification. It can also be created independently of a service provider, because it uses open laws and open, transparent standards, and it's open for the purpose of transparency. Because of that, people themselves, or, for example, a browser, could automatically make controller credentials for every service provider and use them to create notice records and consent receipts. Effectively, the credential is the base for a notice record or a consent receipt, the notice text is the payload, and the context of how it's to be treated is defined by the legal justification. All of that is predefined with a controller credential, because it's effectively a publicly regulated credential.
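
[Editor's note: a minimal TypeScript sketch of the structure Mark just described: a controller credential as the base, notice text as the payload, and the legal justification as context. Field names are illustrative assumptions, not the ANCR specification.]

    // Who is accountable: the combined entities plus the privacy contact point.
    interface ControllerCredential {
      legalEntity: string;          // the company's registered legal entity
      dataController: string;       // the authoritative person, e.g. the DPO
      privacyContactPoint: string;  // digital contact point for privacy rights
      jurisdiction: string;         // where the controller is regulated
    }

    // A notice record builds on the credential: base + payload + context.
    interface NoticeRecord {
      credential: ControllerCredential;  // the base
      noticeText: string;                // the payload
      legalJustification:                // the context (the six justifications)
        | "consent" | "contract" | "legitimate-interest"
        | "vital-interest" | "legal-obligation" | "public-interest";
      timestamp: Date;
    }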

Debra Farber  29:14 
I really like that, because it brings not only transparency but could enable privacy assurance, where people outside an organization, like customers or regulators, can test against it and see how our consents have changed. Basically, we're controlling our own personal data that way, and we have more insight into how our data is used.

Mark Lizar  29:42 
There's a secret thing there, and you've touched on it. It's something we've been working on but can't really talk about very easily: the universal transparency component. If you can make these records independent of companies, then each time you start a session, instead of reading a cookie, you can read your own record and see what has changed with this controller and this service. We were funded by an EU project called Privacy as Expected: Consent Gateway, through the EU's NGI funding program, and what that's turned into is called "differential transparency," which is effectively the automatic comparison of your digital privacy state online, to tell you if something's changed; and if it hasn't, that it's what you expect, according to your consent. What we're really talking about is concentric relationships and a privacy dialogue that actually dictates access, control, and identity management. We're talking about using the privacy standards as the framework that governs what people can do and expect, and what companies can do and expect. So it's pretty exciting from that perspective.
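
[Editor's note: a small TypeScript sketch of the "differential transparency" comparison as described: fingerprint the privacy state you recorded at consent time and compare it at the next session. Shapes and naming are illustrative assumptions, not the Privacy as Expected protocol.]

    import { createHash } from "node:crypto";

    // The slice of privacy state a person records when they consent.
    interface PrivacyState {
      controllerId: string;
      privacyContactPoint: string;
      purposes: string[];
    }

    // Stable serialization so identical states always hash identically.
    function fingerprint(state: PrivacyState): string {
      const canonical = JSON.stringify({
        controllerId: state.controllerId,
        privacyContactPoint: state.privacyContactPoint,
        purposes: [...state.purposes].sort(),
      });
      return createHash("sha256").update(canonical).digest("hex");
    }

    // "Privacy as expected": nothing changed since the consent was recorded.
    function asExpected(recorded: PrivacyState, current: PrivacyState): boolean {
      return fingerprint(recorded) === fingerprint(current);
    }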

Debra Farber  30:53 
Thanks for that. Now I'd like to turn our attention to some of the work you've been doing with The Open Privacy Network, which you've shortened to 0PN?

Mark Lizar  31:05 
Right! Yes. So this is the secret stuff. We're hosting this event on January 27th, and at the end of the event we're going to launch a workgroup in The Digital Transparency Lab: the ZPN Workgroup. ZPN is a zero public network architecture, which we're promoting as a next-generation security architecture where your personal data is never transferred and service providers never get your raw data. We're taking raw data away from the public sphere in the zero public network. Every entity in the network has a Controller ID, and it's only about services, not about people. People can automatically engage with a notice that has all the privacy information embedded; you can get a record and a receipt, and those receipts will be usable as Consent Tokens. So really, instead of using your raw data to access services and being under surveillance by everything on the Internet, you can use Verifiable Credentials, where none of the services know who you are, except maybe your bank, or a notary, or a third-party trust service that can manage and monitor service credentials. It's basically the reverse of the current approach. And then the main part is that people have the authority; it's really about adding authority to authentication and authorization.
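
[Editor's note: an illustrative TypeScript sketch of the zero-public-network flow as described: a service receives a consent token referencing a verified relationship instead of the person's raw data. All names and shapes here are assumptions, not the ZPN architecture itself.]

    // An opaque token: no personal data inside, just a reference the trust
    // service (e.g. a bank or notary) can resolve and vouch for.
    interface ConsentToken {
      tokenId: string;       // opaque reference to the relationship
      controllerId: string;  // which service the token is presented to
      issuer: string;        // the trust service that issued it
    }

    const TRUSTED_ISSUERS = new Set(["bank.example", "notary.example"]); // assumed

    // The service never sees raw data; it asks the issuer to confirm the
    // token authorizes the requested purpose.
    async function authorize(
      token: ConsentToken,
      purpose: string,
      verifyWithIssuer: (t: ConsentToken, p: string) => Promise<boolean>
    ): Promise<boolean> {
      if (!TRUSTED_ISSUERS.has(token.issuer)) return false; // unknown trust service
      return verifyWithIssuer(token, purpose);
    }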

Debra Farber  32:38 
Okay, so for this particular working group, there are two things. One, I know you're doing an event on Friday, so could you give details about what that event is and why people should attend? And secondly, I know you're looking for participants for the actual working group, so maybe you can make a plea for who you're looking for, and hopefully we can get you some more participants.

Mark Lizar  33:04 
That's great; thank you for that. So, next Friday, January 27th, there's a celebration of International Privacy Day. We're launching some of this work, and we're celebrating the consent receipt reaching a draft technical specification at ISO this month. That's all a lot of fun. The Digital Transparency Lab has a membership program that we're just setting up, which is free, with services for privacy technology companies to try out the digital privacy tools for Transparency Performance Indicators and to update their transparency for free, so they can be more trusted and interoperable with the standards. That's the Lab: we're basically running baseline transparency reports and metrics on targeted industries and contexts in order to make reports. At the event, we'll have a great keynote, hopefully, and a great introducer, Sharon Polsky, and we're bringing together the next-generation cybersecurity community at IEEE and the digital privacy community at IEEE around this overlap of human control and flow, and the transparency over that. We're introducing something called a Data Control Impact Assessment, or Data Control Interoperability Assessment (we're debating which), as a tool for all these experts to talk about digital privacy, transparency, and security. That's what the January 27th event is all about. At the event, we're also launching the Zero Public Network Workgroup, for people interested in an architecture where you don't have to provide your personal data to every service provider; that's what the ZPN is all about. And 0PN, Open, is the service we're setting up for companies to look at getting Controller Credentials.

Debra Farber  35:17 
Got it. Okay, so for those Controller Credentials, you're working to enable digital privacy transparency metrics once the Regulated Controller Credential has been adopted. Can you tell us more about the metrics that can be created for DPOs, GRC teams, regulators, and individuals with this rollout?

Mark Lizar  35:41 
Sure. So the framework has four levels of privacy assurance, and what we're talking about now is level zero, the public level of assurance, where we use the laws and standards to produce transparency metrics. Effectively: Are you notified before your data is collected? When you're notified, is the legally required information there? Then, how accessible is your privacy access information? That's really one of the gaps we found: all these privacy policies and services and tools say they do privacy and they care, but they don't actually provide you access to privacy services, privacy information, or privacy rights, even though all the laws say you have to provide privacy rights information. That's a really notable gap in the market right now, so a main purpose is just to provide transparency over the accessibility of privacy information. And the fourth transparency performance indicator that's public, open, and free is whether the security certificate or token is actually registered and has any security integrity. In our research, we've noticed that a large proportion of service providers provide a service in one country, a privacy policy in another country, and the SSL certificate from a different country under a different organizational unit. That's a basic break in security, and without basic security, you can't even have digital privacy.

Mark Lizar  37:26 
So these four indicators are basic indicators at what we call "Tier Zero," the public level, but we have four tiers of assurance. To go further, we're working on programs with the Security Industry Association, which is really interested in Controller Credentials for video surveillance providers and technology manufacturers, so they can embed them into surveillance systems and people can automatically see who's watching them. You can open surveillance up to a lot more utility: people can make a police report; people can do a lot of "self-security," I think, is the term we're using right now, which takes a load off the infrastructure and increases security for people and services. That's the angle we've been working on for a long time in the technology space.
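
[Editor's note: a sketch of the four public "Tier Zero" transparency performance indicators described above, in TypeScript. The observation shape and the scoring are illustrative assumptions, not the Kantara specification.]

    // One observation of a service, scored against the four public indicators.
    interface ServiceObservation {
      notifiedBeforeCollection: boolean;    // 1: notice precedes any collection
      requiredNoticeInfoPresent: boolean;   // 2: legally required info in notice
      privacyRightsInfoAccessible: boolean; // 3: rights/contact info reachable
      certMatchesServiceContext: boolean;   // 4: registered cert, consistent origin
    }

    // A simple pass count: 4 of 4 means all Tier Zero checks are met.
    function transparencyScore(obs: ServiceObservation): number {
      return [
        obs.notifiedBeforeCollection,
        obs.requiredNoticeInfoPresent,
        obs.privacyRightsInfoAccessible,
        obs.certMatchesServiceContext,
      ].filter(Boolean).length;
    }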

Debra Farber  38:21 
Thanks for that. What are some of the specifications you're currently working on, whether at 0PN or elsewhere, besides the Regulated Controller Credential? And how can those who are interested in these efforts get involved and reach out to you?

Mark Lizar  38:40 
Sure. So the Kantara Initiative, which is one of the few industry organizations open to the public to jump in and do something, is where we've done a lot of work. It's incredible that we've been able to go into this industry association, lobby against terms-and-conditions tick boxes, and be guided toward developing a Consent Receipt, which now, 10 years later, is being voted on as an international standard where people can actually have their own record of their digital relationships. That's an incredible thing that happened at the Kantara Initiative. Today at Kantara, we're working on standards not for industry but for people. We have the Notice Record standard, which is basically the base format for anybody to make a record of a service provider: who they are, who the controller is, and what their contact information is, to see if there's enough information for privacy to be present or operational. We're writing these tools for people and for client-side services. We also realize that having a format to make a record of consent is one thing, but the language of the purpose of consent is really important. The W3C Data Privacy Vocabulary group is great because it works specifically on making this legal human language that's also technically exportable to RDF and other web technologies, so it's semantically usable online as well. That's really important. There are also plans to extend this to Trust Over IP, for a Controller Credential extension to their governance framework; we really see SSI as a security tool for trust and a great way to expand digital transparency and consent online, so we look forward to working with them. And in IEEE Digital Privacy, there's a group that's really strong; there's a lot of commitment behind privacy at IEEE right now, so there's really open access to get involved in digital privacy. And, of course, there's the IEEE Next Generation Security community group, where we're looking at quantum security, AI security, and the next-generation framework where people are in control of security. I highly recommend security-minded people get involved there.

Debra Farber  41:17 
Awesome, and how can people reach you?

Mark Lizar  41:18 
You can reach me through transparencylab.ca in Canada, or at open0PN.org: Mark@open0PN.org. And I'm happy to work with privacy tech companies and standards groups in this space.

Debra Farber  41:37 
Amazing. I have just one more question for you: do these standards play nicely with the new Global Privacy Control (GPC) web standard? In a previous episode, I talked with the Founder of PrivacyCheq, Roy Smith, about his view that GPC is a good effort but not enough; it doesn't meet all the needs of a global consent across all things web, or everything a company might ingest from you. So, how does what you're working on depart from, or play nicely with, GPC?

Mark Lizar  42:15 
So we're working with the international instruments; we're quite tightly tied to international instruments that can be mapped for people to use locally. That's a big and super important thing for something to scale internationally. And we really look at respecting each person's context. That means privacy by default in systems, but it also means people being able to use privacy law publicly; that's really important. GPC has this context where you're starting from surveillance and have to opt out of it, with "tell us you want to be private." I look at that specifically as not being safe for children's communities: its default is to extract data without permission, and that's really a bad precedent. I think it was set by industry through the Do Not Track struggles in California, with a lot of weight behind setting that default. So that default is not trustworthy, if you will, and it's not compliant internationally. I see that there's definitely an issue there, and I think it's a contract-based issue that can be solved with an international privacy code of conduct and privacy best practices. We've seen that the GDPR has really raised the practices of a lot of companies online, no matter where they are, so we look forward to that playing out a bit. I hope there's flexibility in the GPC in the future to update to a privacy-by-default approach.
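
[Editor's note: for context, this TypeScript fragment shows how a site reads the GPC signal Mark is contrasting with his consent-record approach. The Sec-GPC header and the navigator property come from the GPC draft specification; the handler framing is an illustrative sketch.]

    // Server side: a GPC-enabled browser sends the Sec-GPC request header
    // with the value "1" to signal the opt-out preference.
    function gpcOptedOut(headers: Record<string, string | undefined>): boolean {
      return headers["sec-gpc"] === "1"; // Node lowercases header names
    }

    // Client side (in the browser), the same signal is exposed as:
    //   navigator.globalPrivacyControl === true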

Debra Farber  44:01 
That makes sense. Otherwise, without privacy by default, it sounds like they're just different tools in the toolbox to use as we build out web services. It doesn't sound like you can only use one or the other; they're both official standards. I just hope we get a lot of eyeballs on what you're working on, so you get a lot of feedback and make it even stronger.

Mark Lizar  44:25 
Well, there's an underlying ethical issue here, right? When we're talking about Meta taking everybody's data under contract illegally, not with consent: all that metadata, all that surveilled data taken without consent, is what Meta is based on, and it's unethical. Even if you add anonymization and pseudonymization to unethical data, it's still unethical. So all the data that's been stolen, all the things that have been posted by Google without permission, and all the surveillance on top of that: it's not just illegal, it's unethical. A normal person would be held accountable for that, so it's a big deal. And when you think about technologies like differential privacy, they can be used on big data to make it more "privacy-preserving," which is the word, but not "privacy ethical." Whereas you could control your own data: you could have your own record source at the bank, or via the bank, for authentication, and that would be ethical. The ethical thing is that you have a choice to share your data. If we all had a choice about who benefits from our data, our transactions and interactions, could we save the planet? What impact would it have on society if society benefited from our choices, rather than just a couple of companies?

Debra Farber  45:56 
Right, that makes sense to me. Stay with me, I know this sounds off topic, but I'm thinking about how I often talk about anonymity within a company, like rendering a data set anonymous. I don't feel like a data set can be anonymous; I feel like anonymity is really an identity state that one chooses. So if you take a data set that was once identified to an individual and remove any linkages to it, well, you've rendered it safe: safe for your use. But, like you said, I don't think there's anything "privacy-enabling" about it; it's just been made safe for the company to share. What I'd like to see (and I'm going to close out on this thought) is companies and people working more toward "privacy-enabling" technology and less on "privacy-enhancing" technology, which is really more of a whitewashing term for making data sets safe.

Mark Lizar  46:54 
Yes. To all those people out there: don't use the phrase "privacy-enhancing." It's something we can hopefully use Transparency Performance Indicators to automatically audit you on, because what you're really saying is that you have surveillance that is protecting your security; it's not privacy-enhancing. Digital privacy is all about surveillance; it's about recognizing that we live in a surveillance world. It's co-regulated, and we get a choice about how it's co-regulated. Let's grow up. We're under surveillance; we surveil each other. Identity is about surveillance: you identify people so you can know who they are. We all do it. So I think we have to come to terms with that, move forward, and take advantage of it, too.

Debra Farber  47:49 
Agreed. Well, Mark, thank you so much for joining us today on Shifting Privacy Left to discuss exciting new open privacy and transparency web standards and architectural approaches. I definitely learned a lot, and I will be following this space really closely. Until next time, everyone, when we'll be back with engaging content and another great guest.

Debra Farber  48:13 
Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website shiftingprivacyleft.com where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend; and if you're an engineer who cares passionately about privacy, check out Privado, the developer-friendly privacy platform and sponsor of this show. To learn more, go to Privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.
