This week's guest is Umar Iqbal, PhD, a Postdoctoral Scholar at the Paul G. Allen School of Computer Science & Engineering at the University of Washington, working in the Security and Privacy Research Lab. Umar focuses his research on two themes: 1) bringing transparency into data collection and usage practices, and 2) enabling individuals to have control over their own data by identifying and restricting privacy-invasive data collection and usage practices of online services.
His long-term research vision is to create an environment where users can reap the benefits of technology without losing their privacy by enabling preemptive privacy protections and establishing 'checks & balances' on the Internet. In this conversation, we discuss his previous and current research, with the goal of empowering people to protect their privacy on the Internet.
Copyright © 2022 - 2023 Principled LLC. All rights reserved.
Debra Farber 0:00
Hello, I am Debra J. Farber. Welcome to The Shifting Privacy Left Podcast, where we talk about embedding privacy by design and default into the engineering function to prevent privacy harms to humans, and to prevent dystopia. Each week we'll bring you unique discussions with global privacy technologists and innovators working at the bleeding edge of privacy research and emerging technologies, standards, business models and ecosystems.
Debra Farber 0:27
Today, I'm delighted to welcome my next guest, Umar Iqbal, a postdoctoral scholar at the University of Washington, working in its Security and Privacy Research Lab. He aims to empower people to protect their privacy on the Internet. Thus, he focuses his research on two main areas. The first is enabling transparency by characterizing data collection and usage practices of online services; and, the second is enabling individuals' control over their own data by identifying and restricting privacy-invasive data collection and usage practices of online services. Today, we're going to be talking about his current and future research. I know there's going to be a lot to talk about.
Debra Farber 1:13
Umar Iqbal 1:15
Thank you. Thank you for having me.
Debra Farber 1:17
I guess, first to know a little bit about you, I think I'm gonna ask you about each of your research themes, and then maybe you could weave into the answers how you got to that theme being important in your career. So, if that's okay with you?
Umar Iqbal 1:31
Oh, yes, that makes sense.
Debra Farber 1:33
Awesome. So, why is bringing transparency into the data collection and usage interesting to you? Why did you select 'transparency' as a focus?
Umar Iqbal 1:43
So, I think bringing transparency is crucial in protecting user privacy on the Internet. And, the reason I am saying that is because, as consumers, when we interact with different online applications - whether they are websites or mobile applications - we end up sharing or producing some data for those applications. And, more often than not, this data contains some sensitive information about us, which can be used to infer several things about us.
Umar Iqbal 2:17
So, at a high level, it presents severe risks to our privacy. And, on top of that, a lot of modern applications that we interact with are built using a collection of third-party modules. So, when we are interacting with different online applications, these third-party modules or services also get a chance to collect data on us, which presents an even higher threat to our privacy. So, my efforts in the thread of transparency are to characterize the practices of different online services that are collecting data on users; and, I believe it's crucial in protecting user privacy on the Internet. And that's why I work on this thread.
Debra Farber 3:00
Well, thank you for that. That's some really good background and perspective. And, how would you say you address this transparency challenge with your work and your research?
Umar Iqbal 3:09
I think one example that all of us can relate to is that, during our interactions with different online applications, we don't get any notification as to who is collecting our data and for what purpose. And, on different platforms - you can imagine Facebook here, or the web as a whole - there are no interfaces that users can conveniently tap into to see what's happening. And, on top of that, the third-party services embedded in different online applications also collect user data; and in those cases, even the applications that we are interacting with are in the dark. You can imagine a website here like cnn.com, which is embedding different third parties for advertising and tracking related things.
Umar Iqbal 3:51
So, even an application like CNN here won't have any visibility into how the third parties are collecting and using user data. So, that's perhaps the biggest challenge in bringing transparency. And my approach to address this challenge is to instrument systems. And by instrumentation, I mean to modify the systems in a way that I can observe the execution of different applications, and especially to see what data is being exfiltrated. That also allows me to get some visibility into how these applications are operating. And once I have that visibility, I automatically interact with the instrumented systems through measurement studies; and, the goal of these measurement studies is to characterize the data collection and usage practices of different online services. And once I have characterized that, I'm in a position where I can notify users and other stakeholders - like platform vendors and regulators - about the practices of the applications so that they can take some actions to improve the state of user privacy.
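To make the measurement-study idea concrete, here is a minimal, hypothetical sketch of the kind of check such a pipeline might run: given a log of outgoing request URLs captured from an instrumented browser session, flag which hosts match a tracker list. The host names and list here are made up for illustration; real instrumentation hooks the browser's network stack rather than consuming a ready-made list of URLs.

```python
# Toy measurement-style check: flag request URLs whose host appears on a
# (hypothetical) tracker list. Illustration only - not Umar's actual tooling.
from urllib.parse import urlparse

TRACKER_HOSTS = {"tracker.example", "ads.example"}  # stand-in blocklist


def flag_trackers(request_urls):
    """Return the subset of request URLs whose host is on the tracker list."""
    flagged = []
    for url in request_urls:
        host = urlparse(url).hostname or ""
        # match the listed host itself or any subdomain of it
        if any(host == t or host.endswith("." + t) for t in TRACKER_HOSTS):
            flagged.append(url)
    return flagged


log = [
    "https://cnn.example/article",
    "https://ads.example/pixel?uid=123",
    "https://cdn.tracker.example/fp.js",
]
print(flag_trackers(log))
# → ['https://ads.example/pixel?uid=123', 'https://cdn.tracker.example/fp.js']
```

In a real measurement study, the characterization step would also record what data each flagged request carries, not just where it goes.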
Debra Farber 5:01
Wow, I just love that you're working on privacy at this level because so much of it has been focused on security to do vulnerability, pen tests, exfiltration attempts - all of that, right? And so, it just delights me to see people like you researching...using similar techniques, but applying it to the privacy space. So, that said, can you tell us about some of your research relating to transparency and data collection and use?
Umar Iqbal 5:29
Yeah, I think I can talk about my most recent research, where I tried to bring transparency to the data collection and usage practices of the Amazon smart speaker platform.
Debra Farber 5:42
Yeah, that'd be great. I'd love to hear about that. My former employer. I know this research reveals some things that are not so pleasant, but that's okay. It is what it is.
Umar Iqbal 5:53
Yeah. So I think let me start with a bit of background about the smart speakers, and then I can talk about some of the challenges and then my findings.
Debra Farber 6:01
Umar Iqbal 6:01
So, smart speakers, as you might know, are gaining a lot of popularity. I was reading a survey a few days ago which stated that more than 95 million people in the U.S. have put these devices in their homes, offices, and other locations. On one hand, this is great because smart speakers do provide users a lot of convenience. But, on the other hand, the data they collect can cause serious harm to user privacy.
Umar Iqbal 6:29
For example, smart speakers obviously collect users' voice recordings, but they also process the transcripts to infer or extract the different things that users are saying to them. And even the metadata of a user's interaction with the smart speaker can leak sensitive information. You can imagine a user talking to a mental health app three or four times a day, which could indicate that the user is having some mental health problems.
Umar Iqbal 6:57
Given the data collected by smart speakers and the potentially-sensitive information present in that data, you would expect that the smart speaker platforms would provide more transparency and have really good privacy protections in place. But unfortunately, that is not the case. A lot of prior research has uncovered that malicious applications - or 'skills,' as Amazon likes to call them - were hosted on the marketplaces of the smart speaker platforms. And on top of that, there is also some research which identified that smart speakers often mis-activate when users don't even interact with them. So, these things and the practices of the smart speakers, if you look at them together, can be a concern for users.
Umar Iqbal 7:45
So that, I believe, concludes some of the background. In terms of the challenges in uncovering some of these practices: smart speakers (unlike browsers or mobile devices) don't provide any developer interfaces or debugging tools. So, it's very hard to understand what's happening on these devices when users interact with them. Given the lack of tools to investigate these devices, I built a framework that didn't require modifying them. And the idea was essentially to interact with the smart speakers and then observe how the interactions (or the data shared during the interactions) are used in the services which are personalized to the users.
Umar Iqbal 8:32
By 'personalization,' I mean personalization in 'targeted advertising,' because the data that's most dominantly collected on the Internet is for targeted advertising. So, here, once I know what the shared data is, I look for the usage of that data in the personalized advertisements. We found a lot of interesting stuff, but the most important thing was that Amazon was using users' smart speaker interactions - at least their metadata - to serve targeted ads to users. And this was concerning because Amazon has not explicitly acknowledged or denied, in any of its policies, that it uses smart speaker interactions for ad targeting.
Debra Farber 9:14
Fascinating. And so, from what I've read in your research statement, the FTC in the U.S. and the European consumer organization, BEUC, in the EU have shown significant interest in your findings. And, separately, a class action has been filed against Amazon based on the findings of your research?
Umar Iqbal 9:35
Right. So the FTC, and BEUC in the EU, I think showed a lot of interest because of the potentially-deceptive practices here. And I've held public and private meetings with them on this matter. And you're right, separately a class action has also been filed against Amazon.
Debra Farber 9:54
Well, that underscores the need for developing products and services, including hardware, with privacy by design and default. I'm also curious, I know you've also done some transparency-related work regarding browsers and API's. Do you want to tell us a little bit about that?
Umar Iqbal 10:10
Debra Farber 10:52
Any interesting insights or conclusions as a result of that research or is it still ongoing?
Umar Iqbal 10:57
One of the biggest conclusions, or the most concerning things, here was that browsers are trying to basically remove third-party cookies; and, their hope is that once they remove third-party cookies, a lot of tracking will go away. But, the usage of some of these techniques, and especially fingerprinting, means that once third-party cookies go away, trackers have an established mechanism like fingerprinting that they can still use to track users, which essentially raises the question of whether the removal of third-party cookies will be an effective measure in improving the state of user privacy.
Debra Farber 11:33
Fascinating. Yeah, I think it also comes down to current regulation, and what we would need in place so that companies are no longer permitted to use other tracking techniques like fingerprinting unless there's an opt-in - but that would take acts of legislation.
Umar Iqbal 11:50
Absolutely, absolutely. And, I think we definitely need legislation in this space, and these laws need technological support to be effective. In fact, some of these regulations are already in place, and we are still seeing some of these invasive and opaque tracking techniques emerge. So, unless we...
Debra Farber 12:08
...so, enforcement of regulation?
Umar Iqbal 12:11
Exactly - that means technological support.
Debra Farber 12:14
Indeed, okay, that makes a lot of sense to me. And then, I want to turn to the second research theme of yours, which is "bringing control to individuals by restricting privacy invasive data collection." So, that's an interesting theme. I love it. It's a great goal. What motivated you to select this as your focus, and what is it that you're working on?
Umar Iqbal 12:36
Right. So, I guess once you bring transparency, the next obvious question is "How do you restrict or remove some of the privacy-invasive practices?" Two of the ways, which I touched on a little bit, were to inform platform vendors, because they are in a position where they can make some changes in the platform to make it more privacy preserving. For example, Apple is putting in a lot of effort here; they have removed a lot of identifiers, especially ones used for advertising, from their devices. The second thing is, of course, regulators. We need regulatory support to rein in and regulate this whole ecosystem. So, there have been some efforts in that space; and I also want to do research, especially in the future, to facilitate data protection regulations.
Umar Iqbal 13:25
But, to directly answer your question about control. So, the third way in which I try to bring control is by building privacy-enhancing tools. These tools essentially don't require any support from the platform vendors and also from regulators. And, this is something that users can immediately deploy or immediately use to protect themselves. And, the idea in these privacy-enhancing tools is to use some machine learning based techniques to identify the practices or data flows which could potentially harm user privacy, and then remove or restrict them.
Debra Farber 13:59
So how do you address this with your research and your work?
Umar Iqbal 14:03
Alright, so I build privacy-enhancing tools that are machine learning based. And for these tools, I often have to instrument systems, like browsers, and get fine-grained execution traces of websites, including trackers. And from those execution traces, I train machine learning models which can automatically detect these flows. And, in this line of research, I've collaborated with many privacy-focused companies, like Brave; and, some of my research is actually deployed in their products.
Umar Iqbal 14:33
I've also tried to incorporate my findings in some of the tools. For example, modern ad blockers like Adblock Plus and uBlock rely on block lists to block tracking URLs and domains. And, one of the problems they struggle with is that the block lists they rely on require manual maintenance. So, people spend hours maintaining these block lists. Some of my research is about automating these processes; and, by working with the block list vendors, I have integrated some of my findings into the lists. So, they are essentially indirectly deployed in almost all ad and tracker blocking tools.
Debra Farber 15:16
That's pretty cool. Actually, you know, I'd love to dive a little deeper into that. Can you give us some examples of successful implementations related to control...you know, and based on your research?
Umar Iqbal 15:27
In one of my research papers, I tried to detect the trackers which were serving browser fingerprinting scripts, and ad blockers currently really struggle with identifying the domains or URLs which are serving that code. So, I developed a machine learning based technique which detected browser fingerprinting scripts based on their functionality. And, I discovered, I believe, around 2,000 different tracking domains which were serving this fingerprinting code. Some of those were dedicated third-party tracking services. So, I worked with some blocklist developers. I privately sent them a list of all these tracking domains, along with all the evidence that I discovered; and, as a result, they have created a dedicated category in their block list. I'm particularly talking about EasyPrivacy here - it's one of the prominent block lists. If you look at it, they have a dedicated 'Fingerprinting' category, and they have listed some of the domains that were detected by my project, which was called FP-Inspector.
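The core idea of classifying scripts by their functionality can be sketched very roughly. The toy heuristic below simply counts fingerprinting-associated browser API names in a script's source; it is my illustration only, not FP-Inspector's actual model, which trains machine learning classifiers over much richer static and dynamic features, and the threshold here is arbitrary.

```python
# Toy stand-in for "detect fingerprinting scripts by functionality":
# count distinct fingerprinting-associated API names in a script's source.
FP_API_HINTS = [
    "toDataURL", "getImageData", "measureText",
    "AudioContext", "enumerateDevices", "getSupportedExtensions",
]


def fingerprinting_score(script_source: str) -> int:
    """Count distinct fingerprinting-associated API names in a script."""
    return sum(1 for api in FP_API_HINTS if api in script_source)


def looks_like_fingerprinting(script_source: str, threshold: int = 3) -> bool:
    # Threshold chosen arbitrarily for illustration; a trained model would
    # weigh many more features than raw keyword presence.
    return fingerprinting_score(script_source) >= threshold


benign = "document.getElementById('btn').addEventListener('click', go);"
suspect = """
  var c = document.createElement('canvas');
  var d = c.toDataURL();                     // canvas read-back
  ctx.measureText('mmm');                    // text-metrics probing
  var a = new AudioContext();                // audio fingerprinting
  navigator.mediaDevices.enumerateDevices(); // device enumeration
"""
print(looks_like_fingerprinting(benign), looks_like_fingerprinting(suspect))
# → False True
```

The point of the functionality-based approach is exactly what Umar describes: the same detection works even when the serving domain or URL changes, which is where URL blocklists struggle.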
Debra Farber 16:34
Wow, that's really great work. Okay, so I know this has been work you've already completed - we've just gone through some of the stuff that you've worked on in the past. But, I know you've got quite a load of current research, and I see that you focus on three key areas today. So I'm going to read each one aloud and ask that you tell us about the research that maps to each area. Does that sound good?
Umar Iqbal 17:02
Yep. Sounds good to me.
Debra Farber 17:03
Okay. So, the first one is protecting against online tracking on the web. What are some examples of that or what you're focused on?
Umar Iqbal 17:12
Right. In that particular thread, most of my work has focused on 'online tracking,' especially the online tracking that is conducted to serve targeted ads. My work here has focused on measuring the prevalence of online tracking, measuring the effectiveness of existing tracker detection tools like ad blockers - particularly the effectiveness of the block lists that I just talked about - and developing new and improved privacy-enhancing tools, one of which I discussed earlier (the machine learning based one). And, more recently, I've been trying to improve the robustness of machine learning based tracker detection systems against advanced adversaries.
Debra Farber 17:57
Wow. Your second key area is 'early detection of emerging privacy threats.'
Umar Iqbal 18:06
Yeah. So here, I have focused on the more emerging tracking mechanisms, which are especially gaining traction because of third-party cookie blocking. One of the primary techniques is 'browser fingerprinting,' which I also touched on a little bit. Then, there is 'navigational tracking.' The idea of navigational tracking is to pass tracking identifiers across sites through top-level page navigations. And, more recently, I've been looking at the abuse of first-party cookies for cross-site tracking.
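One concrete way to look for the first-party cookie abuse Umar mentions is to check whether a first-party cookie's value reappears in requests sent to other domains. The sketch below uses entirely hypothetical cookie names and hosts; real detection would also have to handle encoded or hashed values and much larger traffic traces.

```python
# Sketch: detect a first-party cookie value leaking into third-party requests,
# a simple signal of first-party cookies being repurposed for cross-site
# tracking. Hypothetical data; illustration only.
from urllib.parse import urlparse


def cookie_leaks(first_party_cookies, request_urls, first_party_host):
    """Return (cookie_name, third_party_host) pairs where a cookie value
    appears verbatim in a request to a different host."""
    leaks = []
    for name, value in first_party_cookies.items():
        if len(value) < 8:  # skip short values that could match by chance
            continue
        for url in request_urls:
            host = urlparse(url).hostname or ""
            if host != first_party_host and value in url:
                leaks.append((name, host))
    return leaks


cookies = {"uid": "a1b2c3d4e5f6", "theme": "dark"}
requests = [
    "https://site.example/page",
    "https://sync.tracker.example/match?fpid=a1b2c3d4e5f6",
]
print(cookie_leaks(cookies, requests, "site.example"))
# → [('uid', 'sync.tracker.example')]
```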
Debra Farber 18:37
Awesome. And then lastly, your third area of research is 'investigating privacy issues in IoT.'
Umar Iqbal 18:46
This is the one that I have more recently started working on. My most prominent work here is uncovering the data processing practices of smart speakers. And, more recently, I've been looking at health and fitness bands that analyze users' voices to infer several things about them.
Debra Farber 19:07
Awesome. Well, I think these are all compelling areas of research. You know, I'm curious, if you had your own way, what vision would you lay down for the future, and what research do you anticipate focusing on in the long term?
Umar Iqbal 19:24
My long-term research vision is to essentially enable an environment where users can reap the benefits of technology without losing their privacy. The reason I want to work towards this is because I see immense value in the 'responsible use of data' for improving users' quality of life. I plan to work on that vision for some time, and I will tackle the problem from several different angles. But, two of the most important and promising threads that I want to immediately pursue are to 'enable preemptive privacy protections' and to 'establish checks and balances on the Internet.'
Debra Farber 20:09
That's awesome. So let's unpack some of those. So 'enabling preemptive privacy protection' - what approach would you take to anticipate attacks on emerging technologies?
Umar Iqbal 20:22
Yes, first of all, I think that's a good question. In enabling preemptive protection, I specifically want to emphasize emerging technologies. And the reason I want to work on emerging technologies is because the standard reactive approach to security and privacy problems can have serious consequences for the users of emerging technologies. What I mean by this is that a lot of emerging technologies collect sensitive and hard-to-change data about users - you can imagine the collection of biometric data and data about users' environments here - all of which, once collected, can be very, very hard to change. And, it's also very hard for users to take it back, especially when it's disseminated at scale.
Umar Iqbal 21:05
Another reason that I'm interested in providing preemptive privacy protections is because a lot of emerging technologies provide an opportunity to make more fundamental design-level changes: because these technologies are not widespread, it's comparatively easy to make design-level changes. So anyway, that's the motivation and the reason that I want to work on this. And, I think your question was about my approach or my methodologies?
Debra Farber 21:32
Umar Iqbal 21:33
...attacking that problem?
Debra Farber 21:33
Umar Iqbal 21:34
Especially in anticipating attacks. So, my high-level idea is to use measurement-driven techniques to lay out the characteristics of some of these emerging technologies. My hope is that once I lay out the characteristics, I will be able to pinpoint the specific characteristics that an attacker might abuse. And that's when I envision that I will be able to provide preemptive privacy protections.
Debra Farber 22:04
Now, how would you be able to provide those protections in a time-sensitive manner and deploy them? What does it take to take them out of the lab and then really deploy these countermeasures appropriately?
Umar Iqbal 22:19
I think one of the big challenges that developers and researchers have to solve is that the information they need to understand and mitigate threats is not readily available. They often end up spending a lot of time modifying the systems and extracting that information. So, one key innovation that I want to pursue here is to expose that information as a standard layer in the system stack. You can imagine this as a more comprehensive and detailed version of developer tools. And, my hope is that once that information is available, it will be easy to use it to understand and also mitigate threats.
Debra Farber 23:08
That makes sense. So, how do we ensure that we have strong privacy guarantees, especially around your research to make for a more accountable internet?
Umar Iqbal 23:23
So, one of the big problems right now in the online privacy space is the lack of checks and balances. What I mean by this is that applications currently use user data in whatever way they feel necessary. They often end up using user data for unauthorized or undisclosed purposes, which violates user privacy. And, as a result of these violations, there is a growing mistrust from the user side towards online applications. It has even come to a point where it's limiting users' access to the potential benefits of these technologies. For example, I was going through a survey by Pew Research which stated that, I believe, at least 50% of young adults in the U.S. are deciding not to use certain technologies because of their practices.
Umar Iqbal 24:27
So, my vision to address that problem is to bring accountability to the Internet that is grounded in regulations like the GDPR and CCPA. And the reason I believe it is a promising approach is because, unlike other security issues where the adversaries are hidden, the adversaries who violate user privacy are operating in the open; and, they are in fact well-known tech companies. And, the best thing about this is that they need to abide by laws and regulations. So, that's why I want to work in this space.
Debra Farber 25:04
Yeah, that is a great point: being publicly known makes it easier to hold you accountable, but only if you have the right laws and mechanisms in place to detect violations. It makes sense, compared to threat actors that are unknown, potentially anonymous, and hard to track. So, yeah, maybe this will be easier than security. I joke. I joke. I know this takes a long time - especially since we don't even have a federal privacy law right now, an omnibus privacy law.
Umar Iqbal 25:35
Yeah, absolutely. I think this space has not enjoyed a lot of attention from lawmakers. But thankfully, over the last few years, there have been some reasonable developments. For example, we at least have two comprehensive data protection regulations: in the EU, we have the GDPR, and in California, we have the CCPA. And, there are also discussions about promulgating a law at the federal level in the U.S.
Debra Farber 26:05
I'm hopeful for more state laws. I don't believe that we're going to have a federal law anytime soon - but that is just my opinion; we could talk offline about why that might be. So, I know you also plan to make efforts across two separate fronts in the future: to build tools for regulators that detect infringement by violators, as well as to build tools for individuals to be able to exercise their data protection rights. I'd love to hear more about what you're envisioning. Let's first start with building tools for regulators to detect infringements.
Umar Iqbal 26:41
Yeah, so at a high level, my assessment - based on my discussions with regulators and legal scholars - is that these regulations need technological support to be effective. Regulators currently don't have mechanisms to detect infringements. For example, they don't even know if online companies provide users mechanisms to exercise their rights, or what happens once users exercise their rights. Do companies respect them, or do companies ignore them? So, I want to fill that gap by building tools which can assist regulators in detecting infringements. At a very high level, these tools will essentially try to instigate some outlawed or questionable behaviors and see if the companies engage in practices that can violate user privacy. And by engagement, I mean here an engagement that can be measured automatically.
Umar Iqbal 27:40
I think one example here could be opting out of data collection and observing the network traffic to see if some data is collected or not. Something like that could be a direct indication that a user has asked the online service to not collect their data, but the service is still collecting it. So, tools that allow regulators to instigate these behaviors and then measure their effect will help them detect infringements at scale. At least, that's my hope.
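The opt-out example above can be sketched as a simple before/after comparison: capture the set of data fields observed in network traffic before and after exercising an opt-out, and report any fields still transmitted afterwards. The field names and captures below are hypothetical; a real tool would drive a browser and parse live traffic rather than taking ready-made sets.

```python
# Sketch of an automated infringement check around opt-out:
# fields observed in traffic both before AND after the opt-out are
# potential violations. Hypothetical field names; illustration only.
def still_collected_after_optout(before_fields, after_fields):
    """Return, sorted, the data fields still transmitted after opt-out."""
    return sorted(before_fields & after_fields)


before = {"email", "precise_location", "device_id", "page_url"}
after = {"device_id", "page_url"}  # captured after the user opted out

violations = still_collected_after_optout(before, after)
print(violations)  # fields the service kept collecting despite opt-out
# → ['device_id', 'page_url']
```

Run across many services, a comparison like this is what would let a regulator measure compliance at scale rather than case by case.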
Debra Farber 28:12
Yeah, I think that'd be really helpful. It provides that visibility; and then companies who are optimizing for privacy and putting the right data protection processes in place would be able to boast about how the tools you're creating demonstrate, or provide assurance of, their privacy posture. So, I think that's great. I can't wait to read your research in the future. And then I'd love to hear more about what you envision regarding building tools for users - for individuals - to exercise their data protection rights.
Umar Iqbal 28:49
Right. So, from the users' side: they currently have to go through complicated privacy policies, for example, to figure out the mechanisms that different online companies provide them to exercise their rights. And, once they figure out these mechanisms, they have to engage in tedious communication with online companies to exercise them. Assuming an average user interacts with hundreds of different online services on a daily basis, you can imagine that it would be a very time-consuming task requiring active effort on the user's part to exercise their rights. So, I want to assist users by building tools that they can use to easily exercise their rights.
Umar Iqbal 29:38
One example here could be to act as an 'authorized agent' on the user's behalf and interact with the online services to exercise the user's rights. Another example I can give here is that users currently may not have enough knowledge to make informed decisions about their privacy. So, I also want to develop tools or user interfaces which users can rely on to better understand services' practices and make informed decisions. I think that concludes my part about building tools for users; but, taking a step back, I believe the tools that I will build for regulators might also help developers in improving their applications. For example, a lot of developers are acting in good faith, and they just happen to include different third parties who infringe on user privacy. So, my hope is that once these tools are available, they might help these developers audit their applications and, essentially, give them an opportunity to improve the privacy posture of their applications.
Debra Farber 30:51
That's great. So, you touched upon it a little bit for developers. I think there's also a need for hackers - your ethical hackers that work in organizations, or ones that hack on bug bounty platforms. What advice do you have for them to start looking for privacy challenges in dynamic code analysis?
Umar Iqbal 31:11
I think a lot of ethical hackers currently are focusing on uncovering things at the client side or looking at the unintentional bugs that are present in these applications. Some of the behaviors that we are discussing are intentional, at least in the sense that companies are making money off these practices. So, something that these ethical hackers can do is to instigate some of these questionable practices by doing some reverse engineering or conducting some A/B studies. I think it actually creates a new avenue for them to investigate the services and go for different bug bounties. I'm not sure if companies currently provide bug bounties for unethical data processing practices, but it would be great if they did.
Debra Farber 32:03
They do not. My fiancé is a hacker and he's in the bug bounty space; and, I can tell you, they currently do not, but it is something that I champion as part of 'shifting privacy left.' I know we'll get there at some point, but we're about 10-15 years behind security in terms of maturing. And, as you said - which makes a lot of sense - it's easy to see when a security guarantee is not being met; but with privacy, sometimes it's intentional, and it's a lack of 'privacy by design' thinking in the product, or it's just an egregious business practice, like "track all the people everywhere and store information about them."
Umar Iqbal 32:48
Yep. Yep. I think that makes a lot of sense. I was just going to highlight a couple of examples that I believe ethical hacking could uncover.
Debra Farber 32:55
Oh, yeah, please do.
Umar Iqbal 32:57
Yeah. For example, recently, I think the FTC sued Twitter over using the phone numbers that users gave them for two-factor authentication in advertising. So, I think this is something that ethical hackers would be able to uncover, and would actually get bug bounties from the companies, because I believe Twitter never intended to use those numbers for ad targeting.
Debra Farber 33:20
Oh, interesting. So, the privacy snafu you're talking about here is that data collected with consent for one purpose - security, via two-factor authentication by text - was then used without permission for a secondary purpose, which was to enrich the advertising data set around individuals.
Debra Farber 33:41
And the tough part here is that, in theory, it sounds great, like, "Yeah, just go look for this," but what you first need to know is the expected outcome, in order to then be able to say you've tricked the system into an unexpected outcome.
Debra Farber 33:56
So, I think we'd still need more transparency from companies to get that baseline. A statement like, "Oh, no, we never meant for that to go into advertising," would need to be something you can check against, to see whether or not it holds true.
Debra Farber 34:12
Yeah, I'd love to just keep focusing on how this kind of hacker mindset around privacy challenges will develop over time. I know regular listeners of my show will probably be sick of me saying, "I have a fiance and he's a hacker," but it is such an important area for us to eventually move into and have privacy bugs or privacy assurances validated against.
Debra Farber 34:39
Well, is there anything else that you would like to share - any research communities or conferences or anything that you think our technical privacy listeners would be interested in?
Umar Iqbal 34:51
Yeah, so I think there has recently been a lot of focus on the role of regulations in protecting user privacy, and people are finally giving it the attention it deserves. And, there is a recent workshop happening in Boston, I believe, called "Beyond the FTC." So, if folks are interested, I think it would be a great avenue to go and engage in these discussions.
Debra Farber 35:18
Great. So, tell us a little bit about what "Beyond the FTC" is. The FTC is the Federal Trade Commission.
Umar Iqbal 35:23
Right. The name of the workshop is actually "Beyond the FTC: The Future of Privacy Enforcement." It has nothing to do with the FTC itself. It's organized by a group of academics, mostly working in the online privacy space from the computer science perspective. They have invited people from the law community - especially those working in privacy - and also the computer science community. And they're coming together to discuss some of their research, and also to ask how computer science can help facilitate data protection regulations, and how the synergy of computer science and policy can together help address some of these issues.
Debra Farber 36:06
Oh, that's great. I'm really glad to hear there's a cross-functional forum like that. One of the things I'm trying to do with this show, Shifting Privacy Left, is to speak with a lot of privacy researchers like yourself so we can surface these insights, at least to a technical audience. Maybe there are some people from policy positions listening, but it's mostly industry that I'm trying to surface this to. So, I'm really delighted to hear there are forums like that, jumping from academia straight to policy and having that dialogue about, you know, how can you draft laws based on what technology is available today? That's great.
Debra Farber 36:49
Well, Umar, thank you so much for joining us today on Shifting Privacy Left to discuss your privacy research on usable privacy for transparency and control.
Umar Iqbal 36:59
Thank you so much for having me.
Debra Farber 37:01
Absolutely. My pleasure. Thanks for joining us today, everyone. Until next Tuesday, when we'll be back with engaging content and another great guest.
Debra Farber 37:13
Thanks for joining us this week on Shifting Privacy Left. Make sure to visit our website shiftingprivacyleft.com where you can subscribe to updates so you'll never miss a show. While you're at it, if you found this episode valuable, go ahead and share it with a friend. And, if you're an engineer who cares passionately about privacy, check out Privado: the developer-friendly privacy platform and sponsor of the show. To learn more, go to privado.ai. Be sure to tune in next Tuesday for a new episode. Bye for now.