#249: Information Fiduciaries: The Privacy Awakens


Welcome to the Tech Policy Podcast. I'm Ashkhen Kazaryan. On today's show we have Lindsey Barrett, staff attorney and teaching fellow at the Institute for Public Representation Communications and Technology Clinic at Georgetown University. Lindsey, thank you for coming.

Thanks for having me.

We're here to discuss your article about information fiduciaries and the privacy framework, and how those concepts fit together. The United States right now is at a crossroads. We have the California Consumer Privacy Act going into effect on January first of twenty twenty, pushing everyone to have the privacy debate all over again, which I think is a good thing, because a lot has changed since the last time we fought through privacy reform, and technology has changed how we understand privacy, both government and consumer privacy. And now we have this question: do we pass federal privacy legislation? Do we let the California law go into effect and let states pass their own laws? How is that going to affect the economy? How does it affect consumer rights? There are so many questions we have to ask ourselves, and there's no clear solution that both parties can agree on, not yet at least. We have a little bit more time left, but I would ask you: what is the failure of the notice-and-choice model that we have right now?

Oh man, how much time do you have? So in this paper, I go through not just the failures of notice and choice, but the other failures of American privacy law and regulation that require a new approach to the ecosystem. In terms of notice and choice, we have this fiction that if you provide people with a boilerplate notice of your data use and collection policies, then they are given sufficient information to weigh the risks and make decisions accordingly, when in reality, you know, there's the sheer volume of privacy policies and privacy choices that we encounter each day.
There are privacy settings and other things. Privacy policies are confusingly written, usually at a reading level that far exceeds that of the average American; they're written in confusing legalese. They often don't disclose all the risks that a person needs to know, because the company can't predict the future, or because the privacy policy is hiding what the company doesn't want to tell people. And then, on top of that, people have basic cognitive limitations that make notice and choice an insufficient way for people to protect themselves in the online ecosystem. People are generally pretty bad at ascertaining risks. There's a phenomenon known as hyperbolic discounting: we tend to opt for short-term rewards over longer-term rewards, things like using public Wi-Fi to immediately log on, even though we know it's a bad idea. There are so many ways in which notice and choice does not enable people to protect themselves. Not to mention the fact that there's the notice, and then the choice, which would imply either that the company actually gives you a choice, or that if you say, all right, blanket privacy policy, company I don't want to use, you can then go use an alternative to this product or service, which also doesn't usually exist. So in twenty-six ways to Sunday, notice and choice is not an effective way for people to make privacy decisions and generally protect themselves online.

The Europeans were a little faster than the United States, and they have passed the General Data Protection Regulation. What is special about that regulation, and why don't we just adopt what they did?

Yes, so that's basically part of what I tried to grapple with in the paper. The symposium that I wrote it for was about the GDPR, and I'd been reading about Professor Balkin's idea of the information fiduciary.
I was kind of trying to square the philosophical reasons why you would want that, as opposed to the GDPR's conception of privacy as a fundamental right. And frankly, the GDPR does a lot of great things, and an information fiduciary framework isn't mutually exclusive with the rights and approaches that the GDPR takes. So, for instance, a lot of the individual rights, I think, you could sweep in under the duty of loyalty or the duty of care. The GDPR provides various avenues for individuals to actually vindicate their claims and get into court; an information fiduciary bill could have a private right of action, or could provide for both the FTC and the states to vindicate claims, that kind of thing. The GDPR takes a very deliberate and strong approach to enforcement, which an information fiduciary bill would absolutely have to do, which any privacy bill has to do. We can't make these sweeping, grand pronouncements about how important privacy is without actually providing the incentives for companies to abide by it. The GDPR builds on a constitutional right to privacy that US law doesn't have, and that in many ways elevates the conversation over privacy in Europe, in addition to giving it a legal underpinning, or a stronger legal basis. Whereas in the US, when it comes to consumer privacy, there tends to be this narrative of privacy as a good, which means that people should be able to trade it away under any circumstances; there is no moral imperative of protecting people when it comes to consumer privacy. What I liked about information fiduciaries is that, even without a constitutional right to privacy, it treats private entities as fiduciaries. The fiduciary framework was developed in a commercial context, so it takes the idea that you have professionals performing their trade.
But at the same time, their rights and prerogatives need to be limited in light of the incentives they have to abuse the vulnerabilities of their clients. And if you take that approach to data collectors, who are trusted with people's sensitive information, there's an asymmetry of power, and there are incentives to abuse that power, because no one's making them abide by the law, and the law sets the bar very low. That adds a moral valence to the idea of consumer privacy that US law, with its idea of privacy as something you should be able to trade away under all circumstances, currently lacks.

As I understand it, the information fiduciary concept would be a step toward more of a European philosophical approach to privacy. So it would take us from privacy as a good and kind of move us forward in the direction of this being more of a right, but not be fully there. Is that sort of the claim you're making in the paper?

I think you can look at it a number of ways. The paper grapples with basic realities, but it mostly sets out an ideal set of circumstances, what I think should happen, not what kind of parameters you need to fulfill within the House we have, the Senate we have, et cetera. I think that any system of regulating privacy that elevates the idea of privacy against companies moves us toward privacy as a right. It doesn't create the same kind of textual right to both privacy and data protection that Europe has, but you also don't need it. So you could say, kind of in this philosophical sense, yeah, it moves us more toward a European understanding of privacy, in that the Europeans tend to think privacy is important, and there's a narrative in American policy discussions that it's not. But I think also, just by virtue of how loaded the idea of "the GDPR is taking over American privacy law" is,
it's one way of looking at it; it's not necessarily determinative.

I see. So you care more about what protections should be put in place, versus the cultural and philosophical thinking that we as a society would have about privacy?

I wouldn't say that, more that I agree with your framing, but we also don't need to say "this makes us more European," by virtue of the fact that a lot of people tend to read that and run in twenty-seven different directions. I read that and think, oh, you mean the people who think that protecting people online is important? Great. But, you know, I'm talking semantics.

My guess is it's because the discussion has been so intense, and people are separating into camps, and there's crossfire. So I think, yeah, using trigger words like "right" is definitely not helping anyone move the needle. Now, how would the information fiduciary concept actually apply in the privacy framework? What did you propose?

So there are a couple of different ideas. The whole paper is based off of Jack Balkin's work, and also Jonathan Zittrain's, so their thinking was obviously very influential, and I was building off of that. One thing that I think is important is that it has to be a compulsory classification. Balkin and Zittrain's setup, or rather what they propose, is an opt-in framework, which I think, given the current incentives of the ecosystem, namely "collect first, ask questions later," and regulators working with inadequate tools while also not doing as much as they could, won't be enough to fundamentally reset the balance, given how far it is skewed toward corporate interests and away from individual rights and protections. So I think you apply it on a compulsory basis. Second: who does it apply to?
So if we're talking about an FTC-enforceable framework, then you either have to change what their jurisdiction typically encompasses, or accept the limitation that, no, this doesn't apply to common carriers. I think that in order for this, or for any comprehensive privacy law, to be effective, it has to apply to the entities that collect data, period. There are ways to think carefully about, say, a forgiveness window for small businesses, where you get one warning and you get off, but they shouldn't necessarily be completely exempted, because, of course, small businesses can still violate your privacy. But yes, I think it should apply across the board. And then after that, there are a number of different components. In general, a fiduciary framework includes a duty of care, a duty of loyalty, and confidentiality, and these can be interpreted or implemented in a range of different ways. One of the reasons I like a fiduciary framework is that the duties of care and loyalty, a little bit less so confidentiality, capture a broader sense of the digital vulnerabilities and the ways in which we are at risk in using networked technologies. So I think that a fiduciary framework encompassing manipulation, and something like an anti-discrimination principle, is crucial in considering how to regulate the way that tech affects us. We're talking about privacy law, but a sense of privacy as just disclosure of information, I think, is too limited considering the ways in which online services and products impact our lives.
So that's one of the reasons why I like the fiduciary framework: it is capable of encompassing this broader approach to both digital harms and privacy harms. One of the things outlined in the paper as a problem of our ecosystem is an overly narrow approach to what a privacy harm is, an undue focus on physical harms or monetary harms. That's starting to shift, and we have a better understanding of privacy harms that pertain to, you know, dignity and anxiety. But that's one of the reasons why I think fiduciaries are a good approach.

Your article mentions the enforcement mechanisms for traditional fiduciary duties, but how would enforcement of information fiduciary duties be done?

Yeah. So I think any privacy bill, in order to really reset this very skewed balance, has to have meaningful enforcement and meaningful penalties. There are a number of issues here. One: the primary data protection agency in the United States, the FTC, has jurisdiction that is circumscribed to not entirely include common carriers, so that's an issue, or nonprofits. There have also been some critiques of a cultural reticence to enforce the authority that they do have. So if I'm designing my magical, idealized, utopian privacy law that will never happen, I would probably build a digital agency from the ground up. In the world in which we currently live, I would give quite a bit more money, people, rulemaking authority, and civil penalty authority to the FTC, and ensure that once we set a law on the books saying privacy is important, don't violate this, there's actually going to be someone to stand up and say to companies: if you do violate this, something will happen; this is not abstract. So building up the FTC's role, again, not in my ideal,
my ideal privacy law, but my privacy law for the world in which we live. Other avenues to enforcement are also important: giving authority to the state AGs to enforce, and I also think a private right of action is important, when we're talking about how difficult it is for privacy rights to be vindicated and how little incentive companies currently have to respect the law. One conception I liked of how to frame the issue came from a paper on the GDPR by Chris Hoofnagle and, forgive me, my memory is failing, two co-authors. What they wrote was that the GDPR is an attempt to put privacy law on the same level of gravity as antitrust and corrupt practices law, and to make companies actually see them in the same light, as opposed to "this is the law we have to take seriously, and this is the law we don't." And I think that enforcement and real penalties are an absolutely crucial part of making sure of that.

All right, so one of the last questions I had on this topic was about harms. In your article, if I remember correctly, you talk about how we used to think about harms, and what harm, in your opinion, should mean now, when we apply this model of information fiduciaries and as we move forward in trying to update and reform our privacy laws and regulations. Do you want to elaborate a little bit on that?

Yeah. One of the things I wrote about in the paper, and one of my reasons for liking the conception of a fiduciary framework, is that it, one, creates the presumption of there being a duty to you, as opposed to the presumption that there is none. And within that, it also creates a broader sense of harm as it pertains to privacy and to the digital risks that we encounter in using online services. So manipulation, discrimination, things like "Airbnb while Black," digital redlining.
And I think, in understanding how we conceptualize harms, in privacy policymaking and scholarship there tends to be a bit of a higher bar than is set in other areas, to, I don't even know, convince people that it's real. It's taken a while, but incrementally it's becoming more a part of our understanding. In all walks of life, computers aren't really an optional part of living in the world, and people have a better sense of how digital harms can impact you. The view of someone sitting in a room saying, "Oh, well, you don't have to use Facebook, whatever happens to you is your fault," is an increasingly fringe view. And we've had thoughtful, brilliant academics and policymakers thinking through these problems for twenty, almost thirty years now, and I'm confident that we would be able to craft sufficiently ambitious and meaningful, but limited and not vague, definitions of the kinds of harms this law would target.

All right, so stay tuned, guys, for that. Is there anything, as a final thought to wrap up your article, that you'd want our listeners to take away?

Yeah. One thing that I do want to emphasize is that, even though my article somewhat counterposes them, US privacy law, the GDPR, and information fiduciaries are three things, and they are different. A fiduciary approach doesn't preclude a lot of the really great approaches that the GDPR takes. Another is that there have been some pretty fair critiques of the idea of information fiduciaries:
that it's a vague, kind of wishy-washy approach to privacy regulation that will create, or rather strengthen, the illusion that tech companies have your best interests at heart. One of the critiques that is very well thought out, and that I urge people to read, is by Lina Khan and David Pozen. I regret that their paper came out when mine was pretty late in the publication process, but I tried to incorporate it. One of the things they address is the idea that conflicts are simply too inherent to the business model of companies like Facebook, of ad tech companies, for a fiduciary framework to really have any impact. My feeling there is that when you're looking at the context of fiduciaries in other frameworks, in law and in medicine, those conflicts are pretty deep-seated and ongoing as well. In medicine, you have the problem of pharmaceutical reps influencing prescribing practices, or doctors recommending medical devices that they have a stake in. It's not as though the fiduciary conception of consumer protection is unable to deal with perpetual conflicts.

And the other critique is that in their paper they point out that in most states, I think in all of them, state corporate law says shareholder duties come before anything. Wouldn't that trump the fiduciary duties?

Yeah, I think it would. But that's why you say in your federal, or state, whichever, fiduciary law: this preempts, this comes before shareholder duties. I think that's a solvable problem.

I see. Well, we're obviously going to link to your paper in the show notes; people can read it for themselves and contextualize my rambling. We'll also link to your Twitter profile. You're active on Twitter, and you answer questions.
So I'm sure that would be a way for them to contact you without overwhelming you with emails, since you're quite active there.

For our tech segment, I want to start by asking you: how did you end up doing tech policy? Was it because you were just interested in privacy, and the next horizon of privacy is the digital era, or was there some sequence of events that led you here?

It was pretty serendipitous. I had a wonderful professor for criminal procedure, Julie O'Sullivan, and she really made the Fourth Amendment cases come alive. I was at my 1L summer job, which was completely unrelated to tech, doing research for the on-campus interviewing process, and realized: wait, privacy is an area of law that lawyers do? I can go do that. And then I just started taking all the classes, interning at places, and learning as much as I possibly could. That happened to coincide, luckily, with a time when Georgetown Law was really starting to double down on its tech offerings and trying to make the school a robust community for privacy and technology, and to their credit, I mean, I'm biased because I work there now, but I think they've really succeeded. It's an area that I find deeply, deeply important, but also intellectually stimulating and creative, and, yeah, I haven't looked back, and I love it.

And that's how I met Lindsey. I think we've known each other for a little over a year, which in DC time is like seven. And in a lot of the wonderful and exciting conversations that you and I have had, we came up with an idea that I think our listeners will be excited about. So Lindsey and I were talking about the dynamics in the tech world in general, gender dynamics, and the difference between Silicon Valley and DC, and the obvious
Difference between women who are in stem who are engine nears or scientists and women who are going to polcy of there are a lot of differences between the two categories, but also a lot of similarities, because obviously, trainer dynamics don't really change that much on given any issue area or any marketer, any geographical spot on the American map, and broken Saul bad. It's, it's all it's all not amazing. And so we were thinking and we were inspired by one foot woman. Until valley few them who came forward with kind of disclosing information about their incomes and salaries, but they have an creating this community of women who shared that information to kind of support each other and give it to our even just kind of a Mark, or some kind of a leading guide Verizon yet, basis comparising on what they made a many years of experience. They had and how did it all fact the company, but they worked at. So we figured that, which it start our own little kind of similar project to that of or, please, send us information, we're going to contact people that we know women in tech that we know to start the soft and we need allies and men to disclose information to this will be old, the identified, it would have just general formation about years of experience, that you have level of degree kind of just industry or. Sub in category of a place where you work and your income. Yeah, I it's I can't just yeah, it's, it's so important for people to kind of have a any kind of resource to understand where they are in terms of what they're making in what the what the standards standards are because it's very easy. You know, specifics speaking from experience, it's very easy to get screwed over without even realize it's happening to you. And you know, there are plenty of places that are well, intentioned and others that. Less. 
sub-category of the place where you work, and your income.

Yeah, it's so important for people to have some kind of resource to understand where they are in terms of what they're making and what the standards are, because, speaking from experience, it's very easy to get screwed over without even realizing it's happening to you. And there are plenty of places that are well-intentioned, and others that are less so. We want to do whatever we can to even things out here. We want to give people a way to understand where they are, and have a little more insight into what they're worth and what they're able to bargain with. Our plan is, at some point, to have a non-editable spreadsheet. We won't release it until we have a sufficient volume of responses so that it's not identifiable; with only five people, it wouldn't matter that names aren't there. We want to ensure people's privacy, and we also want to give people starting out, people in their mid-career, whoever, but particularly people who are starting out and don't have a deep network or a deep understanding of what's okay and what is acceptable, a way to know where they are and have a sense of their worth, what they should be asking for, and how they're being treated.

There's definitely a sense of community among women who work in DC, and in general among women who work in tech policy and women who work in DC nonprofits. We want to bring that together and not only have these discussions, network, and help each other in real life, but also have some kind of tangible piece of information that generations to come can use, update, and look back on as a historical artifact at some point. So stay tuned. We are going to announce it around the time that this episode is published, and we're going to give you all the information you need to contact us or ask questions.

And I can't emphasize enough, one of the animating thoughts behind this was women in tech policy, but something we're certainly cognizant of is that we need male allies to also contribute. People don't know they're getting paid at disparate rates until somebody tells them.
And yeah, we just hope that you'll think about being part of this.

Amazing. Lindsey, thank you so much for coming. I'm sure this is your first time on the podcast, but not the last, and we are very excited about the work you do at your clinic, and would definitely want to have you back to talk about that. I know the local Philly affiliate of Fox News has already gotten you on, but we definitely have so many topics to discuss, and twenty nineteen is the Super Bowl for privacy lawyers, so we would love to have you, you know, on our team.

I'd love to be back.

Thank you for listening, and please subscribe to TechFreedom, and leave us a review wherever you find the show. Have a good one.

Have a good one.

The Tech Policy Podcast is produced and distributed by TechFreedom, a nonpartisan, nonprofit think tank in Washington, DC. To learn more about our work, make a tax-deductible donation, or find other episodes, visit us online at techfreedom.org.
