Technology and Race with Ruha Benjamin
UNSPOILED is back for season two. Paul and Amy tackled The One Hundred, and now they're making their own list, starting with back-to-school movies. Check out the first episode, on Mean Girls, right now on Stitcher, Apple Podcasts, or wherever you get your podcasts.

Are you one of those people who think it's okay to drive stoned? What's the worst that can happen, you end up driving below the speed limit? It's no big deal, right? Wrong. The truth is, your reaction time slows way down when you're high. You not only put yourself in danger, but everyone around you. Talk about a buzzkill. Stop kidding yourself: it is not okay to drive high. So look, if you've been using marijuana in any form, do not get behind the wheel. Please: if you feel different, you drive different. Drive high, get a DUI.

It's easy to see why Apartments.com says they're the best site for renters, because with virtual tours it's easy to find a new space without leaving your place. Virtually view floor plans and amenities with 3D tours, and explore HD videos and photos, including neighborhood images and even drone footage of exterior views, all from the comfort of your couch, because we know that is where you are spending a lot of time lately. They make it that easy. Visit Apartments.com and get into a new rental today. Apartments.com: the most popular place to find a place.

Hello, everyone. Welcome to Factually, I'm Adam Conover. For decades we've heard about how technology will change the world, right? In the nineties, it truly seemed the Internet had revolutionary, even utopian potential. We were all told, and we believed, that it would break down the barriers that separate people and information. We were told tech would fix all of our problems: it would solve educational access with free massive open online classes; it would eliminate discrimination, because no one could tell who was who online, so we would all compete on the merits.
We even thought that tech would be strong enough to topple dictatorships. Remember the Arab Spring? The media literally told us that Twitter was going to be stronger than decades of authoritarianism, from Tehran to Cairo. And this was back when Twitter only had a hundred and forty characters; each of those characters was doing a lot of heavy lifting. Well, suffice it to say, if you're living in the same present that I am, none of these dreams has come true. A decade later, the democratic gains of the Arab Spring have largely been rolled back, and authoritarian governments have proved to be even more adept at harnessing the communicative power of the Internet than democratically minded citizens are. And in America? Well, let's just say our utopia of technologically secured equality has not arrived yet. Let's start with the claim that technology would be a boon for education. That sure would be useful right now, right? At a time when students and teachers are unable to gather together in the same place, surely, after decades of talk and innovation on the idea of online learning, we should be ready for this moment, right? Well, unfortunately, we are not, because the digital revolution has not come equally to America. According to the Pew Research Center, about fifteen percent of all households with school-aged children lack a high-speed Internet connection. You know, it's pretty hard to log into the Zoom with your teacher when you're stuck guessing the neighbor's WiFi password. And the reason for this gap, this digital divide, is that rather than our government exerting its power to make sure the Internet in this country was built fairly, we've allowed corporations to do it for us with very little oversight. The result is that wealthier areas and populations have been prioritized throughout the history of the Internet, while lower-income neighborhoods have had to wait, or have never received a broadband Internet connection at all.
Advocates call this digital redlining, because just like residential redlining, it's had a disparate racial impact: black Americans are less likely than white Americans to have a broadband connection at home. A future of Internet equality this is not. Now look, if you listen to the show, you know that being skeptical of the promises of the techno-utopians is not new to me. It's a recurring theme on the show. But I don't want you to think that I'm turning into a technophobe either. I'm not here to tell you that technology is actually bad, because the truth is, the source of the problem is deeper. The truth is that technology is not produced in a vacuum by abstract-thinking techno-innovators with their minds up in the cloud. No, technology is produced by a society, and when a society is based in injustice and inequity, our tech ends up reflecting and reproducing that injustice and inequity. So if you start with a racist police system, we shouldn't be surprised when the algorithms of predictive policing produce racist results. Why wouldn't they? The technology arises from the society. So it follows that if we wanna fix our society, we're gonna need more than just technological fixes. Giving children in a broken school system new laptops is not going to be as effective as reforming their school system, so that students don't receive a poor education just because of what zip code they live in. How about addressing the poverty that those students return home to every afternoon? A free iPad doesn't fill up the empty belly that makes it hard for students to focus in class. None of this is to say that technology is useless, but on its own it'll always be insufficient, because social problems need social solutions. Well, to talk about how technology, when improperly used and deployed, ends up reproducing rather than solving social problems, our guest today is Ruha Benjamin. She's a professor at Princeton and the author of Race After Technology: Abolitionist Tools for the New Jim Code.
This interview was so much fun. I found her incredibly fascinating and lively to talk to. You're gonna love it. Please welcome Ruha Benjamin.

Ruha, thank you so much for being here.

My pleasure. Thank you so much for inviting me.

So look, I grew up in the middle of the tech revolution, right? And with all these ideas of techno-utopianism: that the Internet was gonna solve every social problem. "On the Internet, no one knows you're a dog," the famous New Yorker cartoon, right? Algorithms will bring in a world of equity and fairness. But then over the last decade there's been a lot of criticism of that point of view, and I think you're part of that criticism. Could you talk about that, and could you talk about this phrase that you've coined, the New Jim Code? I'm really curious about this.

Absolutely. So there are two dominant stories that we often tell about technology. One sort of goes back to that phrase, techno-utopian, and that's the Kool-Aid that Silicon Valley's trying to sell us, and it's about, you know, the robots are going to save us. They're gonna, you know, make things more efficient, they're going to be more equitable, all the good things. But there's another story that we're also accustomed to hearing, the techno-dystopia story, which is that the robots and the technology are gonna enslave us, right? It's going to destroy humanity, take away agency, take all the jobs. And so although on the surface they seem like opposing stories, like, the surface is pretty different, when you dig down they share an underlying logic: that technology is in the driver's seat. We're either going to be helped or harmed by technology, but the human agency and agents behind the screen get lost from the narrative. And so partly, one of the things we have to begin to do is to tell different stories about technology, that recoup the elements of power and agency that are already there.
But part of the issue that I take is that right now, a small sliver of humanity is doing that imaginative work and doing that design work and that programming work, and so their worldviews, their imagination, is being embedded into our physical and digital infrastructures. So part of what we have to do is explode that monopoly, and really make it much more participatory, democratic. And to do that, we have to think about the existing power relations, which brings us to the New Jim Code, which, you know, will sound kind of familiar for those who've, say, read Michelle Alexander's critique of mass incarceration, which she terms the New Jim Crow, which itself is a riff off of the way that we've talked about white supremacy and racial segregation in this nation, when we term the era of Jim Crow, of explicit, legalized segregation. And so in the same way that Michelle Alexander is trying to get us to see how mass incarceration continues to perpetuate social and racial control, my concept of the New Jim Code is trying to get us to think about how technology continues to do that work, how it hides so many forms of inequity under the guise of progress, under the shiny surfaces of AI systems, automated decision systems, machine learning, et cetera. And so that "code" part of it is key: it's coded, it's harder to discern. But hopefully now, with the growing language, as you mentioned, in the last few years, of people and organizations and movements shining a light on this phenomenon, we can start talking about it and pushing back against it.

Oh my gosh, there's so much in there that I wanna dive into. I mean, you had this incredible idea at the beginning about how we talk about technology as neutral, as something that's just coming like a force of nature that we almost have no control over. It's literally "the robots," it's literally "algorithms," and you're right that that de-centers the people who are making it, which is really interesting. Yeah.
Can you talk more about that?

Absolutely. And so I think part of the thing to realize is that so much oppression happens out of solutions to things. And so mass incarceration was a reform from the penal practice of just killing people: now, instead, we're going to hold you in cages. So the issue is, how do reforms, how do things that seem to present fixes to things, actually reproduce certain dynamics, certain social hierarchies, or forms of oppression? And so in that way, innovation goes hand in hand with inequity and oppression. So often we associate technological innovation with social progress; like, that's part of the Kool-Aid, to get us to believe that those two things are the same, when in fact innovation has long gone hand in hand with all manner of oppression. And so I often think about, like, the first person who put up a whites-only sign in their store: that was an innovation, right? Later they put up a neon sign. And like the first person who said, you know what, let me create a colored water fountain: like, that was a bright idea, and whoever did that, you know, it caught on. And so the things in hindsight that we think of as so backwards, so regressive, at the time they were innovative, which should get us thinking about what we now are so enamored with, you know, where we think, oh, this is the next right thing. And so that's why often, when I'm talking with folks, I refer to that Better Off Ted episode; it's titled "Racial Sensitivity." I love that show. This is a great pull; that show was canceled before its time. It was too subversive. There's this three-minute genius clip in which the company decides to install sensors all over the building, but the sensors don't detect darker skin, and none of the black employees can open the doors. They can't use the elevator, they can't use the water fountains. And what happens is, when they bring it to the CEO, the response is, oh, don't worry.
We'll fix this. We'll put in a manual water fountain for the darker-skinned employees, next to the one with sensors. Which to me is just such a great illustration of how, you know, you create all of these kind of splashy things to make life easier, and then, if it doesn't work for some people, we retreat back to this really iconic image of the manual water fountain, with a sign over the water fountain that says, this is for black employees, because they can't use anything else. And so again, getting us to think about our assumptions about what innovation is and what it does.

Well, what this connects to for me is, I've been working in the past on, for instance, we did an episode of my TV show called "Adam Ruins the Internet," and we open that with a critique of techno-pessimists, or people who say, oh no, this new technology is destroying our minds, right? Oh, our relationship to our cell phones is destroying human society. And we made the argument, hey, same as it ever was, right? This is what people get upset about with every form of technology. People said that about books: oh, they're a destructive new technology that'll destroy the younger generations. Literally, people said that about paperback books, and train travel, and the telegraph, et cetera. And we were saying, no, technological innovation has always been with us, and there's always been someone saying the sky is falling. But the argument that you're making is the same argument, just the obverse: that, hey, technology also doesn't fundamentally change our society the way we think it does. If we've got a built-in power structure, where some people hold power and some people don't, then likely the technologies that that society creates are gonna just keep reinforcing that power structure. Does that track for you?

It definitely does. To the first part of your comment, I was thinking about a meme that shows, maybe it's like the nineteen fifties,
like a train scene, a city train, and all the guys are holding up their newspapers and not talking to each other, and it's like, see, people never talked to each other; it's not the phone, it's not the phones that did it. But certainly this idea that it's not the technology that's inaugurating these forms of sort of antisocial, you know, antisocial behavior or inequity. It's the larger ecosystem, it's the social inputs, that actually continue, one era after the other, to produce the same predictable outcomes. And so rather than just saying, you know, get rid of all the technology, let's look at ourselves. Like, really, let's use this black mirror, as it were, to shine a light on what we take for granted about our social order, and think about addressing the root issues rather than just trying to throw the technology out.

Yeah, when we see those stories, like you said, the Better Off Ted episode, which was, that was like two thousand nine or something, that was very ahead of its time. It was like five years later we started having actual stories like that come out all the time in the press, about this or that facial recognition technology not recognizing people of certain races, et cetera. We see those stories and say, wow, bad technology. But that's not really a story about the technology; that's a story about our society, which built the technology.

Absolutely. There's a wonderful line there, where one of the bosses says, this is not racism; it's not racism because we're not targeting black people, we're just ignoring them. And for me, that's just a perfect expression of so much of corporate diversity culture, and all of the things that have less to do with the sensors and the technology, but really how people operate: so much indifference continues to perpetuate it. It's not the big bad boogeyman behind the screen, like, let me get them. It's, I don't give a damn. I don't really care about them.
It's that kind of indifference that allows the wheels to keep turning.

Well, so let's talk about that, because, the way you put that, and that's a great line, is "I don't see color." Like, I literally don't see it; my tech, the algorithm, is designed so it cannot see it. But the thing about "I don't see color," or these sort of neutral ways of doing it, is that it seems a lot less pernicious than the racism, or the racist structures, that we were taught about in sixth grade, right?

So we have an image of, like, you know, the police officers siccing the dogs on the black youth, and the water fountains. Like, really, our imagination of what counts as racism requires the white hoods, requires a snarling white man who is out to get you, when the vast majority of racist systems just rely on us clocking in and out, like, just doing our job, like, put your head down and just follow the rules. And so there was a really great study, last year around this time, that came out, that was looking at healthcare algorithms, this widely used healthcare algorithm. Basically, it was like a digital triaging system: you come in as a patient, and it'll tell the healthcare provider whether you're high risk or low risk. And if you're high risk, they use this particular algorithm to get you more resources, to prevent whatever, like, bad things are predicted to happen to you, based on some sort of past data of people like you. And what the researchers who studied this algorithm found is that it was flagging white patients at a much higher rate to receive these coveted resources, like more time with your doctor, more, like, services outside of the hospital; it's basically designed to help people stay out of the hospital. And black patients who were sicker were not getting flagged for these services. And so this particular algorithm, in many ways, was carrying out the work of a whites-only sign at a hospital, like, you can't come in here.
But it wasn't doing it with a sign; it was through this neutral system, right? And so when the researchers opened it up to figure out, like, what's happening here, why is this system, like, out to get these black patients, this system was in fact race-neutral. It wasn't keeping track of race at all. Instead, it was using a proxy variable: it was using healthcare costs, like, how much had been spent on particular patients. That was used as a proxy: if we spent more on you, that means that you're higher risk, like, you're more likely to get sick. But we have a system in which people who need care can't get to the hospital, don't have insurance. So lots of sick people aren't getting anything spent on them, which makes the system think, oh, you know what, you're fine, you're low risk. So by using cost as a proxy for healthcare need, this system was perpetuating this past inequality and projecting it into the future, under the guise of, like, a neutral system. And so, for years, black patients were not getting these coveted services, because the healthcare professionals were relying on this system, called Optum. And so this is just one of many examples where really we see that race neutrality, or, as you say, blindness, can be deadly, because we're ignoring the past, right? The data that's being used to train the system has all of these patterns of both institutional and interpersonal discrimination, because both are at work in our healthcare system: individuals in one way, and, in another, the way the policy and the insurance are structured. But it was used as if it were just a straightforward reflection of reality, teaching this algorithm to make more decisions like this. And so this is what we get when we ignore history, when we ignore social inequality, in the building of technical systems.

And the people building that algorithm probably had no idea what they were doing. They were like, hey, we're just making a little cost algorithm here.
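[Editor's note: the proxy-variable failure described here can be sketched in a few lines of code. This is a toy simulation added for illustration only, not the actual system or the study's data; the group names, spending rates, and flagging threshold are all invented assumptions. Two groups have identical underlying health need, but one has historically had less spent on its care, and a triage rule that ranks patients by past cost ends up flagging almost nobody from the underserved group.]

```python
import random

random.seed(0)

def simulate_patients(n=10_000):
    # Toy population: groups A and B have IDENTICAL underlying health
    # need, but group B has historically had half as much spent on its
    # care (standing in for access and insurance barriers). All names
    # and numbers here are illustrative assumptions, not real data.
    patients = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        need = random.uniform(0, 1)                 # true healthcare need
        spending_rate = 1.0 if group == "A" else 0.5
        past_cost = need * spending_rate * 10_000   # dollars spent historically
        patients.append({"group": group, "need": need, "cost": past_cost})
    return patients

def flag_high_risk_by_cost(patients, top_fraction=0.2):
    # The flawed triage rule: treat past COST as a proxy for NEED and
    # flag the costliest slice of patients for extra services.
    ranked = sorted(patients, key=lambda p: p["cost"], reverse=True)
    return ranked[:int(len(ranked) * top_fraction)]

patients = simulate_patients()
flagged = flag_high_risk_by_cost(patients)
share_b = sum(p["group"] == "B" for p in flagged) / len(flagged)
print(f"Group B is about half of all patients but {share_b:.0%} of those flagged")
```

Even though the rule never looks at group membership, the historical spending gap flows straight through the cost proxy into the flagging decision, which is the dynamic the guest describes.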
Probably, if you told them, they'd be like, no, that wasn't racist.

That's a direct quote from after this study came out. That's a direct quote from, like, a million articles, where they were like, we didn't try to do this; this is not what we meant to do. But again, we go back to Better Off Ted: indifference is really one of the main drivers of racism. And so hiding behind the kind of "I'm not racist, but," you know, whatever comes next is probably going to be.

Yeah. Are there other examples of this that stand out to you?

In almost every significant area of our lives where important decisions are being made, those human beings, you know, the gatekeepers, are outsourcing those decisions and consulting technical systems as if they're neutral. In our penal system, it's everywhere, at every single stage, from who gets policed in the first place, to who's paroled, to, you know, every single stage. We have risk assessment tools that are being employed that are deeply racially biased; the studies are starting to pile up that have been auditing these. In fact, now, during the pandemic, there was a system called PATTERN that was being used to decide who would be released, because, you know, our prisons are overcrowded, and so COVID is running around. So they're like, okay, we've got to figure out who to let out, so that we can sort of, you know, deal with the overcrowding. And so some places use this PATTERN assessment, again, to decide risk, like, who's low risk or high risk. And of the people who were deemed low risk, the vast majority were white: white men who were inmates. People who were homeless were high risk, black men were high risk, people with mental illness. So all of the most marginal in this already marginal and oppressed population were deemed high risk, and so were kept in cages, where others were released. And the difference was something like thirty-plus percent of white men were deemed
eligible for early release, versus like six or seven percent of black men. So this, PATTERN, this was during the pandemic, but those kinds of risk assessment tools are being used up and down the penal system. In our education system there are examples; when it comes to getting, like, a home loan, or other kinds of loans. And so any area of our lives that you can think of, where people are making decisions about a lot of people at once, this kind of New Jim Code discrimination is happening.

Yeah, and just mentioning housing, like, there seems to be a connection back here, like, to redlining, for example, which I've talked about, you know, extensively on our show and on this podcast before. Because that was, like, for me, really learning about the history of it, and the massive effect it had on American society, that's to me the most vivid example of how, you know, the structure of our society can be set up in a racist way that you can end up perpetuating without even knowing it today. Like, those restrictive covenants that they literally had in the suburbs, in, like, Levittown: only Caucasians will be given these mortgages. Of course, that was overt racism. But then, hey, forty years later, well, now, if you're just continuing to operate on, hey, what are the home values of the neighborhood, what is the neighborhood, quote, crime rate, et cetera, then you end up perpetuating that original sin, right?

Right. And all along, they had to go do their little office job at the bank, like, fill out the little forms. I love to show people the actual bureaucratic forms that go into that, to sort of demystify it, because it's not like a big bad banker standing in front, like, no, get out of here, we don't want you. It's like a little bureaucrat sitting there filling out: this many Italians live in the neighborhood, this many Negroes, this many people on welfare. Calculate, calculate.
Sorry, you can't get a home loan to move in here. So, like, for me, the forms of harm that we associate with, kind of, like, these scary men of the past are often carried out by people, again, just putting their head down and doing their job. In my mom's neighborhood in Los Angeles, Leimert Park, I dug up this flyer from the nineteen forties, when that neighborhood was getting developed, and housing developers were trying to entice families to move there, and they put up flyers that basically said, come live here, we have beneficial restrictions, your investment will be secured. And of course, "beneficial restrictions" was a reference to racially restrictive covenants. So they were basically telling them: you buy your house here, you have these covenants that ensure that the neighborhood will stay white, and your investment will be secured. Now, I was following that kind of rabbit hole, and I learned about a black family that was trying to move into the neighborhood, the Wilsons. And when they did, the homeowners' association rallied around, and there was a white family that wanted to sell their house to the Wilsons, and the homeowners' association sued that white family, like, no, you're gonna mess up the neighborhood. And it's so interesting when you read, like, the interview where the main plaintiff from the homeowners' association says, I'm not motivated by any racial animus; this is strictly an economic, you know, issue. And so even in the nineteen forties, he's like, I'm not racist, but this black family is not moving into this neighborhood. And so, like, that rhetoric that we are familiar with now, it's been with us for a while. People don't wanna own, like, the ill feeling, but they own the economic motivation: the idea that racism is productive, that it doesn't just harm people. Yeah.
It actually garners wealth and status and all the good things of life to those who are perpetuating it. And so even this guy back in the forties was like, you know what, I don't give a damn about the Wilsons, but you're not going to mess up my, you know, my property investment. The last thing I'll say about this story, the arc, is that at least there was a reverend and a rabbi who went door to door, knocking on people's doors and talking to all the neighbors, and saying, this is terrible, we shouldn't be doing this to this family, we should let the Wilsons move in. And so they went and kind of did this, like, labor of trying to prevail on the folks in their neighborhood that this adherence to white supremacy, however sort of hidden behind the language, the legalese, of this lawsuit, was not, like, the values that we should be upholding. And ultimately, through their efforts and others', the homeowners' association dropped the suit. The Wilsons moved in; there was, like, a party for them. And so this is just one example: we don't always have to wait for laws to change to start to, like, force changes in our relations in our own backyard. And to me, the example of this rabbi and reverend, who were, like, not having it, who were like, we're not waiting for the federal government, but our neighborhood is not going to do this, that's really, I think, a call to action for all of us.

It does raise the question for me, though, we were talking earlier in this conversation about, and by the way, that's an incredible story, thank you for sharing it, I'm glad it had a somewhat happy ending. But we were drawing this contrast between the, quote, neutral algorithms today and the siccing of the dogs on the, on black folks, you know, in the old days. But what you're describing is a story that doesn't sound too dissimilar to right now. I mean, we still have homeowners' associations, and there are homeowners' associations that still use cloaked
economic arguments to keep out black people. And guess what, we still have cops physically attacking and brutalizing black people. So is there really a difference, or is it same as it ever was?

To me, I think that that insight is, like, the key, because it's not that there's been a transition from an old-timey racism to this newfangled racism. So one of the things that I'm really trying to trace in Race After Technology is the continuity: the fact that now computer codes are doing this work of coding that legal codes have long been doing. There's all kinds of ways in which this coding of racism, this embedding into our systems, other tools have been used for it before we've had fancy algorithms. And so that's again the kind of point we started with: it's not simply the technology that's inaugurating new forms of racism; it's providing a new kind of, like, twist on something that's been with us. And so one of the things I do is really show exactly how legal codes have done this. But even if we go to something a little less tangible, like, I think about the way that we culturally code our names. I start the book talking about people's first names, and how we often use them as a proxy for other qualities about people, and they're often used as a pretext to open doors and shut doors for people. And so there's a great audit study, from about two thousand two or three, in which two economists from the University of Chicago sent out thousands of resumes to employers in Boston and Chicago, and what they did was they just changed the first name. Like, some of the resumes had names like Emily and Greg, some, like, Lakisha and Jamal, and all the qualifications were the same, the number of years of education, all the things. And they waited to see who the employers would call back.
We wouldn't be surprised that those applicants with white-sounding names received many more callbacks and calls of interest. And the economists calculated this to be equal to the assumption that those white applicants had eight additional years of work experience that they didn't actually have; and they received fifty percent more callbacks. And so this is a way in which our names code certain assumptions about us, for good or bad, and people use them all the time to actually, like, provide opportunities for people, or not.

Now, someone hearing the results of that study might think, well, man, humans are crap. Like, we have this implicit bias, we're discriminating. So should we let computers make the decisions about employment?

Exactly. It's like, okay, acknowledging we're biased, and then saying, okay, let's let this AI-powered system, in which I sit here in front of the screen and it kind of tallies all of these data points. And then what these firms that are selling this do, they say, now we'll compare job seekers' scores to those of existing top-performing employees in our company, in order to decide who to flag as a desirable hire, or who to reject. And so again, the assumption is that this system, which is presumably created by human beings and had to be taught how to judge applicants, is somehow going to be more neutral, and in fact that doesn't turn out to be the case. Like, if you, in your own company, have been hiring mostly men, or mostly white guys, for the last fifty years, and that's your base, that's your standard for who a good employee is, and now you're judging everyone else according to that, however you code that, in terms of body language, posture, accent, and all the things that the AI systems keep track of,
you're likely going to get more of the same. And the danger is that people actually think that that system is more neutral than, say, a person looking through resumes and deciding, I don't want Jamal working here, you know.

Well, a simple way to put it is, like, I'm not an AI expert, but I've played around with AIs, I've talked to some AI experts on the show: you train them on data, right? Like, a very common form of AI now is a neural network. It's sort of a general learning machine. You give it a whole bunch of data, it sees patterns in the data, without you even quite knowing how it's doing it, right, and then it's able to tell the difference between A and B. Well, if you're training the system on a racist society, right, you're saying, let's train it on every employee in America, and, you know, how much money they make and how many skills they have, et cetera. Well, of course, you're gonna end up with an AI that says, oh yeah, white people are more qualified than black people, because in the system I currently have, those are all the lawyers and paralegals and accountants, right?

Right, and the more "intelligent" it gets, the more racist and sexist it becomes, because we're judging intelligence by how closely it mirrors human decision-making. So, like, this intelligence, quote unquote, is actually, like, the most racist and sexist version of human thinking. And so, in fact, a couple of years ago, Amazon's own hiring algorithm was weeding out women, precisely because their workforce is predominantly male. So it was, like, seeing these resumes with, like, you know, Laura or Tunisia, and was like, oh, this company doesn't want this, kick it out. But then, once they got rid of gendered names, it got "smarter" and started looking: okay, this applicant was in the women's chess club, throw that out; this applicant went to Bryn Mawr, throw that out. And then it started looking at how people talk about their work,
Kind of additives people use in the and we know through other social psychological research that that you know that has a gender dimension to it. The kind of language we use to describe our work. So got even more intelligent. It was like, okay like throw those people out and so eventually Amazon had to recall this whole thing. So Amazon can't get right then we. Pretty wary about like outsourcing these really important decision. Systems we sumer neutral. I WanNa ask you about what we could be doing otherwise but we got to take a quick break. We'll be right back with more Ruhollah Benjamin. WGC You offers a quality degree program that is affordable flexible and even makes it possible to graduate. You can earn a respected Bachelors or Masters degree on your own schedule for under eighty seven, hundred dollars per year fees included plus wgn news low flat tuition covers as many courses. As you complete each term, the faster you learn the more you'll save. So get your sixty five dollar application fee waived at WGC you dot edu slash factually that's WG you dot edu slash factually there are. So many questions that run through your mind when you're looking for a new place, is it close to pizza? Will My neighbors be Weirdos? Should I take my old couch with me or is it time to finally say goodbye it is getting a little musty dusty. These are all great questions but the one question apartments dot com make sure you don't have to ask is whether you've found the Right Place 'cause with more listings than anyone else. They say they make sure that no matter what you're looking for. You will find it. It doesn't matter whether you're upgrading downsizing, settling down bachelor padding or empty nesting. You always know you're seeing the very best of what's available if you don't believe him well, you could try asking the over forty million people apartments dot com has helped to find. 
Their new homes or other renters who have made apartments dot com, the most visited rental listings website but I do believe me when I tell you still questioning what do that couch it is time to let it go. All right now look I'm gonNA open up the site right now. Okay. So I can give you a personal example of how it looks. Let's see. Let's rentals in my area. Tight him. Zip. Code made it real nice easy. On my what this is, this map is covered in a forest, a little green map things that are each one of them different apartment what this map is covered with a forest a little green. Map thing as each one of them representing a different department, here's a two bedroom. Here's a one bedroom. Here's a studio. Here's a Oh my gosh. If I was looking for an apartment right now, this is a copa of apartment listings incredible and exactly as promised. So if you WANNA see a Cornucopia like that for yourself apartments dot com to find your next place apartments dot com the most popular place to find a place. What does twenty twenty mean for small business you have to do more with less suddenly every single higher is critical, but there are fewer resources to find the right people. Let me tell you I work in TV I do have my own small business in that a large industry and I have done my fair share of hiring meetings over video conference I do not love it well. If you feel similar indeed is here to help indeed, DOT COM is the number one job site in the world because indeed, it gets the best people fast unlike other sites indeed gives you a full control and payment flexibility over your hiring you only pay for what you need. You can pause your account at any time and there are no long term contracts plus indeed provides powerful tools. 
To make your search that much easier like sponsored jobs which are shown to be three and a half times more likely to result in a higher with seventy three percent of online job seekers visiting indeed each month indeed is GonNa get you the important higher you need just like they have for over three million businesses. So right now indeed is offering our listeners a free seventy five. Dollars Credit to boost your job post, which means more quality candidates will see it fast. So try indeed with a free seventy, five dollars credit at indeed dot com slash factually this is their best offer available anywhere. Don't go to on sites. All right. They're not going to beat this one. Go right now to indeed dot com slash factually terms and conditions apply offer valid through September thirtieth. Okay we're back with Ruhollah Benjamin I wanNA talk about. What we could do differently I you're a science fiction fan. Aren't you I know that you use science fiction your work yeah. You take a class this fall black mirror race at technology and justice. That sounds great. I wish I could take that class. Yeah Fun. It's GonNa be fun well. What is what is Why the focus on science fiction and and and what do you? What does it change? I'm sorry let me take that back. Let me let me add to that question Well, why the focus on science fiction and why is it important the question of who gets to imagine the future that we're going to have that that question of imagination I'm curious about why as we think our way towards a solution, why should say I? You know my earlier work before I got to all this stuff around Ai Algorithms was in the life sciences and so my first book was about stem cell research and regenerative medicine. 
So I was hanging out with all of these really fantastic scientists doing cutting-edge work — growing things like heart cells in a petri dish in a lab — so that, say, if your relative needed a heart transplant, rather than waiting for a donor, the hope is that one day we can reverse-engineer their own cells and grow them a heart from those cells, so that their body won't reject it. Right, cool — it's out of this world. And hanging around people who do this as their day job, it seems more like science fiction than anything. But I realized, talking to them, how many people's initial interest in science and technology was sparked by seeing a Star Trek episode, or by reading some fascinating thing in a book. From a young age those seeds are planted, and then they follow their pursuits, and some small slice of those people actually get the opportunities — the scholarships, the education, the mentors, the institutional affiliations — to take those ideas that sparked when they were young and materialize them in an actual lab, where now they're growing actual heart cells, or developing a scanner like the one on Star Trek: okay, let me figure out what's wrong with you. So for me it started with realizing how important imagination is to the things that end up becoming science and technology — for those individuals, and similarly for me.

But there was also a real lopsided investment in imagination. I would be one of the few social scientists in these spaces, and I would say, okay, that's great, that's nice, we'll be able to grow people's organs — now what about the fact that so many millions of people can't even get basic healthcare? Now we're in the middle of a pandemic and people can't even get a test for this deadly virus. How do we match up this great imaginative and economic investment in cutting-edge things with the lack of investment in basic social provisions, social safety nets, and healthcare? And oftentimes my questions around that would be cast as far-fetched: oh, we can't ever insure everyone; oh, there will always be people who have to die for the common good. My basic questions about public health and the common good — those were treated as the thing that was out of this world, that we'd never be able to do. So for me it's: how can we have so much optimism around biological regeneration or AI, and so little imagination and so little investment around our social ills and our social wellbeing? That's what animates me. Right now our collective imagination is being monopolized by the people who are able to do all this fancy shit, while people who just want to be able to take their kid to the ER when they crack their chin open riding their bike are the ones who have to sit there and hope the wound closes up on its own, because they can't afford to go.

Oh my gosh, that clarifies so much about the techno-utopian worldview for me. Because what it is, is these are folks who say our social problems are inherently unaddressable, right? They say racism, inequality, all these things — you can't fix those, they're too thorny, who's even going to try? What we need are technological fixes that skip right over them. That's the easy thing to do; that's the thing I can address. It's Elon Musk stuck in traffic. We beat up on him all the time on this show, but I really don't think this is gratuitous — I really think it's a great example.
He's sitting in traffic going, I hate being in traffic — I know, I'll build a tunnel so that rich people like me can skip the traffic and go right underneath. Now first of all, I don't know why the motherfucker doesn't buy a helicopter — he's rich enough; why is he in traffic to begin with? This problem has already been solved for rich people. Jeff Bezos is just in a helicopter; this guy's trying to build tunnels all over the place. But why can't Elon Musk — who has made his reputation as this visionary, imaginative guy thinking about the future of humanity, let's go to Mars, let's do this and that — why can't he imagine a world with no traffic? He sees traffic as an unsolvable social ill: how could you ever get rid of it, there are too many people, whatever. Why can't he ask the question: how could we get to a world with no traffic? Which would be a world with — I think, and hope — public transportation, more walkable cities, fewer people having to drive an hour to work because they can only afford to live out in the suburbs, so more affordable cities — all of those sorts of questions. But that whole sphere of questions, which also involves questions of race and inequality, gets put in the bucket of "unsolvable" by our technological class, because they can't imagine solving it — and they have no reason to.

And I love that you used that example, because the epigraph for my first book, People's Science, was from someone I interviewed, and she said: before we figure out a way to get to the moon, can we just make sure everyone on my block can get to work? Perfect. But again, it's really this lopsided investment, and part of what we have to do is push back on the inevitability of technology solving all of our problems. I was recently talking to a group of students.
They had organized a conference on innovation here at Princeton, and one of the other speakers was the guy behind Singularity University. He was beaming in through Zoom, and his talk came right before mine. He ended it by telling this auditorium full of students — painting a very Elon Musk-type picture of the future — this is what's happening: you either get on board or you get out of the way; you're irrelevant; you either sign on to this or figure out how to navigate around it. It was the inevitability of the future that he's invested in. So the first thing I said when I got up there was: he's wrong. He's wrong. You do have a choice. That is not inevitable. Because that's part of the anti-democratic underpinnings of the whole thing — they really don't want to hear from anyone else. This is their vision of the collective good, and anyone who raises questions about it or critiques it is painted as anti-technology or anti-science.

And the last thing I'll say about that: even the phrase we use — we call people Luddites — misrecognizes who the Luddites actually were. They were not against technology; they were against the social costs of technology. They were against the fact that the introduction of this machinery into industry was going to push down wages, was going to carry all of these social costs. They weren't against the machine itself; they were against the way the machine was going to reproduce power and inequality. So even that insult we throw around — "Luddite" — you might actually be more inclined to say, yeah, maybe I am one, because it's not about being against something; it's about saying we need to talk about the social costs of these technologies and do better.
Are there technologies, though — because we've been shitting on technology this whole conversation — are there technologies that fundamentally do shift the balance a little bit? I think about how our new communications technologies took down the media gatekeepers, and now people who were formerly marginalized are able to be loud, to spread their message, to connect. One of the side effects is that a lot of the people who were kept out of media are people we don't like — overt white nationalists, say. They didn't used to have a platform, and now they do as well. So there's a give and a take. But are there any technological advances, in your view, that did make a positive change?

Sure, there are quite a few, and the way I would characterize them is that it's not simply the technology itself that's so radical or subversive or liberatory. Before you even get to designing a particular technology, you have to identify: who is this for? What is it for? How is it actually going to intervene in business as usual? The subversiveness or the power of a given technology starts well before you start coding, well before you start designing and programming. It has to do with the question the technology is supposed to answer. Because the vast majority — especially when we're talking about risk assessments and automated decision systems — cast their view on the most vulnerable populations. They try to predict, for example, the riskiness of a youth getting in trouble at school, or of someone violating parole, or of someone not taking their meds. They look at individuals who are already vulnerable. The technologies I find most important actually flip the script: they point the technology — the data collection, the prediction — at those who wield power, those who are monopolizing resources and producing risk for the vast majority.

So for example, there's a great project called the Anti-Eviction Mapping Project — and going back, I think you've had Desmond on. Rather than a risk assessment that tells the landlord, okay, this renter is likely to default, it puts the tool in the hands of renters and people experiencing housing insecurity, so they can look at and judge real estate owners and landlords — see how these people are treating their tenants — and then mobilize and rally people together as part of a housing justice movement. So again, the technology doesn't save people. It gives them the tools and the data to see where and how the trouble is being produced, and then to move in that direction.

A second example along those lines is more of a parody, but I like it because it shows the absurdity of so many of the tools used by policing and our carceral system. It's called the White Collar Crime Early Warning System. It's a system that maps all of the places in cities where white-collar crimes are happening, and it includes an image of a prospective criminal. When they designed the algorithm behind that facial recognition system, they used the profile photos of seven thousand corporate executives on LinkedIn — and so, naturally, the face of the "criminal" is white and male. So here you have this data mapping, this facial recognition system, and it's throwing in our face the fact that this exact set of crimes and populations always goes under the radar. They always get to go through the tunnel, to go back to your example — and now they are the subject of this kind of surveillance.

Oh my gosh, I want that version of the Citizen app. Rather than a Citizen app that says, oh, there was an altercation with a knife a couple of blocks away — and you're like, well, I'm not there, so now I'm just scared for no reason — I want the Citizen app that says: hey, someone's embezzling downtown, beware! There's a landlord illegally converting an apartment into an Airbnb! Let's create the techno-dystopia for those in power — let's use technology that way, if we're going to use it at all.

In both of those cases, and many others, it really starts well before the design stage, with thinking about what we consider the problem that technology is supposed to solve. Too often the "problem" is the racialized community — the same old problem spaces — and technology needs to subvert that; we have some examples of how to do it.

Well, what this makes me think — going back to your point about the Luddites — is that your argument isn't anti-technology. It's anti-techno-utopianism, anti this view some hold that technology is neutral and is going to solve our problems. I think you'd argue technology is a tool: what we need to solve are our social issues, and we can use technology as a tool to do that, if we're mindful of it.

Absolutely — as long as we keep technology in its place, as long as we don't think that technology is going to save us. Really putting technology in its place means treating it not as a magical fix but as a tool — while also recognizing that this doesn't make any given tool neutral. Because if the point of a tool is to calculate the risk of someone who's been locked up in an unjust system, then it doesn't matter who's holding it: it was designed to calculate the risk of those individuals, and it's oppressive. So it has to do with rethinking the design process, so we can produce tools that empower communities rather than oppress them.
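The "flip the script" idea can be sketched in a few lines of code — a hypothetical illustration in the spirit of the Anti-Eviction Mapping Project, not its actual code or data. Instead of scoring tenants for "risk," the same basic machinery (aggregating records and ranking) is pointed at the parties who file evictions; the landlord names and filings below are invented:

```python
from collections import Counter

# Hypothetical public eviction-filing records: (landlord, address).
# The data direction is reversed: rather than predicting which tenant
# might default, we count which landlords file the most evictions.
filings = [
    ("Acme Property Group", "12 Oak St"),
    ("Acme Property Group", "14 Oak St"),
    ("Acme Property Group", "9 Elm Ave"),
    ("B. Smith LLC", "40 Main St"),
    ("Acme Property Group", "12 Oak St"),
    ("Greenline Holdings", "7 Pine Rd"),
]

filings_per_landlord = Counter(landlord for landlord, _ in filings)

# Rank landlords by how many evictions they have filed.
for landlord, count in filings_per_landlord.most_common():
    print(f"{count:3d}  {landlord}")
```

The point of the sketch is the design choice, not the code: the exact same counting-and-ranking technique becomes a tool for tenants simply because of whose behavior it measures.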
What is the Ida B. Wells Just Data Lab? Would you tell me about that?

Sure. It's here at Princeton, in the African American Studies department, and it's an umbrella initiative that connects students, researchers, artists, and community activists in order to design just tools. Over the course of the summer, for example, we had ten teams of students working on everything — there was a housing team, a work team, a policing team — and each team collaborated with a community organization to build some kind of data justice tool that could be used to advocate for an anti-racist initiative in the context of COVID. So it's a space to create those connections, where academics aren't seen as having all the answers — we need to humble ourselves and learn from people working in communities about what's actually needed. I think the same goes for technologists. Too often the Kool-Aid of Silicon Valley is assuming they can come up with the interventions without talking to the people those interventions are supposed to help, and so part of this is creating an environment where that conversation can happen. For those who are interested, we posted all of the tools from the summer at the Just Data Lab dot com, slash tools, and you can take a look at what was developed over the last few months.

That sounds so cool. I want to ask: what do you advise for folks listening, when they're engaging with technology? What questions can they ask to improve their relationship with it and see these systems a little better?

I think what I've found in the last few years is that many more people who aren't necessarily working in the tech sector have become rightly skeptical about the promises that were so commonly marketed to us over the last twenty years or so. The average people I talk to are thinking critically, engaging these things, and basically not taking them at face value — always asking, okay, what's really happening? When it comes to the data that's collected behind the screens — behind the scenes — in all of the things we use for free: as the saying goes, if it's free, then you are the product. Your data is the product. I remember a few weeks ago, Zoom announced that for people who use their service for free, Zoom had the prerogative to sell our communications and data to law enforcement, whereas people who paid would have their data protected. For a week there, my lab decided we weren't using Zoom as a platform. But the public outrage was so loud and vigorous that within a week Zoom reversed course: okay, okay, okay, we won't sell it. That's an example of us collectively voicing what we want out of these things, rather than assuming we just have to submit, submit, submit every time we click through those forms. If we think something is not right when it comes to what my colleague Shoshana Zuboff calls surveillance capitalism, we need to speak up. This is especially true for parents right now: the more classes have moved to remote learning on learning management platforms, find out what the school your kid goes to is doing — what its data policy is for all of that time your kids spend on whatever platform it uses. And I will say that young people in particular are becoming more savvy about this. About a year or two ago, students in Brooklyn staged a walkout from their school — not over the data issue, but because they were spending all their time on a learning management platform and seeing their teachers for maybe twenty minutes a week. They said: this is not education, so we're not doing this; we're boycotting. That's another example. In both cases, you see it's about working together. It's not simply thinking of ourselves as individual consumers — okay, I'm not going to buy this product, I'll go over here; that's fine, people should do that — but the more powerful change happens when we team up, when we organize, like those students in Brooklyn or the public outrage around Zoom. More and more, we need to stop thinking of ourselves as users — because, as I say in Race After Technology, users get used — and start thinking about our relationship to technology as citizens: holding it accountable, asking what values we want embedded in these structures. Because if we say and do nothing, it will be the same old corporate surveillance values, the dominant ethos of Surveillance Valley. We have to voice our outrage when it's warranted, and we need to articulate, proactively, what kind of ecosystem we want technology to be designed in and what we want the social inputs to be — and best of all, to do it collectively. So find your local data justice organization and team up with them.

I really like that, because we're so used to seeing tech companies as being kind of outside of society. For the first twenty years they were all insurgents — these weird little companies making genuinely groundbreaking technology — and a lot of them had those "don't be evil"-type slogans, or they seemed chill, like a breath of fresh air coming through. And now these are the most massive companies in the country, the ones with the most entrenched advantages, and we need to start looking at them — I think you're right — not as users just clicking a button, but as members of society. Those companies are also part of our society, and the questions are: what sort of relationship do we want to have with them, and how much power — what kinds of power — are we all right with them having?

Exactly. And it feels like that conversation is starting to happen. Just seeing the antitrust hearings on Capitol Hill a couple of weeks ago — that would have been unimaginable five years ago. It still wasn't quite enough, but maybe we're making progress.

Absolutely. For me, it's been a dramatic shift just in the last few years — from "oh, the new iPhone is out, let's stand in line overnight" to people being much more savvy and skeptical about all the shiny things. And I love the way you described it: that image of the little outlier innovators in their garage versus now — the Silicon Six, these big companies. Not only are they the biggest entities monopolizing power and resources in this country, but many of them have networks larger than many countries in the world. Given the power they wield and monopolize, we really have to culturally put them in a different category of actor and understand the influence they're having on public life from behind private doors — their decisions have such a huge impact. We've got to completely overhaul the regulatory infrastructure and the accountability, and maybe even ask: do we want them to be that big? Do we want them to keep monopolizing, even as they fail to pay billions of dollars in taxes every year? They say they're doing all this in the name of the greater good, but they don't put their money where their slogans are by paying back into the public good. That's a basic, one-oh-one thing we need to demand of them.

One of the things that really struck me from Tim Wu's book The Curse of Bigness, about monopoly and the history of antitrust, is that when we talk about monopoly now, we mostly talk about money — they make prices higher, they have too much wealth, inequality, and so on. But the original argument against monopolies was about power: that a single company — think of Standard Oil — could have more power than the government, more power than the democracy itself, which makes it inherently anti-democratic. And we said, in America, we don't want a single person, the CEO of one company, to have that much power. That is happening again — look at Jeff Bezos and how much power he wields over so many different sectors. That's the question we need to ask: it's not just economic; it's also about power.

Exactly. And the last thing I would add is that these companies and individuals recognize that the tide is turning; they see the shift in public discourse — just look at the reaction since the Cambridge Analytica scandal and the 2016 Brexit vote and election. Part of their response — and this is something we have to stay vigilant about — is what I think of as domesticating critique. They're trying to create, in house, methods of accountability and ethics, hiring people from my field. Facebook created a board to oversee what it does, and some of my colleagues rightly called it the Facebook Supreme Court, because they're trying to build in house what really needs to be independent and third-party. They know we won't accept the status quo anymore, but we have to be careful about what their solutions look like — which will just be them creating their own mechanisms, putting up a face of ethics and public accountability. We need to say: enough with the slogans. We need something independent, something that puts the power to govern in the hands of the people, not in the hands of the companies themselves.

Yeah — we need a voice. This is a society and a democracy, and so, democratically, we should all have a say in how our data is used and who wields power on these issues.

I think that's absolutely right. Well, I can't thank you enough for coming on the show. This has been such an awesome, fun conversation.

My pleasure — it's great to talk to you, Adam. I hope that once I get another book done, you'll invite me back.

I've really learned so much from talking to you, and I know our audience has learned just as much.
Thank you so much. It was a pleasure. Take care.

Well, thank you to Ruha Benjamin for coming on the show. I really hope you enjoyed this conversation as much as I did. If you did, please leave us a rating or review wherever you subscribe — I know, I know, every podcast says that, but it really does help us. Open up Apple Podcasts, open up Spotify, open up that Google Podcasts, and give us a five-star review if you like the show. If you wanna send me a comment about what you might want to see on the show in the future, shoot an email to factually at adamconover dot net, and I will be happy to take a look. That is it for us this week on Factually. I wanna thank our producers, Dana Wickens and Sam Roudman; our engineers, Ryan Connor and Brad Morris; and Andrew W.K. for our theme song. You can find me at adamconover dot net, or at @adamconover wherever you get your social media. Thank you so much for listening. We'll see you next week on Factually.