21 Burst results for "Gebru"

Google Workers Launch Union To Press Grievances With Executives

Daily Tech Headlines

00:44 sec | 3 months ago

Google Workers Launch Union To Press Grievances With Executives

"A group of over two hundred alphabet. Workers announced the formation of the alphabet. Workers union these workers committed to setting aside one percent of yearly compensation to union dues. These include contract workers temps and vendors which make up more than half of all workers at alphabet companies. As of september thirtieth. Two thousand twenty alphabet employed one hundred and thirty two thousand one hundred twenty one people. The union will be chaired by google software. Engineers perot cole and chewy shaw in the new york. Times op-ed information of the union. Coal and shaw said google's contract with the pentagon to use a in project naven the firing of a researcher timid gebru and forced for claims of sexual harassment as alphabet management ignoring the concerns of workers in the op-ed. The union pledged live by google's former motto. Don't be evil.

Alphabet Workers Union Parul Koul Chewy Shaw Google Gebru Pentagon New York Times
"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

05:45 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"Like you said earlier it is so important that keep all women speak up when they are in his treated that they don't you know we use our voice. We also cannot deny the very real back that doing that comes with a cost. And it's interesting to me who has to bear the brunt of that cost because we know that when you speak up when you blow the whistle when you are honest about something that's happening to when you're honest about work or anywhere it it's not. It's not easy. And i think it's harder for people who are already marginalized. The expectations are so low for those privileged people and that's the understanding. Is that because the expectations are so low. They're rewarded so quickly so easily. And that's in itself such crime to me like i say that because it's the most upsetting this is where the social worker juvenile justice person been like. I'm in raised the what's happening and there's nothing you can do because it's a systemic thing. Once again we try to explain having to explain over and over again. Y you sit here and you complement the white men for doing the minimal. Meaning i trust women. God you know what i mean. And then the black women sitting out here doing the work going into the concept being a part of the abuse and constant harassment but they the conversation is the kind of ask for it because they're putting themselves in the limelight. That's the conversation you have your. What the hell is this conversation itself. We have to. We have these hurdles again. And i know this seems so simple fact that that this is the level that we're still at this point and people can't grasp i mean and it's the bigger question. What does all of this mean especially for women of color in tech what is and how do you unravel this. I'm getting really passionate. Can you tell the voice. I mean i'm i right there with your passion. I think that we really need to have some tough conversations and the kind of conversations that you are sparking right now. I think that we do need to ask. You know when someone who has more privilege steps up are they being rewarded in a different way than what someone who was less-privileged. I also think just the reality is that or a lot of women of color in all workplaces. Not just tech this common. I've definitely experienced. I've got experience what she's going through but i've experience similar things. There's this really really great comic graphic that i love that you might have seen on twitter called the quote problem woman of color in the workplace that i think describes this system that i've heard quite a bit about where a woman of color in the nonprofits base will get hired. There's going to be a brief honeymoon phase where everyone is so excited that she's there and then if she starts calling.

twitter
"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

04:30 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"For ruben as a quote good bipartisan. For him no mention of course was made about the troops and his resignation. His agreed sexual harassment. While at google interesting that you know if you if you write a paper and go through a review process that google doesn't like you are terminate your or out in this like wildly disrespect it you sexually harass your coworkers. Give you a million dollars. it's a gift. That's the double whammy once again. Performed of bullshit of them saying yes. We definitely got rid of the person we would not stand for sexual harassers in our community. But we're gonna give them less money under still like us. That's performance of justice that it's constant in these types of organizations and for those they hoped would remain silent. Boy were they wrong. They goodness they were wrong. They really thought they could just silence people and be moved. Just move on with it. And that's like the bigger picture of what she's done. She's going like no on top of the fact that the performance stuff is like. We're going to have a conference that i read this that they were trying to have a zoom conference and have people answering questions so you can all be on the same page and just be really transparent without her presence so answering for the woman they fired like that's just a whole other level of how stupid you get trying to fix things. Yeah i feel like everything they've done has been a bit of a misstep and honestly who knows her or not. It's it will be so it will be so much easier for her to be silent to not say anything not speak up and honestly kudos to her for or choosing a different path and yes so she actually addressed. Sort of exactly what you were talking about on twitter. She wrote if you talk about toxic workplace conditions. A lot of leaders will want you out if a lot of leaders want you out. They will find a way to make it happen. If you're a harasser that's not the case. A lot of leaders will be a okay with you being around. And i think that really just goes i mean. That's the name of the game. And and i think like i don't know if you remember a few back in the summer when so many different people were being pushed out of their jobs because they were. They had a history of racism or sexism. It seems like every day. We woke up somebody else's getting fired when those people reside. I do think it's important to ask follow up questions of well. They're residing but you they still get stock options. Are they look the resignation. Pag are they getting are they. What is their financial. Are you severing all financial ties with this person. And what does that look like. It's so easy to see the headline. This is racist. Boss is is fired from company but then asking that question. Well do they get like. What's our exit package. Look like you know. Do they. Do they are they. Are they getting paid out. Lots of money money. That could go to you. Know hiring more inclusive folks on your team at. What does it look like. I think really ask. Those questions is really important right..

Google Andy Rubin Twitter
"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

03:09 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"Me that that's where she had to be an the other alternative for them. Was she just leaves and has someone else to come on. And maybe it should be an obvious because it's very political in that sense of we just wanna yes person. Yeah i think. That's i think i mean i as i i'm not in my head ferociously. I've situation where. I think that it looks good to have an outspoken critic in the space joined google to have that kind of coverage say like oh well of course were worried about where we're concerned about this. Google be hired this person who is so outspoken. But then when that person starts asking questions about the organization or asking questions about google role in perpetuating things that could be harmful. I think that's where people were like. Oh well we wanted. You have made us a good. We don't want you if you're gonna make us look bad like ask hard questions and over google. Like their side of the story jeff dean. He shared google side of the story and he said that at jeb. Ruin her colleagues. Only google a day to do an internal review of the paper before they were going to submit it to a conference publication so he wrote quote. Our aim is to rival. Peer reviewed journals in terms of the rigor and thought when this and how we review research before publication however william fitzgerald former google. Pr manager said. No no no no. That's not don't buy this on twitter. He said this is such a lie. It was part of my job on google. Pr team to review these papers. Typically we got so many that we didn't even review them in time or researcher published and he wouldn't know until afterward we all caps never punished people for not doing proper process so it does seem like something is going on both of these. Things can't be true. It can't be true that the only issue was that we were concerned about the review process. We didn't have enough time to review. It therefore weren't able to have them publish it and that google never requires this kind of you know this kind of a review of me and it just really makes me ask was she. Was her paper getting a different kind of scrutiny. And the other kinds of paper that google was putting out right and and it's particularly frustrating too because we have heard of this environment at google of sexual harassment and toxicity on top of all this right. Oh yeah let me. Just be really clear. Google has had a very big problem with sexual harassment sexual misconduct and something even more disgusting is that sometimes they actually in their own words gifts. The harassers huge sums of money last year or the twenty thousand google employees around. The world walked out of the company's office to protest. The fact that google paid out over one hundred million dollars a multiple executives accused of sexual harassment in the workplace. Google pay total of one hundred and five million dollars to andy rubin and emit signal after they were accused of sexual harassment at the company. So this is from tech crunch..

Google Jeff Dean William Fitzgerald Twitter Andy Rubin
"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

04:50 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"You talk about. Why exactly she was fired from google to begin or in their words. She resigned which is untrue. But why this happened exactly exactly so a little bit of background on wednesday december. The second she was then. The co lead of google's ethical team. She announced on twitter that google and forced her out and it was a conflict over a a a different paper than the one that we were just discussing thousand and eighteen and that she had co authored at at google. So jeff dean. The head of google told colleagues in an internal memo which has since been put online at the paper quote. Didn't meet our bar for publication. And that jacob ru had said that she would resign unless google bet a number of conditions which google was unwilling to meet. She said that she asked to negotiate like an exit date or last date or her employment which she got back from vacation but during that time she was cut off from her email. And so you might be asking like what exactly could have been in this paper. That was so horrible that google had to fire her again. I'm going to pretend that. I'm smart enough to summarize paper itself however thankfully the actual smart people who know what they're talking about how technology review obtained a copy and summarized it so they got this copy from one of the papers. Co-authors emily m bender. Who was a professor of computational linguistics the university of washington and there's a great summary going technology review dot com. You can find it a little bit from. What they said was abor so one. The paper was getting into a is negative impacts on things like climate change so they they point out that climate change as we know a big burden of climate change is faced by marginalized underrepresented people and so if you are amassing tons and tons and tons of money to study i you could actually be having a negative impact on people who are already negatively impacted by climate change. Another was they were really concerned about is use of racist and sexist language bias. This is a little bit like in the weeds. But essentially if you you use if you base on our already existing lang like racist or sexist or bias language that language can be sort of encoded into your a and so you know. I don't think anybody really. I don't think anybody working. At google sets out to create technology that pushes racist and sexist ideology or bias. But we have implicit bias. That you don't we don't always know where our bias is our and our where things that we don't we can't really see are something i might read might beco- that's fine but somebody else. A different background. Read that and say oh actually that sexist or racist or problematic or xyz reason and so when you have teams that aren't able to look out for this kinds of things because they're not as inclusive if they should be. It's a problem right. So one of the co-authors study when asked what the goal of the paper was. She says we are working at scale where the people building that things can actually get their arms around the data and because the upsides are so obvious. it's particularly important to step back and ask ourselves. What are the possible downsides. How do we get the benefits of this while mitigating the risk and so it sounds like to me..

Google Jeff Dean Gebru Emily M. Bender University of Washington Twitter
"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

01:46 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"But first we have a quick break for word from our sponsor this episode of stuff. I never told you is brought to you by infiniti. Make your car buying experience as convenient as it is luxurious during the infinity winner sales event with infinity. Now infinity has rethought every step you take to get into a vehicle so each aspect.

"gebru" Discussed on Stuff Mom Never Told You

Stuff Mom Never Told You

04:16 min | 3 months ago

"gebru" Discussed on Stuff Mom Never Told You

"By our good friend bridget. Todd thanks for being with us. Bridget always a pleasure. Thank you for having me back and happy days by the way happy holidays to you. Youtube are y'all got to get into some holiday fun. I know what i think. So i cannot when i get presence and and you can attest to this like when i received the presence that i'm about to give people i almost can't wait so like i've been kind of slowly handing out gifts even those not really christmas time. I did this for my posts carrier. And i was like oh chris two weeks away but i went ahead and gave it to was like. I'm a little eager. So yes that is my holiday. Shenanigans were already in the holiday season. you know. i think. Get this acceptable anytime. Yeah that's fair especially now right. Maybe this will help before. But yeah i realized yesterday i was doing was like this thought who is away from christmas. I probably should slow down. See i'm like i'm normally much more holiday spirit. I i don't know if it's even that. But i guess because i go out more. I'm doing more stuff and now like oh yeah i guess i should be doing this holiday stuff and i'm in my apartment so it's like i don't know what the holdup is. I put my little tree right you. Bridget have you done your holiday. Starting traditions man. I am i have to say i. M look a little bit of a grinch historically not big into the holiday. Is this time around. I don't know what. This is the first time i've ever it's weird. I have always hated celebrating the holidays. And what a as it creeps in. And you know after halloween when they start the christmas music. I'm rolling my eyes at the time it had been like i want. I want things to be a little bit. I've already started watching lifetime movies where you know. A busy city woman beats a hometown holiday..

Bridget Todd
"gebru" Discussed on Daily Tech News Show

Daily Tech News Show

04:59 min | 4 months ago

"gebru" Discussed on Daily Tech News Show

"Twenty-one ultra phones. The twenty one is shown in light purple with a centered hole punch camera and minimal front basil. The ultra was shown with a curved screen and quad camera system. Samsung unpacked is expected to announce these phones sometime in mid january cruise automation. The autonomous vehicles subsidiary of gm is now testing driverless vehicles on public roads in san francisco california with the goal to secure permit to launch a commercial service crew says its test will eventually expand its driverless testing area adding more complicated environments over time and eventually also removed that safety operator from the vehicle. Google ceo sundar. Pichai sent an email to google staff wednesday about the departure of ethics researcher. Tim nick gebru. Pichai wrote quote. We need to accept responsibility for the fact that a prominent black female leader with immense talent left google unhappily. He said they should review the circumstances of her departure and examine where they could have improved it and lead to a more respectful process on twitter. Gebru points out that the email did not say google. Sorry for what they did to her and calls on google to take responsibility uber announced it will sell. Its uber. elevate flying taxi business to jobe aviation jobs developing all electric vertical takeoff and landing vehicles. Job will use uber's app to offer air taxi rides as soon as twenty twenty. Three and uber will invest seventy five million dollars in jobe in addition to the fifty million that it invested earlier this year. Yes so if you want our analysis on this story go earlier in the week. Find the one about uber. Selling their autonomous driving cars unit to aurora substitute. Jobe for aurora. It's kind of the same story all right. Let's talk a little more about this. Big lawsuits got all right. The us federal trade commission filed a lawsuit on wednesday allegedly that facebook illegally maintains quote. It's personal social networking monopoly though Excuse me through a years long course of competitive conduct. That's a direct quote. Evidence was gathered in cooperation with the attorneys general of forty six states. The district of columbia and guam specifically called out as anticompetitive practices or as those practices are the acquisition of instagram and whatsapp an api restrictions on software developers Ftc is asking the court to order facebook to sell off. What's app and instagram. Cheese prohibit anti-competitive conditions envelope burs and require notice and prior approval.

Pichai Google Tim nick gebru Gebru jobe uber sundar Samsung gm san francisco aurora california Jobe twitter Ftc facebook guam instagram district of columbia
"gebru" Discussed on Daily Tech News Show

Daily Tech News Show

03:29 min | 4 months ago

"gebru" Discussed on Daily Tech News Show

"Twenty-one ultra phones. The twenty one is shown in light purple with a centered hole punch camera and minimal front basil. The ultra was shown with a curved screen and quad camera system. Samsung unpacked is expected to announce these phones sometime in mid january cruise automation. The autonomous vehicles subsidiary of gm is now testing driverless vehicles on public roads in san francisco california with the goal to secure permit to launch a commercial service crew says its test will eventually expand its driverless testing area adding more complicated environments over time and eventually also removed that safety operator from the vehicle. Google ceo sundar. Pichai sent an email to google staff wednesday about the departure of ethics researcher. Tim nick gebru. Pichai wrote quote. We need to accept responsibility for the fact that a prominent black female leader with immense talent left google unhappily. He said they should review the circumstances of her departure and examine where they could have improved it and lead to a more respectful process on twitter. Gebru points out that the email did not say google. Sorry for what they did to her and calls on google to take responsibility uber announced it will sell. Its uber. elevate flying taxi business to jobe aviation jobs developing all electric vertical takeoff and landing vehicles. Job will use uber's app to offer air taxi rides as soon as twenty twenty. Three and uber will invest seventy five million dollars in jobe in addition to the fifty million that it invested earlier this year. Yes so if you want our analysis on this story go earlier in the week. Find the one about uber. Selling their autonomous driving cars unit to aurora substitute. Jobe for aurora. It's kind of the same story all right. Let's talk a little more about this. Big lawsuits got all right. The us federal trade commission filed a lawsuit on wednesday allegedly that facebook illegally maintains quote. It's personal social networking monopoly though Excuse me through a years. Long course of anticompetitive conduct. That's a direct quote. Evidence was gathered in cooperation with the attorneys general of forty six states. The district of columbia and guam specifically called out as anticompetitive practices or as those practices are the acquisition of instagram and whatsapp an api restrictions on software developers Ftc is asking the court to order facebook to sell off. What's app and instagram. Cheese prohibit anti-competitive conditions envelope burs and require notice and prior approval of future mergers and acquisitions similar lawsuit was filed by new york attorney. General lati- lead Excuse me latifah james. I believe i said joined by the same states and territories. That is a hardcore lawsuit. Tom merritt gigantic first. Let's just say upfront. The chances of them actually getting a court even if this went all the way to the supreme court to agree to make facebook divest itself of instagram. And what's up very low not impossible but it's low so you know. Don't don't get too excited if you if you want instagram and whatsapp to be spun out. It's doubtful that will happen What they'll probably is some kind of reparations. Kind of fine. Maybe a change in business practices. I imagine they could have a good chance of getting facebook to have to agree to the notice and approval of future mergers and acquisitions..

Sundar Pichai Google Timnit Gebru Joby Aviation Uber Aurora FTC Instagram Samsung GM San Francisco Facebook California Letitia James Twitter Tom Merritt
"gebru" Discussed on Reset

Reset

08:01 min | 4 months ago

"gebru" Discussed on Reset

"News broke last week that google allegedly pushed out one of their high profile employees and ai. Ethics researcher named tim need gabriel. She's a well respected pioneer in her field and one of the few black women leaders in the industry. Here to unpack this controversy and explain why so. Many people are so upset. In the tech community assuring ghaffari who reported the story i history. Hey tony so what's the big deal. Why is everybody up in arms about this. Google employees. That recently resigned. So the google employee in question or former google employees is really well respected leader in the field of artificial intelligence specifically about whether artificial intelligence which is increasingly kind of the future of companies like google and facebook great whether the those systems are actually biased toward Minorities women and whether it's really fair what this. Ai is her spitting out at us and told me like what happened. In the weeks before gabriel's departure that has made this such a kind of rallying moment for diversity advocates and people in the field. So here's what we know about. What led to this moment. I we know that. Gabriel and her colleagues plan to present a research paper outside of google at an academic conference about artificial intelligence natural language processing and that means kind of ai. That's used to understand how people speak both writing and audio. They had this paper in the works. It was set to publish at this conference as a procedure thing. They sent it to google higher ups for approval and google denied that and now google's management says it it's because the research was an up to their standards but gebru contested that and set an internal email. Too many of google colleagues at the company was silencing the most fundamental way possible in her words and that more broadly as one of the few black women. who's a leader in the. I team that your life gets worse when you start advocating for underrepresented people at google. Then her senior manager director. Google jeff said that he came out with his own kind of email internally which has now been published online saying that gabriel in his view. Push yourself out the door because she allegedly gave them a set of conditions and google didn't meet those conditions that abe relief so it's sort of at this point A back and forth. There's some debate about exactly what those final conversations looked like leading to gabor's departure. But it all stems back to whether or not this controversial research was allowed to be published outside of google. All right like what's the big deal. Why did google care so much about this. One researcher publishing this one thing so dean in his no and dina's they. I leader to employees sad that the research didn't represent enough fully the consensus in the academic community around bias. In this kind of ai. They were looking at and that there are actually ways to mitigate the kinds of harm the could result from from ai systems and that the paper didn't acknowledge this well enough without actually reading the paper. It's really hard to make an outside assessment of that but there have been many in the academic community that are sort of questioning this and wondering whether google's arbitrarily enforcing its rules here. Yeah i mean this on earth this fierce backlash to google. Tell me what the response has been like over the last week as more and more of this come out. I mean. 
I've never seen this level of backlash amongst kind of academic leaders over the departure of a single person at google. There's certainly been employees activism at google in the past and there have been really controversial firings including just in the days before this news actually issued a complaint saying that google was wrong to fire to employees who are organizing around other kind of hot button issues at the company so this is not totally new to google have controversy within its workforce but just the breath of support that gabriel is getting from her peers. Not just a google but in the academic community is huge petition to support her already has over twenty one hundred signatures from google employees including many many people on her specific research team and then also over three thousand academics nonprofit leaders and industry peers as of december eighth. Gotcha so i guess the question left wondering is why right. I mean this is one google employees. Why do you think this has touched such a nerve in silicon valley. I think because all of these major tech companies. Google facebook microsoft amazon are all saying that they wanna make sure that the technologies of the future do not negatively impact minorities do not negatively impact. Women that they are fair to everyone regardless of what you look like but one of the leading researchers. who's kind of checking and investigating matt. Who lent a of credibility to google's research department because she was an active voice on these issues is now leaving under these questionable circumstances. So i think for everyone the field. Who's supporting her. They're doing so because they're worried. That a herb being allegedly pushed out could have a chilling effect on these academics. Within major tech companies grew are publishing research that can sometimes potentially be at odds with the business needs of these companies. So there's sort of a fewer that this is a harbinger of things to come. Yeah exac blaming one using berkeley computer. Science professors specializes in. This just said that he doesn't know what the place is moving forward for people in in this field of research companies. If there's going to be this kind of alleged suppression of research and again contests this. But really i think people academics trust in. Google is being tested with this. You mentioned google has a history of this stuff. You know there have been a lot of that. Company has retaliated pushing out employees. Who speak up about ethical issues at google sexual harassment or censorship. Issues when you zoom in on google which is in this sort of sensitive position right now where they have this. Famously political workforce that google has now trying to kind of rain back in right. Tell me like what this tells us about google. Just that these problems aren't going away. I mean actually. The past year had been relatively quiet for google around employee activism. If you think back to two years ago you had the google walkout which was like twenty thousand google employees denouncing sexual harassment at the company but in the past year actually google's biggest criticism had been from outside from the government from antitrust regulators. Google's internal workforce thinks that somewhat calmed down. But i think with this. We're seeing a revival of some of that anger and controversy at the company about internally. How they're managing things shrink affari. 
Thanks so much for covering that and for coming on you can read more of her work at recode dot net. Thanks study looking at a new team before the new year. Give yourself a gift of the perfect. Hire with lincoln jobs. Who wants to give you fifty dollars to help. Find the right candidate for your job. Opening linked in jobs matches your open role with qualified candidates with help from target screening questions to identify. awesome potential. Hires fast visit lincoln dot com slash rico daily to get fifty dollars off your first job post that's linked in dot com slash recode daily for fifty dollars off your first job. Post terms and conditions apply..

Google Timnit Gebru Shirin Ghaffary AI Facebook Jeff Dean Amazon Microsoft LinkedIn
"gebru" Discussed on This Week in Tech

This Week in Tech

09:03 min | 4 months ago

"gebru" Discussed on This Week in Tech

"You gotta it's gotta be cool. Don't you see sleep better when it's cool better you. Actually you fall asleep. Fine if you're warm you won't stay asleep really well. Body temperature drops. It's good to have it like between seventeen and nineteen and a lotta people. If find that it's really hot. They're not gonna be able to sleep or they sweat and then they're waking up in. Your blankets are off and on and they're really comfortable mattresses. We have one for my dog. Yeah so it came with a waffle came with a waffle. Nice little off chew toy. Anxiety videos There are videos on everything including Getting a better night's sleep right is asleep in here. Sleep says conflict shares the little baby one sleep only. That's the baby at the sleep you've always dreamt of. Oh yeah and the nice thing is that it's like our whenever we deal with people that are that have insomnia. They come in and they can't sleep we. These are the things that i would say in my session to to help them get their on tracking track and at the stuff that so we kind of you know sleep is important and we spent like you know thirty percent of our life on a mattress or depending on your sleep habits maybe more maybe levels all sleeping but yeah yeah i agree you well actually. That's the big problem for me. A lot of time spent on the mattress. I'm not sleeping. And i wish i were sleeping. So wide awake looking at the ceiling. no fun you and renee richie. Doing a great new podcast called talk. Wanna get everybody. Subscribe to that because brenes got the brains but you've got the heart. It's a lot of fun. We haven't done a podcast in a really long time together and so we're really excited. Can i give a little plug to my friend. Who just started a new service for people that want to kind of get live with technology but not be engulfed with technology. It's called navigate. It's christina krook with navigate. And it's this wonderful site. I think that it's it's navigate christina krook and It's just this wonderful site that you can subscribe. And she sends things to help. You live a more peaceful happy life and not be kind of sucked into technology but enjoy it along with still connecting and. I think that we really do need people to be able to connect with each other. So i'm really excited for her. So check that out. And you don't have to ditch your phone okay. no no. it's how you can live with both. You can still watch this. Podcast should also have a more full rich life with sympathetic again up in the morning and the first thing i have to do is go to my animal crossing island and make sure there's no weeds and find all the fossils and shake all the trees and then an hour later i get out of bed. I bring my laptop to the breakfast table. And i sit down and look for news for i. Don't i'm with in front of a screen all the free time so someone to get that for you subscribe balancing. Did you play it ever. I did play animal crossing. Yeah on my favorite thing was just getting the different animals does. That was everything. I could get sucked into that. I don't know what it is. It's the simpsons tapped out before that it was we rule before that was far. There's something about arming. It's so it's so peaceful today to us to want to hunt gather accumulate. Yeah yeah that's the people that did that. That are survived right. Anson grasshoppers you get. Make good money if you raise pumpkins in animal crossing so basically took over the whole island with pumpkins. it's not good for the residents teaches a little bit about capitalism. 
It makes you understand the machine Yeah screw the residents growing pumpkins right pretty soon. He'll by the pumpkins and then you'll be sorry all right. Let's talk. I wanna actually. This is a big story. And i'm sure that jason 'cause you cover media With a lot of the shows is something you've been kind of aware of. Let's go back in time. The trump administration fought it for a couple of years. But at and t. One two by warner media and boy. This was a big acquisition more than one hundred billion dollars and they finally got through The trump administration tried to slow it down. Eventually judge was convinced by a t and t that nothing bad will happen. So this is this is following. Comcast buying nbc universal. This i mean this has been what's happened is that viacom buying. Cbs as these channels the distributors of delivery companies the streamers by content up and. They're just all vying. Eventually everything will be owned by verizon. At and t. Comcast the big. Isp's so they bought warner brothers. They got hbo. They got turner. They got all sorts of things. Like dc comics Which by the way. The first thing they did was kill mad magazine. Are you gonna. It's such a bummer. Just makes me mad but such now Then the next thing that happens. Is that richard pepper. Who's in charge of. Hbo and actually considered widely considered to be a brilliant movie. Executive gets forced out by. At and t.'s Eventually became ceo. John stinky and i remember a couple of years ago when stank he said and i was i felt a little chill my spine. Hbo needs to be more like net flicks. That's in effect. What's happened yeah. I loved him in the little rascals by the way staying stinky. Here's my favorite the hair. That stuck up right. Yeah everybody is afraid of netflix's power right. Everybody entertainment entertainment industry. And they all want their own netflix's they won't want to be their own networks and everybody took a different path to it so like disney took their path to it with disney plus and what they wanted to do is warn media driven by. At and t. They wanted to create something where they could pour all of the resources into their own premium streaming service and so they they took. Hbo which was probably their best property. Honestly their best brand but they said we can't be. Hbo anymore you need to be net flicks and that led to an exodus of hbo executives. Out apple player. The plan is doing his stuff for apple tv. Plus yeah actually. I would say maybe the streaming service most like classic. Hbo and that they're focused on originals. And meanwhile hbo max now like they are trying to be net flicks and jason killer. Who is a guy who was the founder of hulu basically is running it and you know they're playing a different game and now with the stress whom were about to talk about. They've got some content. That's really interesting in may actually motivate people to sign up for the service which apparently so far most of the people who actually are eligible to get hbo. Max have just bothered to sign up for it. So in classic antitrust theory the way it works is a company becomes bigger and bigger pushes out all the competition and one way they do. It is by undercutting competition selling below cost so forth until the competition is gone. They control everything and then they raise prices Through the roof. I think this is exactly what's happening. Warner brothers has announced that starting in two thousand and twenty one. I don't. I don't think it's going to end. 
But they're gonna release all their new movies both on. Hbo max and in theaters they'll be on hbo. Max for a period of a month. They gave the theater. Companies one hour notice of this and distributors like amc and regal. Who are this close to going bankrupt anyway. Thanks to covid. Nineteen are freaked out. It is going to be a huge threat to the future of motion picture theaters. Yeah for sure for sure. It's starting christmas day right because it's with wonder woman wonder woman before and then going on and there'll be a one month window where those will be playing not just in theaters on an hbo. Max and then after a month it will go to online sales and rental links. Not for twenty bucks free on. Hbo max yes and if you by the way if you get hbo with your cable company you probably can get hbo maximum free. You just sign up. Yeah so so about a lot of people have if your movie theater.

Christina Crook Trump administration Rene Ritchie HBO Warner Media Comcast Richard Plepler insomnia John Stankey Warner Brothers Netflix Viacom Jason Kilar Disney NBC
"gebru" Discussed on This Week in Tech

This Week in Tech

07:38 min | 4 months ago

"gebru" Discussed on This Week in Tech

"Email today. Barracuda's total email protection. Ninety one percent of all cyber-attacks start in email. That is the number one vector forgetting spearfish and getting account takeover and conversation hijacking and ransomware. We see the stories again and again You know steve gibson talks about it all the time you gotta know what's in your email and that's barracuda's so good at remember you've got now employees working at home where they don't they can't say hey. Does this look right as email. did you just send me an email boss They don't have the software protection while they don't if they don't have barracuda total email protection this The barracuda researchers since january of noticed a spike of hundred sixty seven percent in corona virus related. Spearfishing the spear fishers pose as the world health organization. They promise information they They give you download a pdf. Of course it's not that it's It's ransomware get the protection you need for your company with barracuda's total email protection. It's an all in one email security backup and archiving tool Gives you ai. Based protection from spearfishing account takeover and business email compromise. You need that because You know the bad guys are not sitting still. They're constantly refining their attacks. Unique protection that is constantly evolving to take care of that. You also get an automated incident response that helps you quickly and efficiently address a text. We know the faster you respond to an a malware attack the less damage it does. They'll even give you security awareness training for your employees because they are after all the first line of defense against attack. Now here's the key you can get a free email threat scan right now if your office. Three sixty five account and no risk. No pressure all you have to do is go to barracuda dot com slash twit. I know there's some bosses who said i don't want to know i just don't wanna know but really gotta know i hate. I know it's frustrating scary. It's bad out there. But you gotta know barracuda dot com slash twit uncover the threats hiding in your inbox and then after you get that secure free skin. I think you really ought to consider barracuda total email protection. Barracuda is your journey secured. Protect yourself with barracuda. So i apologize for the beeping. I furthermore apologize. Seth for choosing you the beeping. I'd never heard that sound before and i'm just. I'm just offended that. You accused me of living in brooklyn all right. Tell me i'm wrong. Doesn't he look like he belongs in brooklyn you could you look like you know you look says there were crickets there. Man crickets well. They're all there too polite polite home. We're thinking yeah brooklyn for sure. Yeah definitely So there is labor strife plenty at google. The r b is now accusing google of surveilling its employees and other labor violations. They investigated the firing of several employees in november a year ago. Not this past of ever and they say. The google violated parts of the national labor relations. Act by surveilling employees and generally interfering with restraining and coercing employees in the exercise of their rights guaranteed by the national labor relations. Act you have the right by the way. If you're my employees stop listening for second. You have the right to form. Join or assist a union or engage in other protected concerted activities. Don't you dear no. I don't care the complaint makes it clear. 
Workers have the right to speak to issues of not merely unionizing but ethical business And that's a big deal on the on the heels of that google fired perhaps the best known black expert on ethics and a ai. Timid gebru She was fired after she complained that her research had been suppressed by google. Now more than fifteen hundred researchers and twelve hundred google employees have signed a petition protesting her firing I this is always a challenge for me to way into these. This story from google is sh. She said 'i will resign if you continue to suppress this or at least not. Tell me why. You're suppressing this research and then google said so she resigned. She says not resign. I was fired. The dispute arose last month when a senior manager. Google told her that she would have either to retract or remove her name from a paper she had co authored by researchers inside and outside google saying and this is the reason by the way google hired her in the first place saying that technology companies could do more to ensure ai systems do not exacerbate historic gender bias She says i feel like we were censored. And i thought this had implications for all ethical. Ai research you're not going to have this was talking to wire. Do not going to have papers that make the company. Happy all the time and don't point out problems that's antithetical to what it means to be that kinda researcher. That's why she was brought in to help google with ethics. She's technical co-lead of google's ethical. Ai team so google. A lot of labor issues Right now suggests leo just to be clear. Legally if you say you're going to resign at some point in the future and then you're or you're thinking about quitting or something like that and they terminate you immediately. You got fired. She fired her and although her boss's boss's boss can say oh no we we accept her resignation. It's not true resignation. Yeah it's not a resignation. She was saying it was also like a. If this is true that you're going to do this. We're going to need to plan for me to leave because this is an unacceptable choice that you made and they're like you're fired And then they claim that it's a resignation but it's not also their defense of what they didn't like about. The paper is amazing. Because it's basically like it didn't have enough sources it which had had one hundred sources But it seems very clear. That what what google didn't like about it is. They wanted to insert some things that maybe softened it or made. It seem a little bit Easier they basically wanted to use some. Pr spin on an academic paper. Which is not. I think it really makes. Everybody should stop and say why are big. Companies like this Hiring people to do research like this because clearly they don't want independent research and independent thought. They want something that'll back. Whatever their corporate desire is. That's the message that this firing sense i think. And that's the more serious issue Because you know it means. She was a figurehead. They hired her to make it look like they cared but the minute she did anything that challenged the status quo. It was like sia..

google steve gibson Barracuda ai brooklyn barracuda gebru world health organization Seth Ai leo
"gebru" Discussed on This Week in Tech

This Week in Tech

08:42 min | 4 months ago

"gebru" Discussed on This Week in Tech

"Was he doing in shock. Anyway i thank you all for being here and for staying healthy and being well and A we i think more and more we're going to talk about over the next six months is how the world has changed for instance I don't think before covid. Nineteen slack was worth twenty. Seven point seven billion dollars. I think you could even say there may not be worth that now They lose money every quarter. They have revenue of two hundred thirty four million dollars which is up but not up nearly as much as teams and zoom. They're kind of the laggard in the co-founded thing. We surprised jason. When you saw that salesforce by i wasn't And i think the value. I mean i don't know about about billions on. What do i know. Twenty billion thousand eight. Yeah but you know. I think this is. I was excited about it. Because i do use slack. I use it a lot. I i've been out on my own. You know even pre covid garage for six years. The way i collaborate with people is mostly through slack and it. I know that they've been attacked by microsoft because microsoft has such leverage. If you're an office three sixty five you can just or three sixty. I always get those mixed up degree sixty five days of the year. Okay so three sixty five and a quarter technically but anyway microsoft you. Don't why spend money on slack if you if you already have that even if microsoft teams isn't as good it's good enough right eating slack handily in terms ups because this is part of the deal right. A lot of people aren't microsoft three sixty five. Why would you pay for slack under those circumstances. So i think what's good about. This is that maybe it frees slack up to focus more on their product. Because you're right. They have gotten lapped by. Not only microsoft caught up with them or at least gotten close enough with them. But you look at some of the other stuff that they do i i. I talked to people who've tried to do Audio conferences in slack. It's not very good. Like they kind of missed the zoom opportunity to be the place where everybody gathered for video in their workplaces. So you know for salesforce i. It's an interesting purchase but slack is important and i think it adds another Another chapter to salesforce a story talking to their clients and hopefully it frees the slack. People up to improve their product because that product does feel like it's sort of stalled and hasn't really done a whole lot in the last three or four years in a way. It's better to be a smaller money loser. Because no i mean if microsoft or somebody wanted zoom its market cap is one hundred fifteen billion. It's a little bit bigger of a chunk to bite off as much as twenty seven point. Seven billion Sounds interesting Article by casey newton in his new Website platformer remember. He left the verge to start this kind of like you did jason to start his own solo blogging venture He points out that slack was the poster child for worker centred work tools even more tools that the workers brought into the company used in small groups and then eventually it spread to the corporate culture. And and that's how slack got in the door because people loved it but but now we're kind of in a world where the company decides and the is a microsoft company so there were google company. So they're just gonna use the tool that they're already buying in effect and that's going to be hard for slack. One of the reasons slack had to sell to somebody with a salesforce. if you will I do think though that one thing about this. 
Acquisition of people say well the free tier of slack. And there's so many of us who are on free slack. Instances is going to go away. Salesforce is gonna to to make money and you know a lot of buyers of companies do stupid things with the business model of the company they buy and so they might do it. But i feel like that. Free tier is slacks. Best sales pitch right that slack is getting people to like slack and then go to their bosses and say you need to buy slack so i hope they keep that around But yeah i also imagine. This is an opportunity for salesforce to start selling slack in to all the organizations where salesforce is already playing. And they can if anybody can take it to microsoft. It's probably sales. They must feel that way otherwise they wouldn't have spent what is a considerable premium on what slacks Value was do. You don't use slack. Seth or to you. I don't know i. I mean i've i've been forced to for projects that i've worked on and whatever i mean i think that if you know if sales force winds up so that it works fine when you have more than four chrome tabs open. would be enormous and electron app. That's problem it's who we really is. i mean i like you know. I don't. i feel very weirdly about slack. Because i have enough distractions throughout my day. I don't really need to use it to communicate with a lot of people. I'm not in an office environment But i also feel like People communicate about things on slack that they feel very comfortable talking about in person which is great for for slack. It's amazing that they created a product. That people are doing that with. It's just really unfortunate that a lot of those things are going to be either worked sensitive or personally sensitive and that slack is probably not a really good platform for that. You don't want to be talking about super six information In secure is it considered secure a think it depends on which instance you're using and I last time. I looked i could be wrong about this. But the last i looked slack was not Encrypted which means that those messages Are simply you know available either to your employer Or possibly you know Available to be subpoenaed added if you're not in a environment so the there's all kinds of things and even think of that. Yeah yeah there's all kinds of things that i just. I personally am a little twitchy about. If you'll forgive the reference you want to you want to not commit to email or slack anything in writing that you don't wanna be subpoenaed because it lives forever and i didn't even. I didn't even consider that if he ever. Pgp key in email. That's fine bill. But but certainly i. I advise people who want to talk with me in a slack instance about something that sensitive that we go take the conversation to any other platform. I mean you know like even facebook messenger has its secret conversations which are intended scripted There's just so many more options out there right now and probably first slacks developers. They feel that there are so many other options. It's not really something that's worth worrying about. i'm just thinking about the most famous slack security flaw and it really wasn't slacks fault. It was social engineering of twitter. Engineers that that's how all of those twitter accounts got co opted including Joe biden's because apparently slack it rather twitter posted credentials for its god mode in slack. Which was a big mistake and but it was but i as i remember and i followed it since but i think it was social engineering right that. 
Got them into this slack. Commune yeah and they found the keys there but you know it's a real. I think that's a really important point of all the things that you must be doing in slack perhaps pinning super credentials. Not good is is is less brilliant. Can you do Shows how people don't really know what is safe. And what isn't safe and what they should put into water that people you know. They're your admin can read your. I don't think that people by and large are taught. Y you know this should be this. Media savvy should be taught in school so that people are thinking. Most people don't know what end to end. Clinton corruption is people. Don't know that when they send an email to someone that someone else might be able to read that and have full access to whatever's happening. And so i think that it's it's not just people at largest fault boy. You would sure hope that twitter engineers would have sunsets of oc sex sacked but georgia. Even they probably don't know how to assess the risks. I mean clearly pinning credentials in a slack channel is not safe but maybe they thought it was. Is that your experience..

Microsoft Salesforce Casey Newton Jason Seth Google Twitter Joe Biden Facebook Georgia
U.S. Labor Board accuses Google of spying on employees, discouraging worker organization, and retaliation

Daily Tech News Show

02:39 min | 4 months ago

U.S. Labor Board accuses Google of spying on employees, discouraging worker organization, and retaliation

"On wednesday. The us national label labor relations investigated the termination of several employees in november. Twenty nineteen as a result has issued a complaint alleging. Google violated the national labor relations act by surveilling employees and interfering restraining or coercing employees who tried to exercise rights under section seven. Google is accused of discouraging employees from forming joining or assisting a union this all centers around lawrence burland and catherine spires they filed a complaint with the rb claiming that they were fired for organizing around treatment of temporary vendor and contract workers as well as retaliation against workers protesting google's work with customs and border patrol in november twenty nine thousand nine berlin and rebecca rivers were placed on leave for allegedly sharon confidential documents not pertinent to their job a protest in support of the to lead. You rivers burland. Paul duke and sophie waldman being fired. The complaint will be evaluated by an administrative judge after which the nlrb will decide whether to prosecute google and pursue reinstatement and damages. But on the same day. Google fired it's co leader of ethical artificial intelligence. Tim knit gebru. Gebru sent an email that She said laid out to conditions. Those conditions have not been made known which if met would lead her taking her name off a paper if not she would work on a last date for employment gab. Bruce says in response one of her bosses a reports replied quote. We cannot agree to number one and number two is you are requesting. We request your decision to leave google as a result and we are accepting your resignation. However we believe the end of your employment should happen faster than your email reflects because certain aspects of the email you sent last night to non management employees in the brain group a behavior that is inconsistent with the expectations of a google manager unquote. That email was sent to google. Brain women and allies in email group for company researchers gabar studies bias and facial recognition among other things and is an alumni of the stanford artificial intelligence laboratory she worked on a landmark study in two thousand eighteen that showed facial recognition miss any dark skinned women thirty five percent of the time while working well for light skinned men

Google Lawrence Burland Catherine Spires Customs And Border Patrol Rebecca Rivers Burland Paul Duke Sophie Waldman Tim Knit Gebru Gebru Nlrb Berlin Sharon United States Bruce Stanford Artificial Intelligen
Big tech companies back away from selling facial recognition to police

The Vergecast

05:49 min | 11 months ago

Big tech companies back away from selling facial recognition to police

"Big companies are saying they're not gonNA make facial recognition technology so just the quick rundown IBM announced it's no longer offer, develop or research facial recognition technology. There's some important caveats that Amazon has banned the police from using its. Facial recognition system called recognition for the next year. Which is the other caveat? Internally. Microsoft, which is sort of talked about this? A lot has said we will not sell facial recognition to the police. Insult Congress Passes Privacy Law. Another caveat they're all caveated announcements, but. Just before we started recording, I was saying you do not see a technology at this stage of development. Halted in any way because of social concerns, so walk us through what's happening? Obviously in the context of black lives matter the protests, but walk us through what is going on anouncements and how they work? Yeah, I mean so I. Think something that I definitely want to highlight up. Top is the history of criticism of these systems which goes back years, and there's a few researchers particularly enjoyable Weenie Timmy Gebru who, in two thousand, eighteen published a paper called gender shades. which they really sort of provided the first comprehensive empirical evidence that there are racial and gender bias with these systems, so that's been alone of criticism that's been picked up. Push forward by people like the ACLU. Now obviously with protests across America and greater scrutiny on. Police and law enforcement in January that's now pushed out to the four, but these criticisms go back years, so it's really important that they're now. Being taught about an action is happening as you said, there's been three announcements IBM on Monday Amazon Wednesday, Marcus often Thursday, but each of them deserves different caveats. Amazon Microsoft announcements obviously goes together because they're going for a twelve month. Ban and IBM's goes in a different slot. Because they're saying we're going to stop doing this. Totally however I would say. I've been desperately trying to get IBM to answer some specific questions. About the scope of this band and they have been very unhelpful. You both will have experienced this before when you go. Okay, so I have a little question here, and just like a yes, or no answer. And how do you feel about giving that? Go? James We've written a blog post. Just like to read the blog posts. Donald explain everything and. Post but I had some questions about the blog post and they will not. Have you really read the? Anyway. For a company who who make software that claims to like understand the of human knowledge, and is able to like actually literally debate you. The inability to engage in a discussion about its products is fascinating. Maybe you're actually talking to their debate. Ai and not to real human. Oh, my God yeah, it turns out. Watson sucks. It only generates blog posts with ambiguous claims. And cannot answer a follow up so James Real quick just give us a brief intro. How does facial recognition work so facial recognition is based on machine learning. which is sort of technology that looks for patterns in large data sets, and then tries to predict or find those patterns elsewhere. 
In this case, the patterns it's looking for are measurements based on your face. Now, there are lots of different ways these algorithms work, but basically they're looking for, say, the distances between certain landmark features on your face: that could be the distance between your eyes, from your nostrils to the tip of your mouth, between your eyebrows, all those sorts of things. So they will be scanning faces, they will be measuring those little points, and then they will be comparing them to a database. There are different ways that could happen. You might be doing a one-to-many find, where you have a face that you've, say, captured on CCTV and you're looking for it in a big watch list of people, or it might be a one-to-one thing, which is what happens at a passport border check, where you have one picture, one face, and you're just looking to see whether they match or not. Those technologies are implemented in various ways. Some people use them to pull footage from CCTV that they analyze later, some do live CCTV stuff — that's happened in London, for example — and some want to integrate them onto body cameras as well. So there are a lot of different ways this is coming up, but you're basically getting an algorithm that looks at your face and looks to see if it matches another. Let's start at the top, right: there is the Gender Shades paper, there is this enormous body of evidence that facial recognition systems are biased along gender and race. Just unpack that a little bit — what does that actually mean in practice for people? What should people understand about that? So that means that the same system, when it's looking at a white face versus when it's looking at the face of a person of color, is going to be just less accurate when it comes to matching identities, or when it comes to simply saying, say, what gender it thinks that person is. These algorithms consistently get lower scores for non-white faces. It's really as simple as that. The huge, terrifying, scary thing is when you think about how those judgments are then going to be used by people with power over your life, whether that's law enforcement or whether that's a private company — many of whom are buying these systems and integrating them, say, into a watch list for a shop — and they are going to start going, well, the system says you're on our naughty list and you can't come into the shop, and what recourse do you have in that scenario, right? So I always think of this in terms of the TSA. As a brown man who regularly travels with a bag full of batteries and wires, I have many interactions with the TSA, but I'm in the airport.
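A minimal sketch of the matching logic described above, assuming a toy landmark-distance "embedding" rather than any vendor's learned model: verify is the one-to-one passport-style check and identify is the one-to-many watch-list search, both using a distance threshold. The landmark vectors and threshold are invented for illustration.

# Minimal sketch: face "recognition" reduced to measure-a-face, compare-measurements.
# Real systems learn embeddings with deep networks; here the embedding is a toy
# normalization of made-up landmark distances.
import numpy as np

def embed(face_landmarks: np.ndarray) -> np.ndarray:
    """Stand-in for a learned embedding: normalize raw landmark distances."""
    v = face_landmarks.astype(float)
    return v / (np.linalg.norm(v) + 1e-9)

def verify(probe, reference, threshold=0.1) -> bool:
    """One-to-one check (e.g. passport gate): same person or not?"""
    return np.linalg.norm(embed(probe) - embed(reference)) < threshold

def identify(probe, watchlist, threshold=0.1):
    """One-to-many search: return the best watch-list match under the threshold."""
    best_name, best_dist = None, np.inf
    for name, ref in watchlist.items():
        d = np.linalg.norm(embed(probe) - embed(ref))
        if d < best_dist:
            best_name, best_dist = name, d
    return (best_name, best_dist) if best_dist < threshold else (None, best_dist)

# Toy data: vectors of pairwise distances between facial landmarks.
alice = np.array([62.0, 41.0, 18.0, 33.0])
alice_cctv = alice + np.random.default_rng(0).normal(0, 0.5, 4)  # noisy re-capture
watchlist = {"alice": alice, "bob": np.array([55.0, 47.0, 22.0, 29.0])}

print(verify(alice_cctv, alice))        # likely True: small embedding distance
print(identify(alice_cctv, watchlist))  # likely ("alice", small distance)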

IBM Microsoft Amazon James TSA Aclu Timmy Gebru Nautilus America Donald Trump AI Watson London Marcus
Is This the End of Facial Recognition?

Slate's If Then

06:34 min | 11 months ago

Is This the End of Facial Recognition?

"Back in the summer of two thousand seventeen before she her research on Algorithm. Bias, Deborah she was working at a company called, clarify a computer vision startup, and that's where I got introduced till machine learning, and Ai, and I kind of entered the research world, and I remember the first time I saw my first face data set. Noticing right away that there was a lack of diversity in representation in the data sets, and I was trying to have this conversation with people in my office, but also just more broadly. I was trying to say like. Hey, I think this is the problem, but the response is always like it's so hard to collect data. Why would we think about this extra dimension of representations like this is so hard to do what you're asking for is something that? Is so difficult and like this is the way it's done in. Everyone's accepted. That was sort of the response. I was getting. At the same time, the dead was noticing this lack of representation in the data set. Another computer scientists had noticed it, too. Hello I'm joy. Appoint of Code on a mission to stop an unseen force that's rising a force that I called the coded gays. My term for Algorithm make bias joy. Polin Weenie is a researcher at the MIT Media Lab. And back in two thousand sixteen. She gave a Ted talk about her work. Algorithm MIC bias like human bias results in unfairness. How? Are you like viruses can spread bias on a massive scale at a rapid pace, so yeah finding Joyce Ted Talk. Was this important moment of like all my gosh. There's another person that cares about this I reached out to her sort of exactly. Not Moment and ended up working with her after it's the project they worked on along with Timmy Gebru. is called gender shades. And we talk about how it's probably not a coincidence that her me to make a brute where like all black women were like I, don't think. We're we're all the people that notice this. At the time. The researchers knew that these facial recognition programs with limited data sets were being used by law enforcement agencies around the country. So inaccurate results could have real world consequences. Can you describe? The work you did and what it showed. JANNASCHII is a blocks audit of commercial AI products, so these are tools that companies today, so and clients currently use so nothing that was audited. Experimental everything was in the wild in us. This is mass market stuff. Yeah, exactly what if he just tested these products on a benchmark that was representative with respect to gender and race? What would happen? What would we discover and what we discovered was that when you test these products on darker females, it performs thirty percent worse than it did on on ladder males. And that was the really big discovery was that these products were not actually things that worked well for everybody that they were saying it to, and that coupled with the reality that they were selling this technology or pitching this technology to ice at the time on two different intelligence agencies to local police departments was really alarming. It was something that demonstrated the fact that. Facial recognition at this point is disproportionately being used to sort of monitor and severe minority communities. That's part of this law enforcement pitch, but also not performing as well on those two meetings, which is obviously scary, alarming safety risk. What do these companies do with the police? What are these contracts for? 
In some cases, it's something reasonable, such as attempting to shortlist a group of suspects for a crime. So they might have security footage, and a bunch of faces in that footage, and try to identify who in their mugshot database fits or aligns with the faces they see in the video. That's a lot of what they do — this idea of face verification, matching a face I have in my data set to a face I know is from a suspect in a crime, and if I have a huge data set, how do I do that quickly and efficiently. In the case of shortlisting suspects, it feels not as bad, but in a lot of cases they might also use it on sketch photos, where they don't have a picture of the suspect. They'll ask victims to describe the person to someone who will sketch it out, and then they'll put in the sketched photo and use that to search through their database of mugshots. Yeah, and you can imagine just how many false arrests happen as a result of that. Even though there was research from people like Deb that showed major flaws with the technology, all the big players in the field kept their products on the market. That continued until Monday, when IBM announced they were stopping their program entirely. So this week IBM says they're no longer going to offer or develop facial recognition technology, and I wonder, as someone who is immersed in this and studies it, whether you look at this announcement differently than a regular person reading the headlines. So, since we've been watching these companies effectively for a while, I know a lot more of the backstory leading up to that announcement. It's not a spontaneous decision, and I don't think it's as bold as IBM has made it out to be. Right now the perception is just, wow, IBM abandoned all of these important big contracts and spontaneously made this decision, and because it's happening at a moment of high racial tension — the protests, but also just a lot of reckoning with respect to the racial history of the United States — it seems as if IBM had this realization in light of the protests and everything that's happening. But the reality is that they've been working towards this position for a long time, and this is the most financially beneficial position for them to take at this moment. How so? So IBM was called out in Gender Shades, and they were quick to respond: within four months they had released a new product in response to the revelation that there was this huge disparity in performance across different demographics. They also tried the idea of, let's build a big data set to fix it. That led to its own exposé, because they had used Flickr images without any consent in order to collect that many faces, and it ended up being this embarrassing situation where the conversation around privacy and consent was completely neglected
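A minimal sketch of a Gender Shades–style black-box audit: score a classifier on a benchmark that is balanced across skin type and gender, then report error rates per intersectional group rather than one aggregate number. The commercial_api function here is a stub with invented error rates standing in for a real vendor endpoint; the real study called vendors' gender-classification APIs on the Pilot Parliaments Benchmark.

# Minimal sketch of a black-box, disaggregated audit with a stubbed "vendor API".
from collections import defaultdict
import random

# Balanced benchmark: every (skin type, gender) intersection is represented.
benchmark = (
    [(f"img{i}", "darker", "female") for i in range(100)]
    + [(f"img{i}", "darker", "male") for i in range(100, 200)]
    + [(f"img{i}", "lighter", "female") for i in range(200, 300)]
    + [(f"img{i}", "lighter", "male") for i in range(300, 400)]
)

rng = random.Random(0)

def commercial_api(image_id, skin_type, gender):
    """Stub for a vendor gender-classification endpoint.
    The error rates are invented solely to make the disaggregation visible."""
    error_rate = {"darker": {"female": 0.35, "male": 0.12},
                  "lighter": {"female": 0.07, "male": 0.01}}[skin_type][gender]
    wrong = rng.random() < error_rate
    return ("male" if gender == "female" else "female") if wrong else gender

# Disaggregated (intersectional) error rates, not just one aggregate number.
errors, totals = defaultdict(int), defaultdict(int)
for image_id, skin, gender in benchmark:
    group = (skin, gender)
    totals[group] += 1
    if commercial_api(image_id, skin, gender) != gender:
        errors[group] += 1

for group in sorted(totals):
    print(group, f"error rate = {errors[group] / totals[group]:.2%}")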

IBM Joyce Ted Talk Researcher Deborah AI Mit Media Lab Timmy Gebru. Representative Ibm. DEB
"gebru" Discussed on This Week in Machine Learning & AI

This Week in Machine Learning & AI

07:44 min | 1 year ago

"gebru" Discussed on This Week in Machine Learning & AI

"There's been all sorts of legislator pass around face recognition right and that for the US pretty fast from like writing gender shades to like this kind of and and of course many people like the at the Center for Security of what is privacy security. George Tron Law Call Laura Claire Garvey Alvarado realities people. They do the most amazing work online tracking the use of face recognition or really automated facial analysis in the US by law enforcement. Whether it's ice whether it's a other types of law enforcement and they had the perpetual lineup report was the the first one America Andrew Watch was the second one so important you know in conjunction with like a lot of people's works. I feel like that has resulted in some amount of change inch in legislation within the last year right and I think we're going to start to see more of that Another thing I've seen actually recently. Is that so much more so many more people in civil civil society are part of this conversation now like the ACLU and many of many other people are part of this conversation about about whether some sort of technology exists or not and like a governess and things like that. And I think that'll continue to grow one thing. I'm really worried about an I. I am predicting this will happen at Started tapping already is once again. The taking over of marginalized voices right on in this space so like I said like people work really hard to make something fain and then once it's a thing they're the people who didn't suffer the consequences of trying to make it a thing kind of become the faces in the heads and the and the people who steer the ship and at that point it starts going in the wrong direction because it's not really trying to address the issues the people who took the risks to make thing because they took that risk because what they really care about is addressing the issues themselves right and so that's one thing I think will happen and increasingly seeing that already. Is there absolutely. I mean it's hard to give specific that I will say like a lot of institutions like you know for example San Francisco Actually Stanford h-share was started want partly by my advisor right by the MIT. It's a so there was a big announcement saying like there's going to be a school of computing and there's going to be a focus on ethics and things like that right but like literally if you can't even have like two black professors like you know what I'm that's what I'm talking about so I think that kind of stuff to be very hard to tackle absence on the part of Tabei H I was actually also talking about my but exit exist so with. Hi is true right. And I believe they're like I talk to my adviser about it too. Unlike I think now they're starting to be a little bit more communism about it but I mean as an institution you're going to have money from like different groups of people in those groups of people are going to be if you have the name and if you're not really from that much of a margin is gonNA be more likely to raise this money and stuff like that right whereas a lot of other other groups of people that are from our communities in smaller institutions are gonNA struggle to get credibility in and raise money but like. Yeah what what I was seeing was for example like They're announcing this huge endowment. And this you know ethics and all all of this stuff but when I look got their faculty just in general like in computer science or engineering in Lake they may have one black person. I don't know but you know what I mean. 
So it's really hard to — I mean, Stanford, I believe, used to have graduated zero black people with a computer science PhD. I believe there was maybe one person ever in Stanford's history who has graduated with a PhD in computer science, right. And so it's just impossible to have ethics initiatives without addressing these kinds of issues — there is something wrong in a system that is not allowing these groups of people to thrive and even to be present. So this distinction between quote-unquote ethics in theory versus in practice, in the real world — I've seen more of it being discussed in the last year, but I don't anticipate much more of it happening, like, in the next year. Yeah, I don't know that I would necessarily call it a shift in the conversation, but there's definitely something happening — maybe a tension — in the conversation about the extent to which ethics and fairness work should be grounded in self-interest versus grounded in protecting the most marginalized, at-risk people. Who and what should this be about? Are you seeing anything there? Yeah, and I'm seeing, I think, a bifurcation in the community, and that could continue in the future. So there's a camp — and I'm probably in the camp that says I just don't like the separation — but there's this other view that says, we're doing the theoretical work, which is the math and the proofs and stuff, and then there's all this activism that's happening, you know, the diversity and inclusion and labor organizing and that kind of stuff, and those should be separate, right. So if we're having a theory meeting about fairness with thirty people and none of them are black, that's okay because we're doing theory. And I would just completely disagree with that — that's expletive, because then you're exploiting a particular community that you're talking about in your papers to advance your career, and it's not helping that community at all. And then there's a camp that says we have to be much more interdisciplinary, we have to have fewer boundaries between disciplines — diversity and inclusion work and all of this labor organizing is just a part of this. I'm very much in that camp, and many people on our team are in that camp, and I think that's why they gravitate to our team, because our team is one of the few homes for people who believe that. And so, going back to where I started — because this came from Emily's talk — it's that idea of the view from nowhere. If you say it's okay that we're only doing fairness-related theory work, and there are thirty people here and that's fine, that is the view from nowhere. That's assuming that the view of those thirty people is not affecting your theory — that it's a theory you think is going to help this particular group that these thirty people are studying, independent of the thirty people themselves. Yeah, so that's the assumption, right, this view-from-nowhere assumption, which feminists have critiqued for a very long time.
And so I'm in that camp too. Timnit, thanks so much for taking some time to catch us up on your view of the ethics, fairness, and AI landscape. Certainly we could not do justice to this conversation in just under an hour, but it's something that we'll continue to explore on the podcast, and I'm looking forward to our next conversation with you. Thank you for having me. Thank you.

US Stanford ACLU Laura Claire Garvey Alvarado George Tron Center for Security America Andrew Watch MIT San Francisco advisor Congress Vance lake Emily Cassandra
"gebru" Discussed on This Week in Machine Learning & AI

This Week in Machine Learning & AI

13:51 min | 1 year ago

"gebru" Discussed on This Week in Machine Learning & AI

"Do you see any salient changes over the past year in the way fairness is approached or advanced from a kind of practical industry perspective. Yes definitely so. I guess I'll do a plug here on one example will man. Yeah I mean I didn't mean to do it like about one. Example is model cars for model model reporting which Meg Mitchell and so so many people in our team and across collaborations of worked on right and so a bunch of US had been related to the data Pittsburgh datasheets. Yeah Yeah exactly two models basically yeah and so a bunch of like you know I realize when when I was thinking about does your data says Hannah Jenin. And how Jamie and Briana and all these people and Jalan Kate Kate Crawford where we're thinking about it mm-hmm than Emily Bender and Bathrobe Friedman also how to have this paper data statements for NLP so that was like two years. It's it's really interesting. We were independently sort of like thinking about this at around the same time kind. Of course we're talking to each other. We're inspired by what. What what we each is the other person says and a couple of years ago it was like hey we think we think we should do? We think you know Shabtai she's do sets. And here's why you have to justify here's why you'd think we and then around the same time like before I arrived at Google there were there. Were also working on. You know disaggregated testing and thinking about how they could apply this to models in some way back and now fast forward just down like just a like take a break before thanksgiving a Google announced model cars from our reporting as part of their cloudy. I like explain ability to go kit and like you know you can go see example models. I mean this is just v. one kind of also to help gather feedback from people to see like what works what doesn't work raw but this is actually a thing for a real little and folks on ethics team have worked on. That is now part of Google product that people you can just wipe before we were trying to convince everybody. This is the thing that should happen. Now it's a start right like so for Real model real. Google Mollis not a toy. One right that is being sold to people you can see you have a model car more it right and and it was was. It's not a piece of cake to get an institution to do this right like it's not and it is like requires so much hard work. Unlike Meghan Parker and Andrew mean so many people it was like it's required so many people from different with different expertise in different organisations. And so. That's one thing I'm seeing right and I'm seeing actually conversations around data custodians of just like higher level concepts conversations. I believe they use having all of these conversations one they. I've seen it. I think is that so many organizations are talking about having ethics principles or guidelines. And things like this but then the the question is like how do they get enforced right like that's one shift I've seen in the last year right like there's a lot of air principles and guidelines and all that stuff and then the next kind of your you're to whatever is like how do they get enforce. How do we how do people? How do people believe that they will be enforced We wrote recently like with the Deborah Jane Andrews Martin in many other people. We wrote a paper trying to kind of think about this but like but I think that's the next so I guess I was is doing a retrospective what I want in the future. 
Kind of thing Lake Lak- so yeah so I think that like model cars being real for me is a huge thing and and I don't know I think you know there's many the partnership has this about Mel Project now and it's also donald that's about mel is like the accurate and so they have this like it's it's based on data. She's a model cards in fact she's from. I'm from IBM and they're trying to see how this could actually be right and it's you know it's funny like what would we write papers and stuff. We this should happen. But then when you implemented it's also like like how many stakeholders are there like what's the right granularity of of documentation for whom and in what format. It's so much stuff right so but it actually but the fact that it's actually starting to happen is is is a is a pretty big deal for me. There was A. I don't know if it was a workshop. I imagine it was a workshop that I I saw. I only saw the title of this on one of the electronic signs in the Conference Center. Here in Europe's that was intriguing. It was minding the gap ethics in fairness what that was about. I see a lot of people actually for my team. Alex Hanna and Vinod for my team are speaking at the other panel. There meredith breath is was there. I think I I know I know about this workshop. I wasn't involved like so basically Maybe next year this will change but when I come to nearest I will. I don't get involved in any workshops I don't I don't give any talks. I don't because basically black and then so really hard to do anything else so oh I cherish whatever session I have to chair I go there. You know but like it's hard to take on anything additional but yeah but this workshop. I know I've I've heard about it because of this but it's actually good that you know they house people like mayor they have another person I believe who was one of the the people who just got fired before Thanksgiving for I don't know I mean one of what. He was one of the people who organized who are organizing organizing against Google's relationship with I guess ice right And so it's hard rightly when you're working at an institution like there's their policy see and then there's you as an individual working ambi like well like I'm an immigrant. I mean I was a refugee and so how can I talk. How can I work in ethics? And whatever and I don't care if you have a model card for the model that you're giving them right. These are the leaders right. And and you have you know Like a relationship with him so he was. You know one of the people who was working on this. I'm organizing against this. Who who was fired right before Thanksgiving and has you know? Married his Co. organized. A walk out all of this anti even activism and. She's huge obviously may ethics Canadian. She's no longer there right so and it makes me wonder like if days are numbered or not. I hope not. I hope that you know model. Car uh-huh yeah. What are you seeing happening on the commercial side of things like I'm hearing more and more like IBM for examples? Also you know got fairness three sixty and I forget what the other. They've got several of these kind of ethics fairness focused things they're start ups at are focusing on explain ability which kind of plays into this How do you think about the you? Know the Commercial Mersal space or even open source like the the set of activities going towards making this more tangible and accessible to practitioners. Yeah like I think you know again like a lot of companies are coming out with toolkits and later. Ibm One is a good one. 
Google is integrating a lot of things into TensorFlow — there are Fairness Indicators, for example. There's also a lot of educational material that Andrew and others on our team worked really hard on, various Colabs and other materials that are out there, so for anybody who wants to learn a little bit more about what things to watch out for, et cetera, there's stuff happening. Also, while we're on the topic of IBM: again, this brings up the complexity. They also came out with a data set, the Diversity in Faces data set, but there are other things that show the complexity of having this kind of data set, because people are talking about the fact that it was scraped from Flickr. Even though you have Creative Commons licensing, when people put their pictures on Flickr they didn't know that, ten years later, they were going to be used for this kind of thing. And I know there are some cases of people suing — specifically IBM, but also other academic institutions — I mean, this is the kind of thing where everyone treats humanity as data, right. So that's other stuff that's happening. Then there's Facets and other kinds of tools to visualize your data sets — Facets, you should look it up, it's great. It's a toolkit that came from PAIR; we actually used it — Joy and I used it for Gender Shades; I mean, Joy and Deb worked on the website — and it's a visualization tool to help you explore your data sets and all sorts of other things. So a lot of companies are coming out with these kinds of toolkits to help you analyze things, and I do believe there's a whole bunch of people working on Python libraries for some of these things. But another side of the story is that some people, again, are cautioning that these toolkits should be seen as a way to explore what's happening in your model and data — they shouldn't be seen as a checkmark. One thing that has happened with the proliferation of fairness-related work, I believe, is this "we're de-biasing X" or "fixing Y" framing, and that gives the illusion that if you do X, Y, and Z, then it's fixed. Inherently, this is a very complex problem; it's context dependent, it's domain-knowledge dependent, and I sometimes worry about how to get that across even when all of these toolkits and documentation practices are around. There's an inherent tension between democratization — raising the level of abstraction and providing tools — and what we call leaky abstractions, the fundamental complexity of the thing you're trying to make more accessible. Yeah, and I would even say, Ayanna Howard and many other people have done work on automation bias — people trusting automated tools more than they should, sometimes; people have done work on trust. So if I give you a bunch of tools and you do a bunch of tests, it shouldn't be, okay, I did tests X, Y, and Z and they passed. There's also a framework that needs to be developed around how to use those tools, and at which point in the product development process they're appropriate.
It just shouldn't be seen as a small checkbox you blow right past. Yeah — I'm curious, if you had to make some predictions about the field, as is kind of our convention going forward: what do you think we'll accomplish in twenty twenty, or, since we're talking about round numbers, in the next decade, for that matter? I definitely can't do the next decade. I do think we're probably moving towards a lot of discussions around standards, such as the ones that model cards and datasheets point towards, and how those fit into governance of these kinds of systems. Like we said, many organizations have AI principles and the like — how are they enforced, right? And also, how do we ease the burden on people who don't have such resources? I wrote a paper recently with Eun Seo Jo, who's a history student, and we were talking about lessons we can learn from archival history and their data collection processes. They have data consortia where they pool resources together, and if one library doesn't have a certain kind of collection, they can use another library's collection. And so with this increased burden of documentation — which in the end we believe is necessary — and checks like GDPR and all this stuff, how do smaller institutions and nonprofits and those without much resources not get left behind? Because I believe that's also a fairness question: if you are doing things that only larger institutions can manage, then those are the ones that benefit, and that constrains who gets access to what resources. So I believe there will be conversations at the governance level around data consortia, and conversations around standards. I'm wondering if maybe within the next year we might start to see some organizations — maybe the EU, or maybe some other bodies — starting to think more about governance, and whether some of these things that we've worked on as researchers, like datasheets and model cards, will be part of that governance structure. Because, for example, when there's — oh, I forgot the big thing in the last year, you know..
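A minimal sketch of a model card as a structured, machine-readable record, loosely following the section headings of the Model Cards for Model Reporting paper; the field names and values here are illustrative assumptions, not Google's Cloud schema or any shipped product.

# Minimal sketch: a model card as a small, serializable data structure that can
# be published alongside a trained model. Fields and numbers are illustrative.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ModelCard:
    model_name: str
    intended_use: str
    out_of_scope_uses: list = field(default_factory=list)
    # Disaggregated evaluation: a metric per subgroup, not one aggregate number.
    disaggregated_metrics: dict = field(default_factory=dict)
    ethical_considerations: str = ""
    caveats: str = ""

card = ModelCard(
    model_name="face-attribute-classifier-v1",
    intended_use="Research benchmarking of attribute classification.",
    out_of_scope_uses=["Law-enforcement identification", "Surveillance"],
    disaggregated_metrics={
        "accuracy/darker-skinned female": 0.65,
        "accuracy/darker-skinned male": 0.88,
        "accuracy/lighter-skinned female": 0.93,
        "accuracy/lighter-skinned male": 0.99,
    },
    ethical_considerations="Known accuracy gaps across skin type and gender.",
    caveats="Evaluated only on frontal, well-lit images.",
)

print(json.dumps(asdict(card), indent=2))  # publish this alongside the model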

Google IBM Pittsburgh Hannah Jenin Meg Mitchell Jalan Kate Kate Crawford Europe Deborah Jane Andrews Martin Shabtai EU Lake Lak Jamie Emily Bender Conference Center Meghan Parker Briana Alex Hanna Friedman
"gebru" Discussed on This Week in Machine Learning & AI

This Week in Machine Learning & AI

13:45 min | 1 year ago

"gebru" Discussed on This Week in Machine Learning & AI

"I had to attend her talk right before he gave a talk and Jamila. Smith's LAUDE and Andy Smart just wrote a paper about critical a critical race theory approach for fairness. That's right so so for example the fact that so joya now we're talking about how in in our paper gender shades. How wease uh a social construct right? It's it's unstable crossing myspace etcetera etcetera in this paper they were they. Were talking about how you have to really engaged with critical race theory methods. And how you have to you know you ha raises again and a social construct that sometimes. Maybe that's not what we WANNA use for. annotating data sets Same with the gender and but then there's a tension between you know if you're annotating for let's say Gender right you you need to make sure that you are not further harming communities by making a binary three by a further adding like a data additional data that needs to be added a from groups of people. So that's like additional privacy. Seve risk making sure. I think I forgot this paper that talks about like the burden on the minority groups. When you're when you're trying to get more data from them but at the same time in order to equalize parody across subgroups or even not even equalised barry but like just to have Perry but just to test out about how well something is doing across subgroups? You have to define the subgroups what they are. Were what those boundaries are and you have to then go and Gather additional data sets for that. So what problems occur when you're thinking about that process itself now. I'm seeing a lot more discussion on that. Ah On that process itself. Ucla me. When I first started working on it it was like even the notion that you have to injure intersectional testing so so you don't just test if my model's doing well on everybody what does well mean so? There's a lot of papers on defining notions of fairness ernest. There was and then we had to injuries of the concept of. Hey you have to do this aggregated testing which means you know. Don't just say as my model during wildfire like women. Women is is just doing well for like you know is doing welfare like this group of people this race that race you have to say. Is it doing well for darker. Skinned women lighter skinned Women mandate that that concept was a concept that was like kind of entries right. I mean we're just saying like how important it is to do that to break it down like that now. The question is attempt to make sure people understand. Hey that's really not the only thing you don't don't assume that every single will question about fairness is whether the the performance is equal across different suppers. Right it's about. Who has the data who doesn't how it's being used whether a task should exist or not etc etc and then even in cases where you have to define subgroups dessy the models else performance on those subgroups you have to like creating? Those centers is not a piece of cake right like the complexities that arise when you define mindset subgroups. Who's doing the defining the taxonomy that you use so like I'm seeing much more nuanced Like you know discussions around so this paper that I was discussing being critical race theory method for fairness I always. I'm so bad at these. Titles is yeah. Yeah it's it's those those kinds of things and that's what I'm excited especially just like the understanding the complexity of what we're talking about right it's not just about defining what is fair and so you have a mathematical medical definition for it. 
It's also about how your model interacts with the society that you're in, what kind of documentation you have available, how you take feedback on that. And I'm starting to see that conversation right now. So what's driving this increased appreciation of the nuances here — is it the broadening of the folks that are in the field? Absolutely, I think it's the broadening of the folks that are in the field; for example, more of us finding each other's voices and amplifying each other's voices. So, for example, Emily Denton is very well known for generative models — that's what this community, the NeurIPS community, knows her for. She was invited to give a talk at the retrospectives workshop — do you know about this? You're supposed to talk about a paper or something that you wrote, and what you would say about it now. And she did a retrospective on computer vision as a field, and she was discussing all of these things I'm telling you about: the view from nowhere. She was talking about this concept of the view from nowhere and how it has been critiqued by feminists, for example. The view from nowhere basically means that scientists assume their view is objective — you're trying to find objective truth, so it's not from anyone's point of view; it's as if there is objective truth and it's a view from nowhere. She was describing how this underlying assumption — this feeling that our work is the view from nowhere — is driving a lot of these power dynamics and a lot of the issues we're discussing. For example, she was naming certain data sets that we don't even question, and one example is the CelebA data set, which is a data set of faces that is very widely used in computer vision. One example I want to give you: even when people are thinking about fairness, they would use CelebA. It's a data set that has forty annotated facial attributes — smiling versus not, young versus not, attractive versus not, you know. People use this data set to train all sorts of models, especially in the GAN literature, and even when they're writing about fairness, many people would use it and just say, oh, we're trying to make sure that the attractiveness label is not dependent on gender, or something like that. But you also have to question whether we should even have a data set with an "attractiveness" label in the first place, who annotated this data set, and where this kind of model is going to be used and by whom. So that was the kind of thing Emily was talking about — data annotation practices, data collection practices. There's a long, long way to go to even have people listen to this conversation. And yes, for sure, I was called an activist, I was given all sorts of labels for having this kind of talk. But what I'm saying is that, at first, we had to just say, hey, you have to pay attention to this thing, and we couldn't be too super nuanced about it.
And now we're saying: hey, when you're thinking about fairness, don't just equalize some metric across subgroups and publish a paper — here are all the nuances, it's a system, and here is how we should think about it as a system. That's really the biggest thing I've seen. And when I say a system, it's really the labor organizing that Meredith especially does, and how that feeds into, for example, organizing for contract workers, who are the ones annotating these data sets we're talking about — how they're treated, et cetera, what the economic incentives are, and how that drives some of our decisions, and how that feeds back in. So more people are starting to look at it as a system. At the same time, fairness has also become much more of a mainstream thing: in the CVPR call for papers, I saw an actual explicit bullet point for fairness, accountability, transparency, and ethics in computer vision, whereas last year I co-organized the first workshop on fairness, accountability, transparency, and ethics in computer vision. So it's starting to be a mainstream thing at all of these different conferences — a shift that I'm seeing, and I'm wondering if it's new, or if you're seeing it also. It's interesting, because it feels like prior conversations about fairness and ethics, or at least a lot of the ones I've been exposed to, have been about, as you've been describing, tools to analyze the predictions you're making, or your data set, for bias, things like that, and I'm hearing more conversations that ask more fundamental questions, like, should we be doing X at all? Exactly — I'm seeing more of that conversation now. In the beginning, it was more like, hey, there's this thing called fairness; oh, that's a paper about this thing called fairness; here's a survey. Yeah, that was really the level. So in the beginning, I think many people, especially in theoretical computer science, were working on fairness as just a subfield, and I would say a lot of them were not engaging with critical race theory or feminist work or anything like that. Now there are a few people who are engaged in both, bridges between those two communities, and there are some people who are still in one community or the other. But I believe the conversation right now is almost trying to address the fairness community itself. Initially, people were just trying to have a community at all, and now I don't think the case needs to be made that people at NeurIPS or CVPR or these other conferences should work on fairness. Now the conversation is more: when you think of fairness, don't just take a data set that's already problematic — think about how it's problematic, and what kinds of tools you can use to analyze why something is a problem in the first place — the view from nowhere, critical race theory. Another conversation we've been having a lot is about global notions of fairness. A lot of fairness work is being driven by North American people in general — people who might not even be from here but who have a US perspective, from the US or from Canada; I wouldn't say Mexico is very highly represented in this work. They have a certain perspective, and a lot of it is grounded in, let's say, US civil rights law or something like that. So what about other places? What transfers and what doesn't transfer? Those are some of the conversations people are starting to have, and that's what I'm excited about. I'm excited about more awareness of how you can't just sit in a corner, write equations, and "do fairness." Honestly, for me, it really always comes back to what people do in international development as well. There is a term — I believe it's the seductive reduction of other people's problems. It's like, okay, when you're walking down the street in your own neighborhood and you see homelessness, you realize how complex it is to work on homelessness. But when you're watching TV and some person is holding an African baby somewhere — you know what I mean, there are always these shows — you might say, oh, let me go solve that, because you don't understand the complexity. You're not thinking that maybe there are a lot of people who have ideas on how to fix their own problems and just don't have visibility, don't have toolkits. So how can you uplift them? And one thing I worry about with the theoretical fairness community is, for example, there was just a fairness workshop held at the Simons Institute, and I believe there was not a single black person there. You know what I'm saying? You have fifteen, sixteen people, it's a closed thing, and they're doing mathematical research on fairness, their papers are getting published, they're getting tenure talking about black people, and there's not a single black scholar there. I think that's expletive. And they would probably say, well, we're looking for this specific expertise and there are no black people with this specific expertise — but part of your job, if you really do care about fairness, whether it's theory or something else, is to make sure there are people in that community who can also get the scientific credit for this work. And that's really, for me, what's missing in this community, and the thing I worry about.
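A minimal sketch of the disaggregated testing described above: the same set of predictions produces a reassuring aggregate accuracy while the breakdown over the intersection of skin type and gender exposes a gap. The subgroup labels and numbers are illustrative, and, as the conversation stresses, choosing the subgroups is itself a consequential design decision.

# Minimal sketch: aggregate accuracy can hide an intersectional gap.
import pandas as pd

results = pd.DataFrame({
    "skin_type": ["darker"] * 4 + ["lighter"] * 4,
    "gender":    ["female", "female", "male", "male"] * 2,
    "correct":   [0, 1, 1, 1, 1, 1, 1, 1],   # each prediction scored right (1) or wrong (0)
})

print("aggregate accuracy:", results["correct"].mean())

# Break the same predictions down by the intersection of attributes.
by_group = results.groupby(["skin_type", "gender"])["correct"].mean()
print(by_group)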

Emily Denton joya Jamila barry Andy Smart Smith Perry US Ucla Grigson researcher amnesties Mexico Simon's institute Nestle Kevin meredith Gan literature
"gebru" Discussed on Good Code

Good Code

02:39 min | 2 years ago

"gebru" Discussed on Good Code

"<SpeakerChange> Yeah. <Speech_Female> This is a really <Speech_Female> good question. <Speech_Female> So I'm in a <Speech_Female> disclaimer is that <Speech_Female> I'm not good <Speech_Female> at predicting <Speech_Female> the future. So <Speech_Female> I can't <Speech_Female> like say <Speech_Female> much on the future <Speech_Female> bet. But I <Speech_Female> think like in <Speech_Female> each domain <Speech_Female> like you know, if you look <Speech_Female> at hiring or fuel <Speech_Female> again, criminal <Speech_Female> Justice or <Speech_Female> if you look at other domains, <Speech_Female> it's really important <Speech_Female> to talk to the <Speech_Female> people in work with the people <Speech_Female> who are in that <Speech_Female> domain, for example, <Speech_Female> in criminal Justice. <Speech_Female> People who <Speech_Female> who've been <Speech_Female> at lawyers for <Speech_Female> death row inmates for <Speech_Female> years, or people <Speech_Female> who are <Speech_Female> public defenders <Speech_Female> people who know this system, <Speech_Female> people who, who <Speech_Female> have a <Speech_Female> lot of <SpeakerChange> expertise, <Speech_Female> because they <Speech_Female> because <Speech_Female> right now, <Speech_Female> mostly computer <Speech_Female> people, <Speech_Female> <SpeakerChange> there are <Speech_Female> a lot of people who are doing <Speech_Female> interdisciplinary work, <Speech_Female> but I'm cautioning <Speech_Female> against, you <Speech_Female> know, <SpeakerChange> a working <Speech_Female> to make <Speech_Female> the existing <Speech_Female> just the existing <Speech_Female> systems warfare <Speech_Female> without <Speech_Female> thinking about whether <Speech_Female> such systems should <Speech_Female> exist in the first <Speech_Female> place. And so, <Speech_Female> but so working with <Speech_Female> these people were <Speech_Female> domain experts, they <Speech_Female> can <Advertisement> give some <Speech_Female> <Advertisement> advice <Speech_Female> on what <Speech_Female> kinds of useful <Speech_Female> machine learning <Speech_Female> systems. We can make <Speech_Female> for this specific <Speech_Female> case, you know, <Speech_Female> Finally, <SpeakerChange> I'd like to <Speech_Female> ask you like <Speech_Female> to end by <Speech_Female> asking people <Speech_Female> if they're up to mystic <Speech_Female> about their field. So <Speech_Female> are you up to <Speech_Female> mystic about the <Speech_Female> what the future holds <Speech_Female> in terms of <Speech_Female> accountability <Speech_Female> fairness <Speech_Female> Kenny, <Speech_Female> I actually help <Speech_Female> us spilled affairs <Speech_Female> society <Speech_Female> to <SpeakerChange> me. It all <Speech_Female> depends on people. <Speech_Female> The question <Speech_Female> is are my <Speech_Female> optimistic about <Speech_Female> people? It's a tool <Speech_Female> as <Speech_Female> a tool. It's not <Speech_Female> gonna magically destroy <Speech_Female> everybody and <Speech_Female> it's not gonna magically <Speech_Female> solve anything. <Speech_Female> So it all <Speech_Female> depends on <Speech_Female> <SpeakerChange> who is <Speech_Female> using it for what, <Speech_Female> <Silence> what are we willing <Speech_Female> to use it for <Speech_Female> <SpeakerChange> are? 
We <Speech_Female> empowering a lot of <Speech_Female> different people to have <Speech_Female> knowledge and <Speech_Female> abilities <Speech_Female> to use this tool <Speech_Female> for their <Speech_Female> own purposes. <Speech_Female> I want <Speech_Female> to work towards <Speech_Female> that goal <SpeakerChange> <Speech_Female> in a lot of industries <Speech_Female> went through <Speech_Female> the same <Speech_Female> <SpeakerChange> allot. <Speech_Female> <Speech_Female> I always give the <Speech_Female> example <SpeakerChange> of <Speech_Female> automobiles <Advertisement> where <Speech_Female> people were, <Speech_Female> you know, there were court <Speech_Female> opinions on whether <Speech_Female> or not the automobile <Advertisement> was <Speech_Female> inherently evil. <Speech_Female> <Speech_Female> There were so many accidents <Speech_Female> people do wanna use <Speech_Female> seatbelts. There were <Speech_Female> no legislation, etc. <Speech_Female> <Advertisement> So a lot <Speech_Female> of different industries <Speech_Female> went through this transformation. <Speech_Female> <Speech_Female> And it's, it's <Speech_Female> always very good to <Speech_Female> be self critical <Speech_Female> and reflective <Speech_Female> but not be too <Speech_Female> inward. And also try <Speech_Female> to learn from other <Speech_Female> industries in other disciplines, <Speech_Female> <SpeakerChange> and it gives us <Speech_Female> hope, right? Yeah. <Speech_Female> We, <Speech_Female> we belts. <Speech_Female> Now, you know, <Speech_Female> yes. So yes, <Speech_Female> it does give <SpeakerChange> <Advertisement> hope timid <Speech_Female> give thank you <Speech_Female> so much for your time. <Speech_Female> Thank you for having <Speech_Female> me, so much. <Speech_Female> Great conversation. <Speech_Music_Female> Thank you. <SpeakerChange> <Speech_Music_Female> <Speech_Music_Female> That <Speech_Music_Female> was good

"gebru" Discussed on Good Code

Good Code

01:55 min | 2 years ago

"gebru" Discussed on Good Code

"As tool, it's not going to magically destroy everybody and it's not going to magically saw anything. So it all depends on who is using it for what, what are we willing to use it for are? We empowering a lot of different people to have knowledge and abilities to use this tool for their own purposes. Welcome to good code a weekly podcast on ethics in our digital world. My name is she in Lebanon. I'm visiting journalist at Cornel tax digital life initiative, and I'm your host, Yep. Still French into episode. We speak about artificial intelligence more specifically about the limits of AI. You don't really understand how I works. Don't worry, you're not alone. Basically, when an algorithm is at work. We know that it's gonna find stuff where we don't know what. And we don't really know how that brilliant finish is not from me. It's from a French TV show called the bureau that follows the adventures of analysts and field officers at the French equivalent of the CIA, and it perfectly.

Cornel CIA Lebanon