Weigel, Government discussed on Commonwealth Club

Automatic TRANSCRIPT

Weigel: Now, back to our program. It seems that, thinking about where this work touches the human, where it intersects with our lives, there are different levels at which to think about intervening. There's the government-regulation level, at which something like the General Data Protection Regulation intervenes; there's the corporate level; there's the individual level; and there's all sorts of tech-worker activism and organizing that's been happening in the past few years and has gained a lot more attention, things like Tech Won't Build It, or the recent Amazon actions just in the past few days. Maybe you could say a little bit about whichever of these different levels speaks to you, how you would address it, whether it's the government or the corporation or a tech-worker coalition or the individual person worried about their privacy. How do you think about approaching those different kinds of agents and talking to them about their agency?

Yeah, I mean, as a social scientist, everything to me is about actor incentives and systems of power. So maybe my slightly controversial opinion here is that I don't think tech workers should have to unionize. I got into a bit of a Twitter debate the other day with someone about this. They had written this article, and they were talking about sort of a scorched-earth strategy for when you join a company and you find something unethical. And I'm like, you know, that is something a highly privileged person can do, because you can go without having a job, you can afford to be unemployed, you can also afford to potentially have a negative mark on your employment history. These are all considerations, and frankly it's a massive burden for someone to feel the responsibility of almost single-handedly having to push this change. I think anyone who's done it has only done so at very significant emotional and often financial cost, and that doesn't get talked about, frankly. So I qualified that: I think the people who can do that should. It was obviously a very friendly conversation on Twitter, but I felt like I had kind of a controversial opinion there. And of course I think collective action is important; I think the fact that these workers are acting collectively is what's important here. But this notion of the individual whistleblower, the individual person willing to go all out, is problematic to me for a few reasons. There's a really great Rebecca Solnit piece called "When the Hero Is the Problem," and it's been absolutely inspirational to me in thinking about why we need change in the system rather than appeals to individual actors. What she talks about in the article is that whenever she tries to tell a story about a feminist collective or movement, she often gets, "Oh, can't you just write another story about Greta Thunberg or Ruth Bader Ginsburg?" And her point is that when we have these individual heroes, it not only absolves the individual non-hero from action, it also dissuades us from collective action. It makes us think that the little bit I can do isn't worth it unless I'm willing to go all out, you know, quit my job, blow the whistle, et cetera, do something illegal. I think collective action is really what has motivated a lot of the change. So that's kind of the bottom-up.
From the top down, what I find really interesting is hearing lawmakers trying to grapple with how to regulate AI, what to do about it. From a lawmaker's perspective, they're trying to understand what this technology even is, right, and then also how to achieve regulation around something that seems to be constantly shifting and changing.

And how important do you think that is? I mean, I know different people take different positions on this. I remember when Mark Zuckerberg testified in Congress, all sorts of people were clowning the members, saying, "They say 'the Facebook,' they don't know what it is." And I found myself, to my surprise, defending these members of our democracy, defending the Congresspeople, saying, you know, they might not know how the chemicals work, but that doesn't mean we can't regulate drugs. They don't need to learn how to code to be able to regulate it. So where do you come down on this question of how much lawmakers or policymakers need to be educated about technology?

Yeah, I mean, I think it's like the question I get asked about the media and journalists, or about, you know, just the general public. I think the important piece is understanding the risk and the potential impact of it. So you're right: you don't need to understand how a particular chemical reaction works; you need to understand that when a particular chemical is in the water, it can lead to birth defects, right. What they need to understand is that if we don't have the right level of explainability and transparency in algorithms, harmful things can happen, and here's how harmful things can happen. So it's about them understanding how to map out the risk and impact space, and regulate, or, you know, pass guidelines that help companies create the right kinds of guardrails.

Maybe a side thought here: I know a lot of the narrative about tech gets dominated by the big tech companies, but what I actually find with a lot of the clients I work with who are adopting this technology is that, to them, the technology is not precious. It's one way they can provide their good or service to customers; what they're actually interested in is getting their widget to their customers. And in doing so, they're looking to trust in this technology that they too maybe don't know a lot about. So they want to be sure it's going to work the way they want, and they're actually looking for, basically,
guidelines or guardrails, from the government or whoever, to tell them: yes, this is an appropriate or inappropriate use of this technology; this is the way you should or shouldn't do this. So a lot of my clients actually aren't tech-averse, but they're a little wary at this point, and they're very, very concerned about the negative impacts this could have on customers, because they're ultimately concerned about the longevity of the organization.

And are there useful lessons from sort of the business sphere, from companies and customers, to be drawn for policymakers and citizens?

I think there are a lot of parallels. Companies do tend to get more immediate feedback from customers, because of quarterly revenue and so on, whereas as a politician you kind of have to wait for your reelection cycle. But there is this level of responsiveness, and I think a lot of lawmakers have gone back to their constituencies to understand: what are my problems at home when it comes to this kind of technology, and what does it mean to the people who live in my district? I think that's actually very compelling.

I mean, one thing I have noticed, as American politics has gotten so activated in the last few years, is how much more local politics has become. I think for a while it was quite global, then it became very nationally focused, and now it's actually become very, very local, and people are very concerned about what happens in their district. And I think this also translates into how we think about this technology: people are thinking very much about how it affects my day-to-day, how it affects my daily commute, or, you know, how safe my children are.

And you think that's a product of technology getting woven into the fabric of everyday local life more?

I think that's part of it, and the other part is this hyper-personalization. Everything is very, very tailored to us now, for better or for worse. That's another hour-long conversation we could have, on whether or not this constant inundation of things that are supposed to be made for you is making people more self-absorbed and selfish. But I do think it does lead, in a positive way, to people pushing back and saying, this is not the way I want this to be.

I know this is a bit of a non sequitur, but I know you have a critique of the smart city, or the idea of the sort of personalized polis coming to us through algorithmic systems. Do you want to speak a little bit to that?

Again, this feeds into some of the thoughts and fears I have about the surveillance state. Right now we're in a space where a lot of governments want to provide better services to their citizens by instituting quote-unquote smart cities. Now, if you asked anybody what the difference is between a smart city and a surveillance state, I don't think people could tell you, and I can guarantee you that within the next year they will be synonymous with each other. Fundamentally, the problem is in the value proposition. When people are sold, when lawmakers are sold, the idea of a smart city, it's always pitched around safety and security, right: we're going to catch the bad people, we're going to find missing children. Inherently it's about stopping the bad people, and in doing so, just like with these home surveillance technologies, there has to be an other, there has to be a bad person. And guess who the bad people are? They're the people we've always vilified. We aren't going to invent new villains; we're going to find new ways to
classify people as the old villains we've always had. So my fear, actually, and this goes back to my dissertation now that I think about it, is that this value proposition is not how society works. If we think about the literature on the social contract and the rule of law, we don't adhere to the law because we're so afraid of being caught by the police; we adhere to the law because we've all agreed that's the best way for society to work. We do the right thing because we feel we should, not because we're so worried that some drone is going to pick us up on camera. My concern is that by over-policing we're going to erode that social fabric. We're going to get bystander effects; we're going to get people not wanting to be involved because they'll be worried about liability, or worried about, you know, being tracked later. Ultimately, I'm very concerned that the whole concept of a flourishing society will actually be chilled if we are over-policing every little action that people take.

I taught a class called A People's History of the Internet last year, and I did two units that flow into each other: the Chinese internet and its growth, and then the Chinese surveillance state, which is a topic that has gotten even more poignant since I taught the class, you know, a year ago or whatever it was. But are there any practical... let's see, to put it another way: are there ways that the smartness of the smart city can be repurposed to something more productive? Is there any use for the Nest or the Alexa after all? Maybe I'll get in trouble for saying that. But is there any way to sort of reprogram the smart city without doing away with the whole concept?

Yeah, I mean, this feeds back into my critique of the idea that you just build the technology, throw it up into the air, and it lands where it may. The intent behind something is critical in ultimately shaping where it goes. So, as I was saying, I think the problem with smart cities becoming synonymous with surveillance states is that they're built on this narrative of fear, which ultimately is exclusionary. What I instead think we should imagine the smart city as is an extension of urban planning and urban design. What if this were basically digital urban planning, a way of merging our analog and digital selves in real time, in real space? And frankly, for anybody young, millennial and Gen Z,
their digital and analog lives are already very, very merged. Can you imagine what a fascinating physical manifestation of that would be? If that is our value proposition for what a smart city should be, it is by definition inclusive, because it is about engaging all citizens; it is about providing services to people as they want them. What it is not about is inspiring fear; it's about inspiring collaboration instead. So my dream would be, you know, if we actually put the notion of a smart city in the hands of urban planners and urban designers, and the actual public, we'd come up with some really fascinating ways to take the same technologies we're using to track and surveil bodies and instead use them to actually improve people's lives, to answer the questions we actually think about every day. Like, is my bus going to be on time? In San Francisco the Muni will always be late; you don't need an algorithm to tell you it's going to be late. Should I take an Uber or a Lyft or whatever? Or, you know, are my children getting to school on time? It's the myriad of questions we ask ourselves on a day-to-day basis, which actually aren't about surveillance or about safety and security. I think that would be a really interesting way to repurpose this notion of a smart city, in a way that's more collaborative. I know there are some places that are really trying to think about this. I was at the Barcelona smart city summit, and I'm really fascinated by the way Barcelona has pursued the notion of the smart city: they're using a crowdsourcing platform to get citizen input in real time on how to shape these technologies, with a lot of transparency involved. So we do have some early paradigms of what it could look like, and I think it's worth exploring.

And does ownership of data matter? I mean, does it matter who owns it, particularly? I suppose now that we have machine learning, there's a sort of future-oriented or speculative quality to the data, in a way, where we gather, and by "we" I mean large corporations and states, not us, we gather large amounts of data in the anticipation that some useful or valuable kinds of patterns can be discovered in it. In this context of the kinds of machine learning capacities they now have, doesn't the ownership of data matter? And how do you think about that in this reimagining?

Yeah, I mean, I guess, so the...
