A highlight from Lawfare Archive: Shane Harris on Drones


Welcome to the Lawfare Podcast. I'm Benjamin Wittes. This is the first of what we hope will be a regular series of podcasts covering a range of audio content related to law and security: events, interviews, roundtable discussions, and other types of material we probably haven't thought of yet. Today's subject is a remarkable new article by journalist Shane Harris entitled "Out of the Loop: The Human-Free Future of Unmanned Aerial Vehicles." The essay is part of the Emerging Threats series published by the Hoover Institution's Koret-Taube Task Force on National Security and Law, of which I, Jack Goldsmith, and Ken Anderson are all members. Shane is a senior writer at Washingtonian magazine and the author of the book The Watchers, a lively history of data mining. He stopped by the Brookings Institution recently to chat about his latest article.

I want to start by asking you: how much of this essay is real, and how much is science fiction? How much are you sure is real, and how much do you think might be real? I assume none of it, you're sure, is fictitious.

There was probably less fiction in it than you might think. The fictitious aspect of it is that we do not today have autonomous drones that go out and prowl the skies looking for bad guys to blow up, making the decision on their own to drop a bomb or fire a missile. We don't have that kind of autonomous drone, but we could very easily. Technology is not the obstacle to building a system like that; policy is the obstacle to building a system like that. So, for example, technologically speaking, there's no reason that any of the drones orbiting over Pakistan right now, keeping video surveillance of terrorist training camps, keeping tabs on who goes where, logging who those people are by the way they walk, by their thermal signatures, by eavesdropping, whatever, couldn't be trained, since that drone is already watching that camp, to, at an opportune moment...
Take out the person you say is target number one at that camp. That's just a matter of programming; the drones can do that. Politically and policy-wise speaking, that's a fiction, right? We don't have that right now. The other sort of fictitious aspect of this, to start with, is, you know, I don't think that we're on the verge of autonomous drones becoming self-aware and taking over the world, allying themselves with each other and overthrowing their creators, à la the Terminator movies. But what I found really impressive and compelling was that every time I interviewed somebody about this, somebody who is a technology expert who worked on unmanned systems, and you'd sort of go in that direction when you talked about autonomy, well, how autonomous could they get? Everybody would joke by saying, "Oh yes, then we've created Skynet," which is the reference to the Terminator movies, right? And they're serious about that. I mean, they laugh about it almost as a way of alleviating the tension, as if maybe by just mentioning it laughingly it won't actually happen. They do, I think, look at it as an outcome that could be real. This is science fiction, and it is something that probably most people I talked to would like to avoid, but, you know, it's definitely an outcome that they're aware of. That would be the science fiction element, but it's more real than you might think.

And how close, in your judgment, are we to a world in which we could make self-aware systems, whether armed or not?

Twenty years, I guess, if you wanted to say self-aware to the degree that you could not just pre-program a drone to go out and do a certain mission and then fly back, which would be a fairly simple set of tasks, but to actually, you know, decide when to take off, decide how to go into formation, decide when to strike.

You're talking about striking, but those are all operational decisions. I mean, self-awareness is actually something a little bit deeper than that, right?
Self-awareness is where the drone, or the robot, where the being, has a sense of itself and its own ambitions and desires. And I think the anxiety that underlies your piece is the concern that we're creating these systems that over time may have their own agendas. And I guess that's the sort of futuristic aspect of it, the part that may be fiction. But I'm wondering what the timeframe in which you see that becoming reality looks like.

I would say twenty to thirty years. And I mean, you could say that safely if you just look through the evolution of, you know, processor speed, and how far we've come in the time we've had unmanned systems. If I'm wrong, if it's 2032 and we're five years away, well, let me know; that would be pretty amazingly predictive on my part. I think part of the answer there, though, is that I don't know that we've built anything in a robotic sense that is completely self-aware in the sense of what we think of: being aware of its surroundings, trying to heal itself. But if you look at information networks, cyber networks, which are largely automated right now for defending against things like viruses and intrusion attempts and that kind of thing, the network architecture is, if you like to use that analogy, self-healing. You can set up a network that knows when to shut off a piece of the network to deny a virus or intruder entry through a particular port or system, and that is in effect self-healing, and it can then update itself with antivirus patches and the like. I don't think it's that much of a stretch to think about, in ten years, computer networks that essentially function largely on their own. And in that sense, is the network aware of a threat against it? Are there, you know, cyber traffic jams that it has to route around? And, you know, is it interpreting
other malware as enemies that it needs to defend against, either by sort of putting up electronic barriers or even by sending out malware of its own to counter it? I think in computer network architecture, that seems like something that people are trying to do right now.
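The self-healing behavior Harris describes, shutting off a port to deny an intruder entry and then patching and reopening it, can be sketched in a few lines. The class, method names, and threshold below are purely illustrative inventions for this transcript, not any real intrusion-detection product; real systems are far more sophisticated:

```python
# Minimal sketch of a "self-healing" network defender, assuming a simple
# model: count connection attempts per port, and shut a port off once
# attempts in the current window exceed a threshold.
class SelfHealingNetwork:
    def __init__(self, threshold=100):
        self.threshold = threshold      # attempts tolerated before reacting
        self.attempts = {}              # port -> attempt count
        self.blocked_ports = set()      # ports the network has shut off

    def record_attempt(self, port):
        """Log one connection attempt; block the port if it looks like an attack."""
        if port in self.blocked_ports:
            return "dropped"            # traffic to a blocked port is denied
        self.attempts[port] = self.attempts.get(port, 0) + 1
        if self.attempts[port] > self.threshold:
            self.blocked_ports.add(port)   # deny the intruder entry via this port
            return "blocked"
        return "allowed"

    def apply_patch(self, port):
        """After 'patching' (e.g., updating signatures), reopen the port."""
        self.blocked_ports.discard(port)
        self.attempts[port] = 0
```

The point of the sketch is the loop Harris alludes to: the network observes traffic, reacts on its own by closing off the attacked segment, then heals itself and resumes normal operation, all without a human in the loop.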
