Joseph DePaolo Tonio, Neil Raden discussed on DM Radio

Automatic Transcript

All about ethics and artificial intelligence with my good friends, colleagues, and fellow analysts, Joseph DePaolo Tonio and Neil Raden. And Neil, I'll bring you back in. Joseph got my mind rolling there in that last segment, talking about this ethical framework and how it could be very useful in helping us in our day-to-day human behavior, right? That's the whole thing about ethics: does an ethical standard actually embed itself in your behavior, in your decision-making process? And then I had my bit of an epiphany there: that, yeah, actually, we could see this articulated through our technologies. Through Microsoft Teams, for example, which tells you when you're not getting enough alone time, not enough focus time; or on your Apple iPhone, when it tells you that, you know, you should sit up or walk around, or things of this nature. Which is, I mean, it's kind of weird, but it's kind of cool. I don't know, what are your thoughts on all that, Neil?

I don't completely agree. I think that it's not like self-improvement or yoga or eating a paleo diet. I think there's an essential tension between the economic aspects of AI
and the ethical aspects. Today, the people who are producing it are producing it to make money, and then there are also the unintended consequences. A good example: there's a lot right now about the ability of AI to do a better job finding cancerous tumors in breast cancer than the best radiologists. Well, you might think that's a good idea, but the problem is that what happens after that is not necessarily such a good idea. People end up getting a lot of unnecessary treatment. The organizations inflate their ten-year survival rates because the early detection comes years before, but it turns out the mortality isn't actually any different. So was it really a good thing? I think those are the kinds of questions that need to be answered, but the push of technology and the push of science isn't really paying attention to it.

I also think there's this agonizing desire to push this thing forward. One thing that I'm doing in concert with somebody else: we've developed an actuarial AI certification program. We actually go out to insurance companies and train them for a few days about all the bad things they can do, and hopefully guide them toward the good things they can do. But on a daily basis they're under pressure to do things that have to do with making money for the organization, so it's not a perfect situation. The other thing is that, you know, the EU
is way ahead on this, putting out ideas about it. Also, the President's office put out something, which was a joke, by the way, if that wasn't already said before. And there are the principles of the OECD, which represents thirty-seven countries. But now you have some states, like New York and California and so forth, that are going off on their own. The one that really troubles me is that Washington state just signed into law a bill about facial recognition, and if you read the document, it looks really good. The problem with it is that it was written by Microsoft. Now Microsoft is out lobbying all the other states to adopt it, and when you get down to the nitty-gritty, there are some things in it that are not okay. I think there's this clause that said something about how you can do it without a warrant if so-called "exigent circumstances" exist. What the hell does that mean, right? So this is the trouble, this is a struggle. It's going to be a struggle all the way, and I don't think that frameworks developed out of the blue are really going to be that valuable.

Yeah, you know, Neil, you speak to a really good point, and this is kind of the issue I'm trying to wrangle with myself here as we think through all this. And maybe, Joseph, I'll bring you back in here. Neil is basically alluding to what I see happening as well, but it goes a bit deeper, right? And it goes deeper because we have these companies, these software companies, that are so incredibly powerful: Google; YouTube, of course, part of Google; Facebook; LinkedIn, which is part of Microsoft now, so Microsoft has a huge influence on organizations. And you know, I'll get philosophical here with you, because I was a philosophy double major myself way back when. When Obamacare, the Affordable Care Act, was being debated, I got on a plane that was flying somewhere, and I was reading Lao Tzu's Understanding the Mysteries, which is fascinating stuff. I'm a big fan of Eastern philosophy and Eastern thought,
and I was thinking about the Affordable Care Act, and I opened up the book, and the first thing I read was: "When the laws are complex, the bandits will abound." Right? This is three thousand years ago. This guy wrote that stuff three thousand years ago, so a thousand years before Christ, and of course in the Orient, this guy is writing, "When the laws are complex, the bandits will abound." And I remember I said that one time to my partner, Dr. Robin Bloor, who always has such interesting things to say. His comment was, "Do you know how many laws Napoleon had?" I'm like, no. He said, "Seven." Of course, the Napoleonic Code, which is still alive in some form today, at least in spirit, in Louisiana. But the idea was that what Napoleon saw, as I understand it, was that because it had become so complex, you had to start over: "I need a new code for governance." And I almost feel like we're at that point right now. You see it in all this conversation about ethics; it's everywhere, and it's a very hot topic today, not just for artificial intelligence but for data, for commerce, for international relations, et cetera. So I think we are at this really important time. If you look at the history of mankind over the past even three to five thousand years or so, this is a very strange time. We're clearly at an inflection point, because of the internet, because of technologies like artificial intelligence, et cetera. So I'll just kind of throw it over to you, a big fat softball coming your way: what do you think about all that? And do we need to, frankly, look very hard and meaningfully at the nature of laws and regulations in our country and around the world right now?

Eric, you really went kind of wild there with that conversation. I know, I know. I went quite far outside the road here, but I'm testing the boundaries. Well, that's a good boundary. Yeah, I mean, do we go back to the Code of Hammurabi, as opposed to only the Napoleonic Code, and look at the evolution from those early days to when Napoleon said things had gotten too
complex, let's simplify it to seven basics? And somewhere in the middle there were the Ten Commandments. How do we look at this? I think that reset, and you know, you mentioned at the beginning of the program COVID-19 and what the pandemic is doing to things right now: will that give us a real hard reset, or is it going to lead to even more complexity in the regulatory environment?

One thing: we all surely have this tendency to look very much within our own culture and not really look at cultures beyond our own. There are one hundred and ninety-some countries in the world today; there are thousands of different cultures. When the Oxford morality project derived their seven basic rules of morality that they felt were common, they looked at six hundred cultures. Well, that's only six hundred. But also, one of the interesting things from that, which talks directly to what you're saying: the fifth of their seven is "defer to authority." And when I heard that, I thought, no! In my culture it's "question authority." I'm an old hippie; I mean, one of my favorite courses in college was Anarchy 101, right? It was odd to see "defer to authority" as a universal moral code. But if you look at it from the standpoint of "don't run a red light because you're going to get into an accident," that's deferring to authority. If you look at it from the standpoint of trying to follow this myriad of complex laws that are on the books, you're going to run into things like: in Wilkes-Barre, Pennsylvania, you're still not allowed to graze your goats on the town square, right? And in Pennsylvania overall, you still have to have a horse walk in front of your car with a green flag, and behind your car with a red flag, to let other carriages know your horseless carriage is coming. That's still on the books; it never went away. Wow. So this layer of complexity is certainly happening, and how do we combat this? How do we look at things, as Neil was talking about, like this new Washington regulation, written by Microsoft?
And one thing I look at, especially in that area of facial recognition: have you ever read David Brin's The Transparent Society? Now, that's a really interesting take, and I think it's over twenty years old; it was written in the nineteen-nineties, I'd have to double-check that. It's about how the only way to avoid trading our privacy for our freedom is by opening up the surveillance state, so that anyone can see into it, but anyone can draw from it as well. One of the recurring themes in some of his future-society books is older people who use the cameras on their cell phones to protect themselves from muggers, or from nasty teenagers, kind of the Clockwork Orange types, who might be attacking them. They point their cell phones and say, "I'm streaming you to the nearest police station right now." So how do we balance the privacy and the transparency? How do we do it conveniently? How do we make sure we're secure, and how do we build those regulations into our frameworks?

Neil, you mentioned that you felt building frameworks out of the blue wasn't going to help, but I don't know any other way of actually embedding these things into our AI. One of the things I look at is causal inference, and to me, the ability to bring causal inference into machine learning algorithms through directed acyclic graphs, such that they understand causation, may make them explainable as they move toward cognition. It means that we can go from contextualization to cognition through causal inference, if we understand, and if the AI, if we ever get to that point, can understand, both the cause and the consequence of an effect, and how we can understand the cause as a matter of fact. So we get these cause-consequence diagrams that we used to use in systems engineering, with Boolean logic. How do we build that in, through directed acyclic graphs and graph algorithms, using the knowledge graphs that Neil mentioned, into...
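[Editor's note: the cause-consequence diagrams mentioned here, a directed acyclic graph whose nodes carry Boolean logic, evaluated in causal order, can be sketched in a few lines of Python. The node names and rules below are a hypothetical toy fault-propagation example, not anything discussed on air; only the DAG-plus-Boolean-logic idea comes from the conversation.]

```python
# A minimal sketch of a cause-consequence diagram: a directed acyclic
# graph where each node is either a root cause (set as evidence) or a
# Boolean function of its parents, evaluated in topological order.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each entry: node -> (list of parent nodes, Boolean rule over parents).
# Root causes have no parents and no rule. All names are hypothetical.
DAG = {
    "pump_failure": ([], None),
    "valve_stuck": ([], None),
    "low_pressure": (["pump_failure", "valve_stuck"],
                     lambda pump, valve: pump or valve),
    "alarm": (["low_pressure"], lambda low: low),
    "shutdown": (["low_pressure", "alarm"],
                 lambda low, alarm: low and alarm),
}

def evaluate(dag, evidence):
    """Propagate Boolean causes through the DAG, parents before children."""
    order = TopologicalSorter(
        {node: set(parents) for node, (parents, _) in dag.items()}
    )
    values = {}
    for node in order.static_order():
        parents, rule = dag[node]
        if not parents:
            # Root cause: taken from evidence, default False.
            values[node] = evidence.get(node, False)
        else:
            values[node] = rule(*(values[parent] for parent in parents))
    return values

# A single root cause propagates all the way to the consequence,
# and the path through the DAG is the explanation.
result = evaluate(DAG, {"pump_failure": True})
```

Because every node's value is derived explicitly from its parents, the chain of causation is inspectable, which is the explainability property the speaker is after.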

Coming up next