Kurzweil, Bill Joy, Nick Bostrom discussed on TechStuff

TechStuff | Automatic Transcript

And so complex that they will effectively emerge as their own species. In 1993 he put pen to paper on a piece titled "The Age of Robots," and this is a quote from that piece: "Computerless industrial machinery exhibits the behavioral flexibility of single-celled organisms. Today's best computer-controlled robots are like these simpler invertebrates. A thousandfold increase in computer power in this decade should make possible machines with reptile-like sensory and motor competence. Properly configured, such robots could do in the physical world what personal computers now do in the world of data: act on our behalf as literal-minded slaves. Growing computer power over the next half century will allow this reptile stage to be surpassed, in stages producing robots that learn like mammals, model their world like primates, and eventually reason like humans. Depending on your point of view, humanity will then have produced a worthy successor, or transcended inherited limitations and transformed itself into something quite new. No longer limited by the slow pace of human learning and even slower biological evolution, intelligent machinery will conduct its affairs on an ever faster, ever smaller scale, until coarse physical nature has been converted to fine-grained purposeful thought."

Now, his ideas are predicated upon the assertion that consciousness, a quality that's devilishly difficult to define (in fact, I would argue it's just as difficult to define as the term intelligence), arises from the material. That is, the mind is totally the product of our nervous system, or, if you want to be a little more generous, the combination of our nervous system and our interactions with our environment. So in other words, consciousness emerges from a system if that system meets the physical criteria. If that's true, and I happen to believe that it is, then it stands to reason that if you have a sufficiently complex system with powerful enough machines, we should be able to create an artificial entity that possesses consciousness. If, however, consciousness arises from some other scientifically undiscovered or even undiscoverable quality, then it wouldn't matter how complicated we build our toys; they would never become conscious. So in other words, if consciousness were to emerge from some other thing that science cannot address, like a soul, for example, then there's no way we could create a conscious artificial being, because we can't create a soul, if that is in fact how it works. I personally feel that that's not the case, that our consciousness does arise from the material, that it does come from the complexity and the electrochemical processes of our nervous system. The question I have is whether or not we will ever be able to replicate that in an artificial system. I'm not saying it would be impossible, just wondering if we will ever figure it out. It remains an open question.

Nick Bostrom, who served as the director of the Future of Humanity Institute, has written extensively about transhumanism. I talked about that a second ago, that idea that we transcend being just humans through some process. Whether that means a computer-augmented person or a biologically augmented person isn't really important, at least from this perspective. It's very important from an ethical perspective.
But he's using transhuman to describe someone who has moved away from what we would define as being a human being today. And like Kurzweil, he has hypothesized that the singularity will bring along with it some means of extending our life spans indefinitely, but he feels that some of the more aggressive predictions are a little too optimistic. He has said that he felt there is a less than fifty percent chance that we'll have developed any sort of superhuman intelligence by the year 2033. He thinks it's going to happen, but it might take a bit longer than that. Some of the people who believe, or have formerly believed, the singularity to be around the corner aren't convinced it's necessarily going to be good for us. Venture capitalist Bill Joy, who co-founded Sun Microsystems, has expressed concerns about it, and it wouldn't necessarily take a superhuman AI to do damage to us. Joy has pointed out that technology tends to advance our capabilities in all sorts of areas, including destructive ones.
