FOX, Twitter discussed on Untangle
Something and think, oh, I just heard it or smelled it or something. Because when you look in the brain, it's all the same stuff: it's all spikes. And as we know, if we were to look through a little hole in the skull at the piece of cortex underneath it that's popping off and ask, what part of cortex is that? There's actually no way you'd be able to tell, because it looks exactly the same as visual cortex or auditory or smell or taste or anything. What you're looking at is a bunch of cells popping off. So anyway, this got me very interested in the question of how the information is getting there in the first place. When you look at the eyes, these are very specialized little features that pass information in a certain way: they capture photons. When you look at ears, what you're doing is capturing air compression waves, turning those into spikes, and sending them off to the brain. When you look at the nose, you're capturing molecules of different shapes and then sending spikes off to the brain, and so on. So anyway, I started wondering: could you actually send a whole different stream to the brain? Could you capture a different kind of information and send that off to the brain? The way that the brain figures out how to deal with any sort of sensory information is by comparing it with other sorts of sensory information and by correlating it with your motor output and things like that. So all of this is to say that I started wondering if we could feed a new sense into humans by using something like patterns of vibration on the skin, because your skin is this wonderful computational material that mostly goes wasted in modern life. We're not using our skin for much of anything.

Yeah, you've got a ton of touch sensors in there.

Exactly. The skin is actually the largest organ of your body. The joke that I use in the lab is that we don't call this the waist for nothing.

Yeah, exactly.
It's just very rich material, and I thought, what if we could put in spatial patterns of vibration? By which I don't just mean through time, I don't mean buzz, buzz, buzz, but, you know, across big swaths of skin. And could you actually correlate that with something? So let me give an example of what we're doing: we're out to solve the problem of hearing loss with vibration on the skin. What we do is capture sound into these vibratory patterns, either on the torso or on the wrist. We now use a wristband. And deaf people can come to understand the world that way. And it's actually not hard. I mean, the surprising part is that on day zero, when they very first put on the wristband, we play them a sound through the wristband, so they get buzzed. And then we say, okay, was that a dog barking or a baby crying? And they have to guess which one it was. And people perform at up to 95% on day zero, before they even have any training with it. So whether it's a car passing or a smoke detector or a microwave beeping or a cell phone ringing or whatever it is, it's just sort of surprisingly intuitive, even if you're getting it on your skin, because it's getting up to the brain and the brain can interpret it.

Totally amazing. So just to recap, there's a lot of information here. We can take in information, and our brain's job is to interpret it. And we know our basic senses are sight, smell, hearing, but you're suggesting that we can actually add additional senses. And you actually have a little device, so you can get little buzzes, little tactile sensations, right on your wrist.

Yeah, because from the brain's point of view, it's just information that's getting to the brain. In the case of the wristband, it's just climbing up your spinal cord and up to your brain. And your brain says, oh, there's this stuff. Here's the important part about correlating it with other senses: let's say I'm deaf.
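The sound-to-touch mapping described above can be sketched in a few lines. This is only an illustration of the general idea, not the lab's actual algorithm: the frame size, motor count, and energy-per-band mapping here are all assumptions.

```python
import numpy as np

def sound_to_motors(frame, n_motors=4):
    """Map one frame of audio samples to vibration intensities.

    Sketch only: split the magnitude spectrum into n_motors frequency
    bands (low to high) and let each motor's drive level track the
    relative energy in its band.
    """
    # Window the frame to reduce spectral leakage, then take the FFT.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    bands = np.array_split(spectrum, n_motors)
    energy = np.array([b.sum() for b in bands])
    peak = energy.max()
    if peak == 0:
        return np.zeros(n_motors)
    return energy / peak  # 0..1 drive level per motor

# A pure 440 Hz tone should light up mostly the lowest-frequency motor.
t = np.arange(512) / 16000.0
levels = sound_to_motors(np.sin(2 * np.pi * 440 * t))
```

A real device would run this continuously over a microphone stream and likely do far more sophisticated compression and feature extraction; the point is just that sound becomes a spatial pattern of intensities on the skin.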
I can watch your lips move and feel it on the wristband at the same time, and that's how I correlate these things. Or I can vocalize something. I can say "the quick brown fox," and as I'm saying it, I'm feeling it. And that's how my brain figures out, oh, okay, got it, this is how I interpret this information.

So our brains are learning machines, and we're simply learning a new pattern pretty effectively.

Exactly right. And the way that I've come to think about the brain in the last seven years or so is as a general-purpose compute device: whatever information you feed into it, it just says, okay, that's cool. How is it correlated with these other things? And what is the meaning of that information?

Wow, so what other stuff can we feed into the brain and derive meaning from?

Yeah, so we're working on about 15 different projects here, some of which I can talk about and some I can't. But essentially my interest is in sensory addition. So it's not just for a person who's deaf or blind, or who has a sensory disorder where they're not feeling their limb, things like that. We're doing all those kinds of clinical projects, but what other things can we do to feed in brand new sensory information? Whether that's stock market data or Twitter information or flying a drone, things like that, how can we take in new streams of info and come to have a direct perceptual experience of them?

And how successful is it?

We've done things with drone pilots where they're feeling the pitch, yaw, roll, orientation, and heading of their drone as they're flying it. They're feeling it on their skin. And they can come to fly in the fog or in the dark, and very quickly, sort of surprisingly quickly, they just get it. They get what the meaning of it is. And maybe it's not surprising. I don't know, when you look at driving a new thing, like a skateboard, a sailboat, or some new thing that you haven't done before, let's say.

Learning how to snowboard.
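The drone example can be sketched the same way. This is a made-up illustration, assuming a four-motor band where pitch drives the top/bottom motors and roll drives the side motors, with intensity scaling by how far the drone is from level; the real system's motor layout and mapping aren't described here.

```python
def telemetry_to_buzz(pitch, roll, max_angle=45.0):
    """Map drone pitch and roll (in degrees) to four wrist motors.

    Hypothetical layout: nose up buzzes the top motor, nose down the
    bottom, banking left/right buzzes the side motors. Intensity is
    the angle as a fraction of max_angle, clamped to 1.0.
    """
    def scale(angle):
        return min(abs(angle) / max_angle, 1.0)

    return {
        "top":    scale(pitch) if pitch > 0 else 0.0,  # nose up
        "bottom": scale(pitch) if pitch < 0 else 0.0,  # nose down
        "left":   scale(roll)  if roll  < 0 else 0.0,  # banking left
        "right":  scale(roll)  if roll  > 0 else 0.0,  # banking right
    }

# Nose up 22.5 degrees, banked hard left:
buzz = telemetry_to_buzz(pitch=22.5, roll=-45.0)
```

The pilot never reads a number; the attitude of the aircraft simply becomes a felt pattern, which is what lets them fly in fog or darkness.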
Yeah, learning how to snowboard. It just doesn't take that long for you to figure out, oh, okay, I get it: instead of using my feet to walk and so on, now I'll just balance, and I'll get high velocity by leaning this way, and I go downhill. And yeah, the part that I keep saying is amazing is just the flexibility of the brain to say, oh, okay, I've got a new body plan now. I've got a new sensory input now. That's cool. I'll just figure it out.

So this really opens us up to the notion of what we can become. All of the things that we can be and we can learn, simultaneously.

Yeah, exactly. And I have a few colleagues who are starting companies to do things with, for example, brain implants, where you stick electrodes in the brain and then maybe feed in new senses that way. But the truth is, I don't think that's the way this is going to go, because neurosurgeons don't want to do these surgeries, since there's always a risk of infection and death on the table, and consumers who want to interface with other things, I don't think they want to go in for open-head surgery. So to my mind, building a very simple, cheap device that can get new streams of information into the brain is probably the way to go. And I think what we're going to see in the next five years is the creation of new senses. And part of my plan: I mentioned that we're doing a whole bunch of projects in my lab here, but there are 2,000 projects that I haven't thought of and won't think of. And so by releasing this with an open API, where people can feed in whatever kind of data streams they want, it becomes a community science project to figure out what kinds of new sensations we can experience. Okay, well, anybody out there who wants to know when their baby is asleep: take data from your baby monitor, put it into your wristband, and then feel every toss and turn of your baby, or every beat of its heart if you have an EKG on your little baby.
Or any other piece of data that seems relevant to you: the brain can now transform it and learn it.
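A community adapter for the open-API idea might look something like the sketch below. Everything here is hypothetical, since the actual API isn't specified in this conversation: the function name, the per-motor "frame" format, and the heart-rate range are all invented for illustration.

```python
def stream_to_frames(values, lo, hi, n_motors=4):
    """Turn any numeric stream into per-motor vibration frames.

    Hypothetical adapter: each reading is normalized into [0, 1] and
    rendered as a 'bar' across the motors, so a rising value sweeps
    the buzz from one end of the band to the other.
    """
    frames = []
    for v in values:
        # Normalize the reading into [0, 1] over the expected range.
        level = max(0.0, min(1.0, (v - lo) / (hi - lo)))
        # Fill motors left to right in proportion to the level.
        filled = level * n_motors
        frame = [max(0.0, min(1.0, filled - i)) for i in range(n_motors)]
        frames.append(frame)
    return frames

# Infant heart rate, taking 100-180 bpm as the expected range:
# 140 bpm buzzes half the band, 180 bpm buzzes all of it.
frames = stream_to_frames([100, 140, 180], lo=100, hi=180)
```

The same adapter would work unchanged for stock prices, Twitter activity, or any other scalar stream, which is exactly the "2,000 projects I haven't thought of" point: the hardware and the brain stay the same, and only the data source changes.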