How to control robots with brainwaves and hand gestures

Automatic transcript

Getting robots to do things isn't easy. Usually, scientists have to either explicitly program them or get them to understand how humans communicate via language. But what if we could control robots more intuitively, using just hand gestures and brainwaves? A new system spearheaded by researchers from MIT's Computer Science and Artificial Intelligence Laboratory, otherwise known as CSAIL, aims to do exactly that, allowing users to instantly correct robot mistakes with nothing more than brain signals and the flick of a finger.

Building off the team's past work focused on simple binary-choice activities, the new work expands the scope to multiple-choice tasks, opening up new possibilities for how human workers could manage teams of robots. By monitoring brain activity, the system can detect in real time if the person notices an error as a robot does a task. Using an interface that measures muscle activity, the person can then make hand gestures to scroll through and select the correct option for the robot to execute.

The team demonstrated the system on a task in which a robot moves a power drill to one of three possible targets on the body of a mock plane. Importantly, they showed that the system works on people it has never seen before, meaning that organizations could deploy it in real-world settings without needing to train it on users. This work, combining EEG and EMG feedback, enables natural human-robot interactions for a broader set of applications than we've ever seen before.

To make this possible, the researchers harnessed the power of brain signals called error-related potentials (ErrPs), which they have found naturally occur when people notice mistakes. If there's an ErrP, the system stops so the user can correct it; if not, it simply carries on. What's great about this approach is that there's no need to train users to think in a prescribed way: the machine adapts to you, and not the other way around.

For the project, the team used Baxter, a humanoid robot from Rethink Robotics. With human supervision, the robot went from choosing the correct target 70 percent of the time to more than 97 percent of the time. The team says it could imagine the system one day being useful for the elderly, or for workers with language disorders or even limited mobility. Approaches like this show that it's very much possible to develop robotic systems that are a more natural and intuitive extension of us.
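To make the described workflow concrete, here is a minimal Python sketch of one supervision round, assuming hypothetical errp_detected and next_gesture callbacks that stand in for the EEG and EMG classifiers; these names, signatures, and target labels are illustrative only and do not come from the CSAIL system itself.

    from typing import Callable, List

    def supervise_step(
        robot_choice: str,
        targets: List[str],
        errp_detected: Callable[[], bool],   # stand-in EEG check: did the person notice an error?
        next_gesture: Callable[[], str],     # stand-in EMG decoder: returns "scroll" or "select"
    ) -> str:
        """One round of the workflow above: if a brain signal flags the robot's
        choice as a mistake, let the person scroll through the options with hand
        gestures and pick the right one; otherwise keep the robot's own choice."""
        if not errp_detected():
            return robot_choice              # no ErrP: the robot simply carries on
        idx = targets.index(robot_choice)
        while True:                          # gesture-driven correction loop
            gesture = next_gesture()
            if gesture == "scroll":
                idx = (idx + 1) % len(targets)
            elif gesture == "select":
                return targets[idx]

    # Example run with simulated signals (hypothetical drill targets on the mock plane):
    targets = ["left_wing", "fuselage", "right_wing"]
    gestures = iter(["scroll", "select"])
    corrected = supervise_step("left_wing", targets,
                               errp_detected=lambda: True,
                               next_gesture=lambda: next(gestures))
    print(corrected)  # -> "fuselage"

The point of the sketch is the division of labor the narration describes: the EEG signal only gates whether correction happens at all, while the EMG gestures do the fine-grained selection, so the user never has to be trained to think in a prescribed way.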
