Sensory technology allows users to direct robots through hand gestures and brain waves.
We have seen many innovative ways in which robots can be controlled. Last year, for example, researchers at MIT developed a VR system that lets workers operate machines remotely from anywhere in the world.
Similarly, just two months ago the TrashBot was unveiled in Chicago: a garbage-collecting robot that online gamers around the world can steer to help clean up the Chicago River. Now MIT has taken last year's work even further, making it easier and more intuitive to control robots from a distance.
The system was designed by researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). It works by monitoring the hand gestures and brain waves of whoever is controlling the robot. The user wears an electromyography (EMG) sensor on their forearm, which tracks muscle activity, so the wearer need only make small gestures to correct the robot's movements. At the same time, the system uses electroencephalography (EEG) to monitor the brain's activity, detecting in real time any errors the wearer notices the robot making.
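To make the idea concrete, here is a minimal sketch of how two such signals might be combined into a correction for the robot. The function names, thresholds, and signal encodings are illustrative assumptions for this article, not MIT's actual implementation: a real system would classify raw EEG and EMG streams rather than single numbers.

```python
# Hypothetical sketch: fusing an EEG error signal with an EMG gesture.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Correction:
    halt: bool          # EEG detected an error-related signal
    direction: str      # EMG gesture: "left", "right", or "none"

def classify_emg(forearm_activation: float) -> str:
    """Map a (simulated) signed muscle-activation level to a gesture."""
    if forearm_activation > 0.6:
        return "right"
    if forearm_activation < -0.6:
        return "left"
    return "none"

def fuse(errp_score: float, forearm_activation: float,
         errp_threshold: float = 0.8) -> Correction:
    """Combine an EEG error score with an EMG gesture reading."""
    return Correction(
        halt=errp_score > errp_threshold,
        direction=classify_emg(forearm_activation),
    )

# Example: a strong error signal plus a rightward gesture
print(fuse(0.9, 0.7))   # Correction(halt=True, direction='right')
```

The point of the fusion step is the division of labour the researchers describe: the EEG channel answers the coarse question "did something go wrong?", while the EMG channel supplies the spatial nuance of how to fix it.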
In the words of the project supervisor, Daniela Rus: “This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.” The system is unique in that it requires no user training; instead, the robot adapts to the user's instructions. This technology could revolutionize how humans manage teams of robots in the future.
This breakthrough could have a significant impact on the role of robots in society. Will there come a day when robots are controlled in this way in everyday life? How else could they become easier to control?