Software can send commands from a brain to a robot
A team of researchers at KAIST developed software that can read a person’s brainwaves to control a robotic arm, the institute said Wednesday.
Developed by a team led by Professor Jeong Jae-seung at KAIST’s Brain Dynamics Laboratory, the software reads and interprets a person’s brainwaves and commands a robotic arm to execute orders with 90.9 to 92.6 percent accuracy.
According to the team, existing software can only carry out basic commands like "move" and "stop," with movement restricted to up-and-down or side-to-side motion.
With the new software, 24 different types of movement can be achieved, according to Kim Hee-hoon, an assistant professor at the Department of AI Convergence Engineering at Kangnam University and a member of the research team.
The technology is referred to as a brain-machine interface.
“There has been a lot of research on brain-machine interface, but this is the first in which we’ve successfully interpreted brainwaves to command movement in 24 different directions,” said Kim.
The software’s accuracy was made possible by an artificial intelligence (AI) program that interprets the different electrical signals sent by brain cells, according to Kim.
The technology has many potential uses. For instance, people who have lost limbs could use it to control prosthetic arms, and online, people could use it to move avatars around the metaverse.
Kim expects the technology to be available for commercial use within the next three to five years.
BY YOON SO-YEON [firstname.lastname@example.org]