EBRAINS robot simulation has learned complex hand movements
It is easy to take for granted how complex our hand movements are, but the skilful manipulation of objects is one of the abilities that make humans unique. Even hand movements that may seem simple, like holding a cup of coffee, are actually complex and engage a large-scale brain network encompassing sensory, association and motor regions.
Due to this complexity, it is difficult to imitate the human hand, or even to simulate it on computers. But a team of researchers in the Human Brain Project (HBP) took up the challenge: using the EBRAINS research infrastructure, they are simulating a robotic hand with human-like abilities, which is now able to manipulate objects. The HBP team develops brain models that control a human-like robotic hand built by the Shadow Robot Company. Their work combines deep learning, robotics and neuroscientific knowledge, and aims to shed light on how the brain coordinates complex hand movements.
To achieve dexterity comparable to that of the human hand, the HBP scientists use “Recurrent Convolutional Neural Networks”, or RCNNs, to which they add biological constraints so that the models more closely mimic the biological neural networks of the human brain.
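The article does not detail these constraints, but the general shape of such a building block can be sketched. The following is a minimal, hypothetical example in PyTorch: the RecurrentConvCell class, the leaky integration and the rectification standing in for non-negative firing rates are illustrative assumptions, not the HBP team's actual model.

```python
import torch
import torch.nn as nn

class RecurrentConvCell(nn.Module):
    """One recurrent convolutional layer with a simple, illustrative
    'biological' constraint (leaky integration plus rectified, non-negative
    activity). Purely a sketch, not the published HBP model."""

    def __init__(self, in_channels: int, hidden_channels: int):
        super().__init__()
        # Feedforward drive from the input image or the layer below.
        self.input_conv = nn.Conv2d(in_channels, hidden_channels, 3, padding=1)
        # Recurrent (lateral) drive from the layer's own previous state.
        self.recurrent_conv = nn.Conv2d(hidden_channels, hidden_channels, 3, padding=1)

    def forward(self, x: torch.Tensor, h: torch.Tensor, decay: float = 0.9) -> torch.Tensor:
        drive = self.input_conv(x) + self.recurrent_conv(h)
        # Leaky integration of the drive; ReLU keeps the "firing rates" non-negative.
        return torch.relu(decay * h + (1.0 - decay) * drive)
```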
VIDEO - Human Dexterity: A Proof of Concept
The researchers identified the network of brain regions involved in coordinating complex hand movements, along with their specialised functions and connections. This information was then used to build the architecture of the RCNN, with layers reflecting those of the frontoparietal network and the visual system in the brain.
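A rough sketch of how such a layered, brain-inspired architecture could be composed is shown below. The module names (“visual”, “frontoparietal”, “motor”), the layer sizes and the default joint count are illustrative assumptions made for this sketch; the team's published model may be laid out differently.

```python
import torch
import torch.nn as nn

class HandControlNetwork(nn.Module):
    """Illustrative layered architecture: a convolutional 'visual' stream
    feeding a recurrent layer that stands in for frontoparietal areas,
    ending in one command per joint of the robotic hand."""

    def __init__(self, num_joints: int = 24, hidden_size: int = 256):
        super().__init__()
        # "Visual system": feature extraction from camera images of hand and object.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # "Frontoparietal network": recurrent integration of vision and
        # proprioception (current joint angles) over time.
        self.frontoparietal = nn.GRUCell(64 + num_joints, hidden_size)
        # "Motor output": one command per joint.
        self.motor = nn.Linear(hidden_size, num_joints)

    def forward(self, image, joint_angles, h):
        features = torch.cat([self.visual(image), joint_angles], dim=-1)
        h = self.frontoparietal(features, h)
        return torch.tanh(self.motor(h)), h
```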
This artificial neural network was then trained with reinforcement learning on supercomputers at the Swiss National Supercomputing Centre that are part of the EBRAINS infrastructure. The training requires thousands of iterations of self-generated experience: the network repeatedly attempts the task in simulation and is updated according to how well it performs. Through this process, each joint of the robotic hand “learns” its movements.
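As a rough, hypothetical sketch of what such a training loop looks like: the policy generates its own experience by interacting with a simulated hand and is updated episode by episode. The HandManipulationEnv environment, the Gaussian exploration noise and the simple REINFORCE-style update below are placeholder assumptions; the article does not state which simulator or reinforcement-learning algorithm the team used.

```python
import torch
from torch.distributions import Normal

env = HandManipulationEnv()        # hypothetical simulated-hand environment
policy = HandControlNetwork()      # the brain-inspired network sketched above
optimizer = torch.optim.Adam(policy.parameters(), lr=3e-4)

for episode in range(10_000):                        # "thousands of iterations"
    image, joint_angles = env.reset()
    h = torch.zeros(1, 256)                          # recurrent (frontoparietal) state
    log_probs, rewards, done = [], [], False
    while not done:
        mean, h = policy(image, joint_angles, h)     # one command per joint
        dist = Normal(mean, 0.1)                     # exploration noise (assumed)
        action = dist.sample()
        (image, joint_angles), reward, done = env.step(action)
        log_probs.append(dist.log_prob(action).sum())
        rewards.append(reward)
    # Simple REINFORCE update on the episode return; a real setup would
    # more likely use a modern algorithm such as PPO (assumption).
    loss = -sum(rewards) * torch.stack(log_probs).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```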
The team has built a flexible framework for training brain-inspired models and has demonstrated that models trained with it can learn the complex task of in-hand object manipulation.
The researchers are analysing the trained model and investigating how it solves the task of in-hand object manipulation, aiming to better understand how the brain achieves the same task. “After all, we are mainly interested in understanding the brain and using models as proxies of the brain”, says Mario Senden, researcher at the Department of Cognitive Neuroscience at the University of Maastricht.
LinkedIn Live interview
A live interview on the EBRAINS robotic hand took place on Wednesday 6 July at 12:00 CET. France Nivelle, Chief Communications and Content Officer at EBRAINS, interviewed Mario Senden, Assistant Professor at the Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, University of Maastricht.