Scientists at United Arab Emirates University (UAEU) are creating a hi-tech robot designed to mimic human movements – resembling something from the epic sci-fi movie Avatar.
The human-like robot – which could serve as a bomb-disposal tool, among other potential uses – is designed to give its operator the same control and sensations as if they were using their own limbs, letting them feel whatever it touches.
The first stage of the project was designed, developed and built by Dr Fady Alnajjar, assistant professor in the Department of Information Technology at UAEU, and his team of specialists, with the United Arab Emirates Air Force providing Dh55,000 in funding for the initial stages of the self-controlled robotic creation.
In the movie Avatar, remotely-controlled human bodies are used to interact with the inhabitants of a distant moon. For all the futurism in the film, the basic premise of this form of robot control is closer than you might think. And as Dr Alnajjar explains, the project he is leading could bring a new dimension to robotics, because conventional remote-controlled robots are limited in what they can convey back to their operators.
It is intended to give its user the ability to feel they are touching or holding an object, without actually being there. “The only way to sense a situation at present via a robot is through vision – there is no sensory or tactile feedback from the environment,” he said. “Also, the user will only be controlling a robot’s hand using a remote-controlled tool, but that is not like using an arm – it doesn’t allow the same level of freedom, such as how far you need to open a hand and hold a grasp. When we control our own arm, we don’t have to think of this type of thing; we don’t have to consider how much freedom we need or how far to open our fingers to hold a cup, or open and close a door. Our brains automatically control this movement.”
The UAEU project, as Dr Alnajjar explained, aims to let the person in control simply act naturally in order to operate the robot. “We are building sensors which the operator can wear, and then all they need to do is move as if they actually are the robot,” he outlined.
“They don’t need to think about how they should lean to use the controller – they just move normally, and the robot mimics that motion. A human will have more control if he/she feels like they are actually using their own arm,” he said.
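The control idea Dr Alnajjar describes – the operator moves naturally, and the robot mirrors that motion while force is fed back to the wearer – can be sketched in code. This is purely an illustration of the general teleoperation concept, not the UAEU team's actual software; the joint names, limits, and feedback scaling are all assumptions invented for the example.

```python
from dataclasses import dataclass

@dataclass
class JointReading:
    """One angle measured by a wearable sensor on the operator's arm."""
    name: str
    angle_deg: float

def clamp(value, lo, hi):
    """Keep a commanded angle inside a robot joint's mechanical limits."""
    return max(lo, min(hi, value))

# Hypothetical per-joint limits for the robot arm, in degrees.
JOINT_LIMITS = {
    "shoulder": (-90.0, 160.0),
    "elbow": (0.0, 150.0),
    "wrist": (-80.0, 80.0),
}

def mirror_pose(operator_pose):
    """Map operator joint angles one-to-one onto robot joint commands,
    clamped to the robot's limits - the 'move as if you are the robot'
    idea: no separate controller to think about."""
    commands = {}
    for reading in operator_pose:
        lo, hi = JOINT_LIMITS[reading.name]
        commands[reading.name] = clamp(reading.angle_deg, lo, hi)
    return commands

def feedback_force(contact_force_newtons, gain=0.5, max_force=5.0):
    """Scale a force measured at the robot's hand into a smaller, capped
    force replayed on the operator's glove, giving the tactile feedback
    that ordinary remote-controlled robots lack."""
    return min(contact_force_newtons * gain, max_force)
```

In this sketch the operator's pose is simply copied to the robot each control cycle, so bending an elbow 40 degrees bends the robot's elbow 40 degrees, while `feedback_force` caps what is replayed to the wearer so a heavy impact on the robot cannot hurt the operator.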
Reach Out: www.uaeu.ac.ae/en