But what is it that drives us to make machines that look like us?
It certainly isn’t easy.
One of the hardest parts is giving humanoid robots something that we humans take for granted:
the ability to walk on two legs.
Robots with four legs are more stable,
because more legs mean more points of contact
with the ground, so less engineering is needed to make a workable robot.
But two legs provide a sense of familiarity that helps humans empathise with the machines.
Teaching robots to walk used to involve a lot of trial and error.
Robots like ASIMO took several years of research and engineering to stand and walk around.
With the advent of machine learning, freestanding robots are learning to take their first steps
by training on thousands of hours of simulations,
even before they have legs.
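The idea of learning to walk in simulation can be illustrated with a toy search loop: a simulated body, a scoring function for how long it stays upright, and a training loop that keeps whatever parameters survive longest. Everything here, the dynamics, the gain names `kp` and `kd`, the hill-climbing strategy, is an invented sketch, not the method used by any real robot.

```python
import random

def simulate(policy, steps=200):
    """Toy balance task: keep a 1-D 'lean angle' near zero.

    Returns how many steps the walker stays upright (|angle| < 1.0).
    The policy is a pair (kp, kd) of gains applied to the angle and
    the angular velocity -- hypothetical names for this sketch.
    """
    angle, velocity = 0.05, 0.0
    kp, kd = policy
    for t in range(steps):
        torque = -kp * angle - kd * velocity          # corrective push
        velocity += 0.1 * angle + 0.05 * torque       # gravity pulls it over
        angle += 0.1 * velocity
        if abs(angle) >= 1.0:
            return t                                  # fell over at step t
    return steps                                      # survived the whole run

def train(episodes=500, seed=0):
    """Random hill-climbing over the two gains: perturb the best
    policy so far and keep the candidate only if it survives longer
    in simulation."""
    rng = random.Random(seed)
    best = (0.0, 0.0)
    best_score = simulate(best)
    for _ in range(episodes):
        candidate = (best[0] + rng.gauss(0, 0.5),
                     best[1] + rng.gauss(0, 0.5))
        score = simulate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score
```

With no controller at all, the simulated walker topples within a few dozen steps; the search loop only ever replaces its best policy with one that survives longer, so the learned gains can be handed to hardware only after they already work in simulation, which is the point the narration is making.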
So we can get them to stand up.
But why on Earth do we need upright humanoid robots?
What can they do that other robots can’t?
It turns out, a whole lot.
Researchers at the University of Tokyo are trying to build a complete human analogue,
with a metal skeleton and hydraulic muscles.
Their current models, Kenshiro and Kengoro,
can perform many of the same movements as humans,
including push-ups and sit-ups.
Aside from making great bodybuilding coaches, the researchers say these robots give us insight
into the intricacies of human physiology.
But there are some downsides.
One of those is psychological:
when robots start looking too “real,” they unsettle us,
an effect often called the uncanny valley, and we trust them less.
That hasn’t stopped scientists from tackling the daunting task
of making robots look and act more human.
In 2004, European researchers began work on the iCub, a robot designed to mimic a child.
iCub is the hardware body for the artificial intelligence software that powers it.
The robot senses its environment through camera eyes, microphone ears,
touch sensors in its hands, and a “skin” made of printed-circuit capacitors.
The motivation for creating iCub is to give AI not only a voice, but also a body.
iCub has been trained to observe and respond to its surroundings.
By watching humans, the robot has learned to grasp and manipulate objects.
Some of the project researchers have even taught it to identify the objects it picks up.
But one of iCub’s most valuable lessons for the field of robotics is in sociability.
Researchers are training the robot to maintain eye contact with humans,
determine their emotions through facial expressions and gestures,
and respond to nonverbal communication
in order to complete tasks.
This kind of research could mark the beginning of a new era in human-robot interactions.
But what about those science fiction dreams?
One day, we’ll ask the robots what they think.