A hyper-realistic Einstein robot at the University of California, San Diego learned to smile and make facial expressions through a process of self-guided learning. The UC San Diego researchers used machine learning so the robot could teach itself realistic facial expressions.
To get the incredibly realistic Einstein robot to make facial expressions, researchers used to have to program each of its 31 artificial muscles individually through trial and error. Now, computer scientists from the Machine Perception Laboratory at the University of California, San Diego have used machine learning to enable the robot to learn expressions on its own.
Starting with a series of random movements of the face “muscles”, the robot is rewarded each time it generates something that is close to an existing expression. Iteration by iteration, it has gradually developed several recognizable expressions.
To begin teaching the robot, the researchers stuck Einstein in front of a mirror and instructed the robot to “body babble” by contorting its face into random positions. A video camera connected to facial recognition software gave the robot feedback: When it made a movement that resembled a “real” expression, it received a reward signal.
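The article does not spell out the exact learning algorithm the lab used, but the loop it describes, random “body babbling” plus a reward whenever the face resembles a known expression, can be sketched as reward-guided random search. The snippet below is only an illustration of that idea: the `expression_reward` function, the made-up target activation vector, and the hill-climbing update are assumptions standing in for the real camera-and-facial-recognition feedback and the researchers' actual machine-learning method.

```python
import numpy as np

NUM_MUSCLES = 31          # the Einstein head has 31 actuated "muscles"
rng = np.random.default_rng(0)

def expression_reward(activations):
    """Hypothetical stand-in for the camera + facial-recognition feedback:
    scores how closely a set of muscle activations resembles a target
    expression (e.g. a smile). The real system scored live video frames."""
    target_smile = np.linspace(0.2, 0.8, NUM_MUSCLES)   # made-up target
    return -np.linalg.norm(activations - target_smile)

def body_babble(steps=5000, step_size=0.05):
    """Reward-guided random search ("body babbling"): start from random
    muscle activations and keep any random tweak that raises the reward."""
    best = rng.uniform(0.0, 1.0, NUM_MUSCLES)
    best_reward = expression_reward(best)
    for _ in range(steps):
        candidate = np.clip(best + rng.normal(0.0, step_size, NUM_MUSCLES),
                            0.0, 1.0)
        reward = expression_reward(candidate)
        if reward > best_reward:
            best, best_reward = candidate, reward
    return best, best_reward

if __name__ == "__main__":
    activations, score = body_babble()
    print(f"learned activations (first 5): {np.round(activations[:5], 3)}")
    print(f"final reward: {score:.4f}")
```

Run on its own, the sketch converges toward the made-up target in a few thousand steps; in the robot's case the “reward” came from facial recognition software watching the physical face in a mirror rather than from a known target vector.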