This robot can predict a smile before it happens

STORY: This AI-integrated robotic face can predict a smile before it happens.

It's called Emo and it can anticipate and mimic human facial expressions.

Engineers at Columbia University's Creative Machines Lab say it is a significant advance toward improving interaction quality and fostering trust between humans and robots.

Through advancements like ChatGPT, many robots have made strides in verbal communication.

But their ability to take in and express facial cues has lagged.

(Yuhang Hu, PhD candidate, Columbia University)

“As robots become more advanced and complicated like those powered by AI models, there's a growing need to make these interactions more intuitive."

This is Yuhang Hu, a PhD student at the Creative Machines Lab.

"Emo is equipped with several AI models, including detecting human faces, controlling facial actuators to mimic facial expressions, and even anticipating human facial expressions. This allows Emo to interact in a way that feels timely and genuine."

Emo’s human-like head uses 26 actuators for a range of facial expressions and is covered with soft, silicone skin.

It features high-resolution cameras in its eyes for lifelike interactions and eye contact, crucial for nonverbal communication.

The robot was trained using a process called "self-modeling," in which Emo made random movements in front of a camera, learning the correlation between its facial expressions and its motor commands.

It was then shown videos of human expressions.

A study published in Science Robotics described Emo as being able to anticipate human facial expressions and mimic them simultaneously, even predicting a forthcoming smile.

Hu is that study's lead author.

"Our next step involves integrating verbal communication capabilities. This will allow Emo to engage in more complex and natural conversations."