AI researchers have been developing a robot capable of mimicking human facial expressions.
The robot, called Emo, can pick up on subtle cues in human facial expressions and use that information to predict what your face is about to do.
The result is pretty impressive but quite scary when you see it in action.
To teach Emo how to do this, the research team at Columbia University placed it in front of a camera and let it make a series of random facial movements.
After a few hours of this, Emo had learned how its motor commands mapped onto the facial expressions they produced.
The team also developed two AI models: one predicts human facial expressions, and the other generates the motor commands needed to produce matching expressions, as sketched below.
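To make the two-model idea concrete, here is a minimal, hypothetical sketch of how such a pipeline could be wired together: one stage anticipates the human's upcoming expression from recent camera frames, the other maps a target expression onto motor commands for the robot's face. The class names, dimensions, and toy linear models are assumptions for illustration, not details from the Columbia study.

```python
# Hypothetical sketch of a "predict the human's next expression,
# then map it to motor commands" loop. Not the study's actual code.

import numpy as np

rng = np.random.default_rng(0)

N_LANDMARKS = 113 * 2   # assumed: facial landmark coordinates per frame (x, y)
N_MOTORS = 26           # assumed: number of actuators in the robot's face
HISTORY = 5             # assumed: recent frames of human landmarks used as input


class ExpressionPredictor:
    """Predicts the human's upcoming facial landmarks from recent frames (toy linear model)."""

    def __init__(self):
        self.W = rng.normal(scale=0.01, size=(HISTORY * N_LANDMARKS, N_LANDMARKS))

    def predict_next(self, recent_frames: np.ndarray) -> np.ndarray:
        # recent_frames: (HISTORY, N_LANDMARKS) -> predicted landmarks a moment ahead
        return recent_frames.reshape(-1) @ self.W


class InverseFaceModel:
    """Maps a target facial expression to motor commands (toy linear model).

    In the self-modelling phase described above, this mapping would be learned
    from the robot's own random movements recorded on camera.
    """

    def __init__(self):
        self.W = rng.normal(scale=0.01, size=(N_LANDMARKS, N_MOTORS))

    def motor_commands(self, target_landmarks: np.ndarray) -> np.ndarray:
        # Clip to a normalised actuator range.
        return np.clip(target_landmarks @ self.W, -1.0, 1.0)


def coexpression_step(predictor, inverse_model, recent_frames):
    """One control step: anticipate the human's expression, then drive the robot's face."""
    anticipated = predictor.predict_next(recent_frames)
    return inverse_model.motor_commands(anticipated)


if __name__ == "__main__":
    predictor = ExpressionPredictor()
    inverse_model = InverseFaceModel()
    # Stand-in for landmarks extracted from the last few camera frames.
    frames = rng.normal(size=(HISTORY, N_LANDMARKS))
    commands = coexpression_step(predictor, inverse_model, frames)
    print(commands.shape)  # (26,) -> one command per facial actuator
```

The point of splitting the problem in two is that the robot can anticipate an expression before it fully forms and begin moving its own face at the same time, rather than reacting after the fact.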
The developers have been fine-tuning the robot's reflexes down to the millisecond. As a result, Emo can now anticipate a forthcoming smile around 840 milliseconds before it appears and mirror it in real time.
Thanks to the high-resolution camera in the pupil of each eye, Emo can make eye contact, which is crucial for nonverbal communication.
The fact that Emo has silicone skin is supposed to make it look more human and less scary - whether that works is another story.
Yuhang Hu, Columbia Engineering PhD student and study lead author, said: ‘I think predicting human facial expressions accurately is a revolution in [human-robot interactions].
‘Now, the robot can integrate human facial expressions as feedback. When a robot makes co-expressions with people in real-time, it not only improves the interaction quality but also helps in building trust between humans and robots.
‘In the future, when interacting with a robot, it will observe and interpret your facial expressions, just like a real person.’
Currently, Emo can only interact with people by replicating their expressions. However, the team is optimistic about combining the robot's physical abilities with a large language model such as ChatGPT.
Of course, large technological strides like these come with potential ethical problems.
'Although this capability heralds a plethora of positive applications, ranging from home assistants to educational aids, it is incumbent upon developers and users to exercise prudence and ethical considerations,' added team leader Hod Lipson.
'But it’s also very exciting, by advancing robots that can interpret and mimic human expressions accurately, we're moving closer to a future where robots can seamlessly integrate into our daily lives, offering companionship, assistance, and even empathy.
'Imagine a world where interacting with a robot feels as natural and comfortable as talking to a friend.'