The Breakthrough that Makes Robot Faces Feel Less Creepy
The EMO robotic head learned lip-syncing by observing its own facial movements and watching hours of YouTube videos, enabling it to speak and sing using its 26 facial motors, according to Columbia researchers.
7 Articles
The breakthrough that makes robot faces feel less creepy
Humans pay enormous attention to lips during conversation, and robots have struggled badly to keep up. A new robot developed at Columbia Engineering learned realistic lip movements by watching its own reflection and studying human videos online. This allowed it to speak and sing with synchronized facial motion, without being explicitly programmed. Researchers believe this breakthrough could help robots finally cross the uncanny valley.
Lip-syncing robot watches your face to speak like you
When it comes to ultra-humanlike Westworld-style robots, one of their most defining features is lips that move in perfect sync with their spoken words. A new robot not only sports that feature, but it can actually train itself to speak like a person.
Watch This Bipedal Robot Stop Itself From Falling With a Hoop Skirt!
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion. ICRA 2026: 1–5 June 2026, VIENNA. Enjoy today’s videos! This is one of the best things I have ever seen. [ Kinetic Intelligent Machine LAB ] After years of aggressive testing and pushing the envelope with U.S…
A Real-Life Robot Learned to Lip-Sync Thanks to AI
If you want to make a humanoid robot feel “alive,” you can’t just give it legs and hands. You have to give it a face—and not just a face, but a face that moves the way our brains expect. That’s where most robots stumble. People will forgive a clunky gait or a stiff wave. But a mouth that opens and closes at the wrong moments—what one researcher in the new work calls “muppet mouth gestures”—can make a robot feel oddly lifeless, even unsettling. T…
Coverage Details
Bias Distribution
- 75% of the sources lean Left