Researchers at Osaka University have developed a facial expression technology that uses “waveform movements” to represent gestures like breathing, blinking, and yawning as individual waves.
Even if an android looks so realistic that it could be mistaken for a human in a photo, watching it move in person can still feel unsettling. It can smile, frown, and produce other familiar expressions, but it is hard to tell what it is actually feeling, and that lack of emotional consistency creates a sense of unease. Until now, robots capable of facial movement, such as androids, have relied on a “patchwork method”: pre-planned actions are prepared in advance to avoid unnatural facial movements, and the system switches between them as needed. This approach brings its own challenges, including the need for detailed action scenarios, ensuring smooth transitions between movements, and fine-tuning expressions so they appear natural.
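To make the contrast concrete, here is a minimal sketch in Python of what such a patchwork controller might look like. The action names, actuator keys, and values are purely illustrative assumptions, not the authors' implementation.

```python
import time

# Hypothetical sketch of the conventional "patchwork" approach: each expression
# is a hand-authored sequence of actuator keyframes, and the controller switches
# between whole sequences. All names and numbers are illustrative assumptions.

PREPLANNED_ACTIONS = {
    "smile":   [{"mouth_corner": 0.8, "eyelid": 0.2},
                {"mouth_corner": 0.6, "eyelid": 0.3}],
    "neutral": [{"mouth_corner": 0.0, "eyelid": 0.0}],
}

def play_action(name, send_to_actuators, frame_interval=0.1):
    """Replay one pre-planned keyframe sequence; smooth hand-offs between
    actions have to be authored and tuned separately by hand."""
    for frame in PREPLANNED_ACTIONS[name]:
        send_to_actuators(frame)
        time.sleep(frame_interval)

# play_action("smile", print)  # switching to "neutral" mid-way would need
#                              # an explicitly scripted transition
```

The pain points the researchers describe follow directly from this structure: every scenario and every transition is a separate authoring and tuning task.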
Researchers at Osaka University have introduced a dynamic facial expression synthesis technology. This method uses “waveform movements,” where different gestures that make up facial expressions, such as “breathing,” “blinking,” and “yawning,” are represented as individual waves.
These waves are propagated to the relevant facial areas and layered on top of one another to generate complex facial movements in real time. This approach removes the need for extensive pre-set action data while keeping transitions between movements smooth.
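The following short Python sketch illustrates the layering idea under stated assumptions: each gesture is a simple wave over time, routed to the facial regions it affects, and the contributions are summed per region. The gesture shapes, region names, and gains are assumptions chosen for illustration, not the parameters from the paper.

```python
import math

def breathing(t):
    # Slow sinusoid in [0, 1], e.g. driving chest and nostril motion.
    return 0.5 * (1 + math.sin(2 * math.pi * 0.25 * t))

def blinking(t, period=4.0, width=0.15):
    # Short pulse once per period, closing the eyelid briefly.
    return 1.0 if (t % period) < width else 0.0

GESTURE_ROUTING = {
    # gesture wave -> {facial region: gain}
    breathing: {"chest": 1.0, "nostril": 0.3},
    blinking:  {"eyelid": 1.0},
}

def facial_state(t):
    """Superpose all active gesture waves into one command value per region."""
    state = {}
    for wave, routing in GESTURE_ROUTING.items():
        value = wave(t)
        for region, gain in routing.items():
            state[region] = state.get(region, 0.0) + gain * value
    return state

# for t in (0.0, 0.1, 2.0):
#     print(t, facial_state(t))
```

Because the output is computed from the current time and the active waves, there is no library of canned sequences to switch between, and the layered motion evolves continuously.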
Additionally, “waveform modulation,” which adjusts the individual waveforms according to the robot’s internal state, allows changes in internal conditions, such as mood, to be reflected instantly in its facial expressions.
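As a hedged illustration of waveform modulation, the sketch below rescales the amplitude and frequency of a breathing wave from a single internal arousal value in [0, 1], so a mood change shows up immediately in the motion. The specific mapping is an assumption for illustration, not the published model.

```python
import math

def modulated_breathing(t, arousal):
    # Assumed mapping: higher arousal -> faster and slightly larger breathing.
    base_freq, base_amp = 0.25, 0.5           # relaxed baseline (Hz, amplitude)
    freq = base_freq * (1.0 + arousal)
    amp = base_amp * (0.6 + 0.8 * arousal)
    return amp * (1 + math.sin(2 * math.pi * freq * t))

# print(modulated_breathing(1.0, arousal=0.1))  # calm
# print(modulated_breathing(1.0, arousal=0.9))  # agitated
```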
Advancing research in dynamic facial expression synthesis will enable robots with complex facial movements to display more lively expressions and convey mood changes that respond to their surroundings, including human interactions. This progress could significantly enhance emotional communication between humans and robots.
Going beyond superficial movements to a system in which internal emotions are reflected in every detail of an android’s actions could lead to androids that are perceived as having a heart.
By enabling adaptive emotional adjustment and expression, this technology is expected to significantly enhance the value of communication robots, allowing them to interact with humans in a more natural, human-like way.
Reference: Hisashi Ishihara et al., “Automatic Generation of Dynamic Arousal Expression Based on Decaying Wave Synthesis for Robot Faces,” Journal of Robotics and Mechatronics (2024). DOI: 10.20965/jrm.2024.p1481