How to Make Lifelike Robots with a Soul
A common misconception is that robots need advanced AI algorithms to interact fluidly with people. Another myth is that robots need some special shape to be engaging. There's actually a better, simpler, and more elegant way!
Secrets of Engaging Interactive Robots
Guy Hoffman from Cornell University proposed a solution that relies on two complementary ingredients. He presented them in his awesome TEDx Jaffa talk, “Robots with Soul” (see Video 1 below).
The first secret comes from acting and animation techniques. It lies in making the robot move in a way that exhibits emotion. It doesn't matter how something looks; it's all in the motion, and in the timing of how the thing moves.
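To make this concrete, here is a minimal Python sketch (all names invented for illustration, not taken from any robot's API) of how the very same head movement can read as two different emotions purely through timing: a snappy ease-out curve feels curious, while a slow ease-in feels dejected.

```python
def ease_out(t: float) -> float:
    """Fast start, gentle stop -- tends to read as eager or curious."""
    return 1 - (1 - t) ** 3

def ease_in(t: float) -> float:
    """Slow, heavy start -- tends to read as reluctant or sad."""
    return t ** 3

def head_trajectory(start_deg, end_deg, duration_s, easing, steps=10):
    """Sample (time, angle) pairs for a head move; the path is the
    same, but the easing curve changes how the motion feels."""
    return [
        (i * duration_s / steps,
         start_deg + (end_deg - start_deg) * easing(i / steps))
        for i in range(steps + 1)
    ]

# The same 40-degree head tilt, two very different characters:
curious = head_trajectory(0, 40, duration_s=0.4, easing=ease_out)
dejected = head_trajectory(0, 40, duration_s=1.5, easing=ease_in)
```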
The second pillar of Hoffman's approach is inspired by a trend in cognitive psychology called embodied cognition. It can be summarized as follows: robot body postures feed back into their brains and generate the way they behave. Applying these concepts to robots results in playful, reactive, and curious robots that people enjoy interacting with.
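As a rough illustration of that feedback loop (a toy sketch of my own, not Hoffman's actual framework), the robot's current posture can be read back as an input that biases which behavior it selects next:

```python
import random

class EmbodiedRobot:
    """Posture is not just an output; it is fed back in as an input."""

    def __init__(self):
        self.head_pitch = 0.0  # degrees; positive = looking up

    def act(self, command: str):
        if command == "look_up":
            self.head_pitch = 30.0
        elif command == "look_down":
            self.head_pitch = -30.0

    def next_behavior(self) -> str:
        # The body state biases behavior selection: a raised head
        # makes exploratory actions more likely than resting ones.
        curiosity = 0.5 + self.head_pitch / 100.0
        return "explore" if random.random() < curiosity else "rest"

robot = EmbodiedRobot()
robot.act("look_up")
print(robot.next_behavior())  # more often "explore" after looking up
```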
The concepts presented in the TEDx talk are described in more detail in several publications co-authored by Hoffman. I've made a selection of articles in the reference section below. For each article, you'll find the abstract, as well as a link where you can download the PDF.
Video 1. Robots with Soul, Guy Hoffman, TEDx Jaffa
3 Ingredients to Make Lifelike Robots
The next step beyond making engaging robots is to make them look alive. For this, robots need to exhibit need. According to Christine Sunu, need is the minimum requirement for something to be perceived as alive. In her TEDxSoMa talk (see Video 2 below), Sunu states that lifelike robots react to our input in a way that feels like self-preservation. They trigger our empathy by showing their need to live.
There are 3 elements that make up the need illusion:
- Purpose: This is a drive that influences every decision. Purpose is created through consistency: under similar conditions, the robot behaves similarly.
- Emotion: A lifelike robot should, for example, show joy when accomplishing a desired task, and distress when it can't. This is what makes the difference between robotic consistency and lifelike purpose. However, emotions should be unpredictable to have staying power. Lifelike robots should have multiple ways to exhibit the same emotion, and use them randomly with varying levels of intensity (see the sketch after this list).
- Story: This is the context in which we interpret a robot's appearance, the way it feels, or the way it sounds. It relies heavily on associations with things we already know, things that trigger our deep instincts.
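Here is a minimal sketch of how Purpose and Emotion might combine in code. Everything in it (the display names, the battery threshold) is invented for illustration: the decision rule stays consistent, which reads as purpose, while distress is expressed through one of several displays picked at random with a random intensity, which keeps the emotion surprising.

```python
import random

# Several interchangeable displays for the same underlying emotion.
EMOTION_DISPLAYS = {
    "joy":      ["wiggle", "chirp", "spin"],
    "distress": ["slump", "whine", "back_away"],
}

def decide(battery_level: float) -> str:
    # Purpose: self-preservation. Under similar conditions the robot
    # makes the same decision, which people read as consistency.
    return "seek_charger" if battery_level < 0.2 else "play"

def express(emotion: str) -> tuple[str, float]:
    # Emotion: pick one of several displays at a random intensity,
    # so the reaction never feels scripted.
    display = random.choice(EMOTION_DISPLAYS[emotion])
    intensity = random.uniform(0.3, 1.0)
    return display, intensity

if decide(battery_level=0.1) == "seek_charger":
    print(express("distress"))  # e.g. ('slump', 0.72)
```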
Video 2. Bringing Robots To Life, Christine Sunu, TEDxSoMa
Engaging Commercial Robots
The idea of relying on motion to express emotions has been successfully applied to Cozmo, a cute palm-sized programmable robot by Anki. Cozmo can roam around, move its head, and use its lift arm to perform actions such as rolling or pushing objects. Its face, displayed on a small screen, is very expressive: it animates in all kinds of ways to convey a wide range of emotions. This is nicely complemented with funny sound effects (see Video 3).
Video 3. Cozmo: the Playful Robot with Emotions
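Anki also shipped a Python SDK for Cozmo (now archived), which exposed the same animation system the product used: emotions are prebuilt animation triggers that bundle motion, the on-screen face, and sound. A minimal example, assuming the cozmo package is installed and a robot is connected:

```python
import cozmo

def cozmo_program(robot: cozmo.robot.Robot):
    robot.say_text("Found my cube!").wait_for_completed()
    # MajorWin is one of the SDK's built-in animation triggers:
    # a celebratory full-body animation combining movement,
    # the screen face, and sound effects.
    robot.play_anim_trigger(cozmo.anim.Triggers.MajorWin).wait_for_completed()

cozmo.run_program(cozmo_program)
```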
Before closing shop, Anki released Vector, the successor of Cozmo. Besides the face recognition already available in Cozmo, Vector introduced voice commands and integration with Amazon Alexa (see Video 4). So Vector can look up stuff for you on the Internet, or control your automated home. All of this is done in a fun way, thanks to the emotions expressed by this little robot.
Video 4. Vector Companion Robot with Voice Command
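Vector had a similar Python SDK (anki_vector, also discontinued). A minimal sketch, assuming the SDK is installed and configured for your robot; the trigger name below comes from the SDK's own examples:

```python
import anki_vector

with anki_vector.Robot() as robot:
    robot.behavior.say_text("Hey there!")
    # Like Cozmo, Vector expresses emotion through canned animation
    # triggers that bundle motion, eye animation, and sound.
    robot.anim.play_animation_trigger("GreetAfterLongTime")
```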
Learn More About Lifelike Robots with a Soul
- Embodied Cognition for Autonomous Interactive Robots [PDF]
- Hoffman, G. (2012). Topics in Cognitive Science, 4(4), 759–772
- Abstract
- In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior.
- This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human–robot interaction based on recent psychological and neurological findings.
- Designing Robots with Movement in Mind [PDF]
- Hoffman, G., & Ju, W. (2014). Journal of Human-Robot Interaction, 3(1), 89–122
- Abstract
- This paper makes the case for designing interactive robots with their expressive movement in mind. As people are highly sensitive to physical movement and spatiotemporal affordances, well-designed robot motion can communicate, engage, and offer dynamic possibilities beyond the machines’ surface appearance or pragmatic motion paths. We present techniques for movement centric design, including character animation sketches, video prototyping, interactive movement explorations, Wizard of Oz studies, and skeletal prototypes. To illustrate our design approach, we discuss four case studies: a social head for a robotic musician, a robotic speaker dock listening companion, a desktop telepresence robot, and a service robot performing assistive and communicative tasks. We then relate our approach to the design of non-anthropomorphic robots and robotic objects, a design strategy that could facilitate the feasibility of real-world human-robot interaction.
- Emotionally Expressive Dynamic Physical Behaviors in Robots [PDF]
- Bretan, M., Hoffman, G., & Weinberg, G. (2015). International Journal of Human-Computer Studies, 78
- Abstract
- For social robots to respond to humans in an appropriate manner they need to use apt affect displays, revealing underlying emotional intelligence. We present an artificial emotional intelligence system for robots, with both a generative and a perceptual aspect. On the generative side, we explore the expressive capabilities of an abstract, faceless, creature-like robot, with very few degrees of freedom, lacking both facial expressions and the complex humanoid design found often in emotionally expressive robots. We validate our system in a series of experiments: in one study, we find an advantage in classification for animated vs static affect expressions and advantages in valence and arousal estimation and personal preference ratings for both animated vs static and physical vs on-screen expressions. In a second experiment, we show that our parametrically generated expression variables correlate with the intended user affect perception. On the perceptual side, we present a new corpus of sentiment-tagged social media posts for training the robot to perceive affect in natural language. In a third experiment we estimate how well the corpus generalizes to an independent data set through a cross validation using a perceptron and demonstrate that the predictive model is comparable to other sentiment-tagged corpora and classifiers. Combining the perceptual and generative systems, we show in a fourth experiment that our automatically generated affect responses cause participants to show signs of increased engagement and enjoyment compared with arbitrarily chosen comparable motion parameters.
- Embedded.FM Episode 242 on Emotive Design
- Christine Sunu interview hosted by Elecia & Christopher White (2018)