Overview

How Pepper works

Pepper is an autonomous, talking humanoid robot who perceives emotions and adapts his behavior to the mood of the humans around him. Pepper can identify joy, sadness, anger and surprise, and respond appropriately, making his interactions with humans remarkably natural and intuitive.

The Emotion Engine

Humans are emotional beings. When we communicate with each other, we use dozens of emotional cues that provide meaning and context.

At the heart of Pepper is a remarkable technology that analyzes what you say, your tone of voice, and nonverbal cues such as the tilt of your head or your posture. From these, Pepper can instantly recognize the emotional context of a conversation and adjust his behavior accordingly.

The result is human-robot communication with a real emotional connection. Pepper quickly builds empathy and trust, and can offer services and functions that no other robotic platform can match.
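
In application code, the engine's output can be queried and acted on. Below is a minimal sketch, assuming the NAOqi Python SDK and its ALMood service; the method name follows SoftBank's published API, but availability and return values vary by NAOqi version, and the robot's address is a placeholder.

    # A minimal sketch of reading Pepper's perceived emotional state through the
    # NAOqi Python SDK and adapting the spoken response to it. The ALMood service
    # is an assumption based on SoftBank's documented API.
    from naoqi import ALProxy

    PEPPER_IP = "192.168.1.10"   # hypothetical robot address
    PORT = 9559                  # default NAOqi port

    mood = ALProxy("ALMood", PEPPER_IP, PORT)
    tts = ALProxy("ALTextToSpeech", PEPPER_IP, PORT)

    # Ask the engine for the overall emotional reaction of the person in front
    # of the robot, then choose a matching spoken response.
    reaction = mood.getEmotionalReaction()  # e.g. "positive", "negative", "neutral"
    if reaction == "positive":
        tts.say("I'm glad you're in a good mood!")
    elif reaction == "negative":
        tts.say("You seem a little down. Can I help?")
    else:
        tts.say("Hello! How are you feeling today?")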

Voice and Hearing

Using the 4 directional microphones embedded in his head, Pepper can detect where sounds are coming from and turn to face people as they speak. These microphones also allow Pepper to analyze your lexical field and your tone of voice, helping him understand the emotional context of the conversation.
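
A minimal sketch of how an application might use this, assuming the NAOqi Python SDK: subscribe to the ALSoundLocalization extractor, read the located-sound event from ALMemory, and turn the base toward the speaker. The polling loop and robot address are illustrative assumptions.

    # Sound-source localization sketch: the microphones estimate the azimuth of
    # a sound, and the base rotates in place to face it.
    import time
    from naoqi import ALProxy

    PEPPER_IP = "192.168.1.10"   # hypothetical robot address
    PORT = 9559

    sound_loc = ALProxy("ALSoundLocalization", PEPPER_IP, PORT)
    memory = ALProxy("ALMemory", PEPPER_IP, PORT)
    motion = ALProxy("ALMotion", PEPPER_IP, PORT)

    sound_loc.subscribe("sound_demo")   # start the localization engine
    motion.moveInit()                   # get the wheeled base ready to move

    try:
        while True:
            try:
                event = memory.getData("ALSoundLocalization/SoundLocated")
            except RuntimeError:
                event = None            # no sound located yet
            if event:
                azimuth = event[1][0]   # event[1] = [azimuth, elevation, confidence, energy]
                motion.moveTo(0.0, 0.0, azimuth)   # rotate in place toward the speaker
            time.sleep(0.5)
    finally:
        sound_loc.unsubscribe("sound_demo")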

Vision

Pepper has a 3D camera and 2 HD cameras embedded in his head. Using this hardware, Pepper processes images with shape-recognition software to identify objects, recognize individual faces, and read the emotional states of the people around him.
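
A minimal sketch of face detection through the same SDK, assuming the ALFaceDetection extractor and the documented "FaceDetected" memory key; the robot address and polling interval are placeholders.

    # Face-detection sketch: subscribe to the ALFaceDetection extractor and poll
    # ALMemory for the faces currently in view.
    import time
    from naoqi import ALProxy

    PEPPER_IP = "192.168.1.10"   # hypothetical robot address
    PORT = 9559

    face_detect = ALProxy("ALFaceDetection", PEPPER_IP, PORT)
    memory = ALProxy("ALMemory", PEPPER_IP, PORT)

    face_detect.subscribe("face_demo")   # start the face-detection extractor

    try:
        for _ in range(20):              # poll for about ten seconds
            data = memory.getData("FaceDetected")
            if data:
                # data[1] holds one entry per detected face plus recognition info;
                # each face entry carries its angular position and size in the image.
                print("Faces in view:", len(data[1]) - 1)
            else:
                print("No face detected")
            time.sleep(0.5)
    finally:
        face_detect.unsubscribe("face_demo")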

Movement

The first thing people notice about Pepper is the astonishing flexibility and fluidity of his movement. He gesticulates with the speed and grace of a human, while his 3 multi-directional wheels let him move freely through 360°, at a maximum speed of 3 km/h. Pepper is equipped with 20 motors in his head, arms and back that control his movements with great precision. Lastly, Pepper has a high-capacity lithium-ion battery, giving him 12 hours of autonomous interaction and movement.
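
A minimal sketch of driving the base and the joint motors, assuming the NAOqi Python SDK's ALMotion service; the movement targets and robot address are illustrative.

    # Movement sketch: translate and rotate the omnidirectional base, then move
    # a single joint to gesture with the head.
    from naoqi import ALProxy

    PEPPER_IP = "192.168.1.10"   # hypothetical robot address
    PORT = 9559

    motion = ALProxy("ALMotion", PEPPER_IP, PORT)
    motion.wakeUp()       # stiffen the motors and stand ready
    motion.moveInit()     # prepare the wheeled base

    # Translate half a metre forward, then rotate 90 degrees in place.
    motion.moveTo(0.5, 0.0, 0.0)
    motion.moveTo(0.0, 0.0, 1.57)

    # Gesture with one of the 20 motors: tilt the head up slightly.
    motion.setAngles("HeadPitch", -0.2, 0.2)   # joint name, angle in radians, fraction of max speed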