
Emotional Feedback in App Development

On September 3rd 2015 in Paris, a fascinating set of lectures was given by researchers from the fields of Social Robot Interaction and Human-Robot Interaction.

The talk ‘Emotional Machine Learning’ was given by Dr. Angelica Lim, a software engineer at Aldebaran. Angelica and her Expressivity Team are currently working on three areas.

As Angelica said in her opening remarks, the goal for Pepper is to accurately detect the user’s emotion and to react to it in an appropriate way. To do this, the robot has to gather information from several emotional cues given by the human it is interacting with.

Pepper will observe different elements surrounding users and their interactions, such as degree of smiling, facial expression, head attitude, the words used, or contact with its tactile sensors, to estimate the user’s overall emotion.

So how does Pepper do this? 

Well, the OS for Pepper is called NAOqi 2.3 and is explained in our documentation. Modules can be added to this OS using a variety of languages, in much the same way that a Linux system can be extended with modules written in C++ or similar.
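
If you want a feel for what working with those modules looks like, here is a minimal sketch of connecting to NAOqi from Python with the qi SDK and fetching one of its built-in services. The robot address is a placeholder you would replace with your own Pepper’s IP.

```python
# Minimal sketch: connect to NAOqi from Python and grab a service.
# <robot-ip> is a placeholder; 9559 is NAOqi's default port.
import qi

session = qi.Session()
session.connect("tcp://<robot-ip>:9559")

# Any registered module, built-in or your own, is exposed as a service.
tts = session.service("ALTextToSpeech")
tts.say("Hello, I am Pepper")
```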

Angelica demonstrated a module called ALMood, which gathers information from lower-level modules such as ALGazeAnalysis (for head angles), ALFaceCharacteristics (for face and smile characteristics) and ALVoiceEmotionAnalysis (for voice emotion).
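
From a developer’s point of view, querying that aggregated estimate can look something like the sketch below. It assumes ALMood exposes methods along the lines of currentPersonState() and getEmotionalReaction(), as in recent NAOqi releases; verify the exact names against the documentation for your version.

```python
# Hedged sketch: poll ALMood for the user's estimated emotional state.
# Method names are assumptions based on recent NAOqi releases; check your docs.
import qi

session = qi.Session()
session.connect("tcp://<robot-ip>:9559")
mood = session.service("ALMood")

# Aggregated estimate built from ALGazeAnalysis, ALFaceCharacteristics,
# ALVoiceEmotionAnalysis and other low-level extractors.
print("Person state:", mood.currentPersonState())

# Coarse valence of the user's reaction, e.g. "positive", "neutral" or "negative".
print("Reaction:", mood.getEmotionalReaction())
```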

“Part of the reason why we are making these mood metrics is so developers can use them in their application” – Angelica Lim

Using ALMood together with AppsAnalytics, in addition to the myriad of data and sensors you may already use when developing robotics applications, you now effectively have ‘emotional’ data too.

Emotional feedback is a rare commodity in development, and we believe this feedback offers great opportunities to explore in application development.

Since applications for Pepper and NAO are based on interacting with real people, adding emotion as a new metric by which to measure your application’s behaviour could allow you to optimize that interaction.
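
As a purely illustrative sketch of what that could look like, the snippet below tags an application event with the user’s mood so that content and emotional reaction can be correlated later. Here log_event() is a hypothetical stand-in for whatever analytics call you use (AppsAnalytics or otherwise), not a NAOqi API.

```python
# Hypothetical sketch: tag an application event with the user's current mood.
# log_event() is a stand-in for your own analytics backend, not a NAOqi API.
import time

def log_event(name, mood_valence):
    # Replace with a call to your analytics pipeline (e.g. AppsAnalytics).
    print("%.0f  event=%s  mood=%s" % (time.time(), name, mood_valence))

def tell_joke(tts, mood):
    tts.say("Why did the robot cross the road?")
    time.sleep(2.0)  # give the user a moment to react
    reaction = mood.getEmotionalReaction()  # e.g. "positive" / "neutral" / "negative"
    log_event("joke_punchline", reaction)
    if reaction == "negative":
        tts.say("Okay, no more jokes.")
```

Over many sessions, logs like this show which content actually lands well with users, which is exactly the kind of interaction-level optimization described above.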

The robot can then adapt to the emotional reaction of the user, improving the relationship between user, device and experience.

The potential here is incredibly exciting for all developers – not just those interested in robotics. We’ll keep you posted. 

Dr. Lim at the event

