Pepper Myo Control

Pilot your Pepper’s motion with a Myo armband

Make your Pepper move around with only a wave of the hand. Thanks to the Myo armband, a muscle-sensing device, and the qiSDK’s motion API, you can control your Pepper with hand gestures!

This article is part of the API Challenge: Quentin, an intern at SBRE, is plugging various third-party technologies (web services or IoT devices) into Pepper. For each technology, he has one week to test it, make a demo on Pepper, and write up his experience. This is a way of checking how easy it is to integrate technology with Pepper.

For my last mission as an SBRE intern, I managed (with a few struggles) to control my Pepper with the Myo armband, using the Myo SDK, the qiSDK’s motion API and plenty of documentation found online. The result is actually pretty functional and smooth, way better than I first expected.

1. The Myo armband and SDK

1.1 Myo armband

For starters, let’s talk about the Myo armband. It is a small, light armband developed by North (formerly Thalmic Labs) that detects your arm’s muscle activity when worn, and can therefore send callbacks to an app when the user performs certain movements. There are 5 pre-registered gestures, depicted in the image below.

The 5 Myo gestures: Fist, Wave-in, Wave-out, Fingers-spread and Double-tap; Credit: Myo™ Support

The Myo armband also includes an accelerometer, but I haven’t used it in my app.

Note: The Myo armband hasn’t been produced since the end of 2018, so it is pretty hard to obtain one, and the website and SDK are no longer updated. But this kind of exercise can also be done with any Bluetooth-connectable gamepad or remote instead.

When I visited the official Myo website, I was guided through the basics of using the Myo. I downloaded the Myo Connect app available on the website, plugged the Myo Bluetooth adapter into my computer and started practicing gestures with the armband. Myo Connect comes with a few applications you can use on your computer: a mouse control, a presentation control and an RV (recreational vehicle) control. Once I had practiced a bit with the mouse control application and really understood how the armband detected my movements, I started focusing on the Myo SDK.

1.2 Myo SDK

The official Myo SDK is available on the website as a downloadable archive containing, among other things, a .aar file I needed to include in my Android app’s Gradle file:

dependencies {
    implementation 'com.thalmic:myosdk:0.10.+@aar'
}

The website’s documentation and tutorials for the SDK aren’t very clear, which made me look for example projects using this particular SDK on GitHub. I found the Kotlin library used to create a Myo YouTube control app and recycled everything useful for controlling Pepper in my app: the Bluetooth connection to the armband and the callbacks received for the specific gestures listed earlier.

A few words about Bluetooth permissions for an Android app: I had to grant my app permission to use the robot’s Bluetooth connection in order to link to the Myo armband. I did it by adding this line to my app’s AndroidManifest.xml file:

<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>

And this import in my MainActivity Kotlin file:

import android.Manifest.permission.ACCESS_COARSE_LOCATION

Now let’s talk a bit about using the Myo SDK. As I said before, all the code below comes from the Myo YouTube control app I found on GitHub, credits to SergioPadilla. Here is the initialization that needs to be done in the onCreate function of the MainActivity:

//create the scan hub
hub = Hub.getInstance()

//get the right bluetooth permission
ActivityCompat.requestPermissions(
    this, arrayOf(ACCESS_COARSE_LOCATION), 1
)

//hub init
if (!hub.init(this)) {
    Log.e(TAG, "Could not initialize the Hub.")
} else {
    // Disable standard Myo locking policy. All poses will be delivered.
    hub.setLockingPolicy(Hub.LockingPolicy.NONE)
    // scan Myo by activity
    startActivity(Intent(this, ScanActivity::class.java))
    // Create listener for Myo
    listener = get_listener()
    hub.addListener(listener)
}

The hub allows your app to scan for reachable Myo devices and lets you connect to one. The listener then makes the callbacks possible. Below is the skeleton of the get_listener function which, pretty explicitly, triggers actions when the armband connects or when a pose is detected.

/**
 * Get the listener to the Myo device
 */
private fun get_listener(): AbstractDeviceListener {
    return object : AbstractDeviceListener() {

        override fun onConnect(myo: Myo?, timestamp: Long) {
            //actions to execute when Pepper connects to the Myo
        }

        override fun onDisconnect(myo: Myo?, timestamp: Long) {
            //actions to execute when the Myo is disconnected
        }

        /**
         * Called when the Myo detects a pose
         */
        override fun onPose(myo: Myo?, timestamp: Long, pose: Pose?) {
            if (pose == Pose.REST) {
                //actions to execute when the Rest pose is detected
            } else if (pose == Pose.DOUBLE_TAP) {
                //actions to execute when the Double-tap pose is detected
            } else if (pose == Pose.WAVE_IN) {
                //actions to execute when the Wave-in pose is detected
            } else if (pose == Pose.WAVE_OUT) {
                //actions to execute when the Wave-out pose is detected
            } else if (pose == Pose.FIST) {
                //actions to execute when the Fist pose is detected
            } else if (pose == Pose.FINGERS_SPREAD) {
                //actions to execute when the Fingers-spread pose is detected
            }
        }
    }
}

By then, my app allowed me to recognise those movements and log when the callbacks were triggered. What I had to do next was fill in this skeleton and use these callbacks to trigger Pepper’s movements.

2. The qiSDK’s motion API

Since the Myo armband can only recognise 5 gestures, I had to decide which one would trigger what. I went for this mapping:

  • Fingers-spread gesture to make Pepper move forward
  • Wave-in and Wave-out gestures to make the robot turn left or right, depending on which arm the armband is on
  • Fist to make Pepper say “Hey!”
  • Double-tap to enable and disable the controls
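
This mapping can be sketched as a small lookup. Below is a plain-Kotlin illustration with a hypothetical Command enum: all the names here are mine, not from the actual app, and which arm gives which turn direction is an assumption.

```kotlin
// Hypothetical command set; the real app reacts directly in the pose callbacks.
enum class Command { FORWARD, TURN_LEFT, TURN_RIGHT, SAY_HEY, TOGGLE_CONTROLS, STOP }

// Poses are passed as strings here to keep the sketch self-contained;
// in the app they are values of the Myo SDK's Pose enum.
fun commandFor(pose: String, armbandOnRightArm: Boolean): Command? = when (pose) {
    "FINGERS_SPREAD" -> Command.FORWARD
    // Wave-in/Wave-out turn directions depend on which arm wears the Myo
    "WAVE_IN" -> if (armbandOnRightArm) Command.TURN_LEFT else Command.TURN_RIGHT
    "WAVE_OUT" -> if (armbandOnRightArm) Command.TURN_RIGHT else Command.TURN_LEFT
    "FIST" -> Command.SAY_HEY
    "DOUBLE_TAP" -> Command.TOGGLE_CONTROLS
    "REST" -> Command.STOP
    else -> null
}
```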

I had never used the qiSDK’s motion API before, so I followed the tutorials about it on the Dev Center website.

2.1 Async actions

As you may know, actions are normally executed on the same worker thread: while an action runs, the thread is blocked until it finishes. In order to make Pepper move or talk while still recognising my gestures, I needed to execute all the robot’s actions asynchronously, on parallel worker threads. To do so, I created an object of the Future type, initiated with the action I want to trigger, and then ran and cancelled these Future actions at the right time in my app. Note that I used a single Future object to execute the different actions: it makes it easier to start and stop them correctly and not get confused about which async task is running and which is not.
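
The qiSDK specifics follow below; as a language-level illustration of this “keep one cancellable handle” pattern, here is a plain-Kotlin sketch using java.util.concurrent futures instead of qiSDK ones (startAction and its sleeping body are illustrative only):

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.Future

// One shared handle for whatever asynchronous "robot action" is running.
val executor = Executors.newCachedThreadPool()
var currentFuture: Future<*>? = null

// Cancel the action still in flight (if any), then launch the new one.
// On Pepper, the body would be a Say or GoTo run instead of a sleep.
fun startAction(): Future<*> {
    currentFuture?.cancel(true)
    val future = executor.submit {
        try {
            Thread.sleep(10_000) // stands in for a long-running robot action
        } catch (e: InterruptedException) {
            // cancelled: a new gesture arrived
        }
    }
    currentFuture = future
    return future
}
```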

Here is an example, below, of using async actions to make Pepper say “Hey!” when I make the Fist pose while wearing the Myo.

First, I declared the Future object among the MainActivity’s variables:
private var currentFuture: Future<Void>? = null

Then, I made sure that every time a pose is detected, the Future, if initiated and running, is cancelled:

override fun onPose(myo: Myo?, timestamp: Long, pose: Pose?) {
    //cancel the previous action before handling the new pose
    currentFuture?.requestCancellation()
    ...
}

Finally, I created an async “say” action in the onPose callback for the Fist pose, ran it and stored it in the currentFuture variable:

override fun onPose(myo: Myo?, timestamp: Long, pose: Pose?) {
    if (pose == Pose.FIST) {
        currentFuture = SayBuilder.with(qiContext)
            .withText("Hey!")
            .buildAsync()
            .andThenCompose { say -> say.async().run() }
    }
}

So, after understanding this concept of async actions, I started working on Pepper’s motion.

2.2 qiSDK’s goTo function

To make Pepper move forward, I used the goTo function. To move around in space, Pepper uses a virtual grid which has an origin and an orientation. So, basically, to make Pepper move forward 1 meter when I spread my fingers, I set, inside the callback, the grid origin 1 meter in front of the robot and made it move toward that point.

There are, however, two little issues with this method. The first is that, if you want the robot to keep going forward, you have to rest and re-open your hand every meter to reactivate the callback. The easy fix: make Pepper move 10 meters forward in the Fingers-spread callback and cancel the action in the Rest callback. That way, Pepper keeps moving forward as long as you hold the pose.

The other small issue is that the robot rarely moves in a straight line, deviating from its course to avoid obstacles (which is nice), but sometimes Pepper detects “invisible obstacles” and tries to avoid them, zig-zagging around nothing… This is not a very concerning issue and there really is not much to be done about it. If you really want to control Pepper’s movement without the robot avoiding obstacles on its own, you can use the animation API instead, but that is not really recommended...
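
For reference, here is a minimal sketch of what the Fingers-spread callback could build, assuming qiContext and currentFuture fields; the frame, transform and GoTo builders are standard qiSDK ones, but the exact wiring is my reconstruction, not the app’s verbatim code:

```kotlin
// Sketch: send Pepper toward a free frame 10 meters straight ahead.
currentFuture = qiContext.actuation.async().robotFrame()
    .andThenCompose { robotFrame ->
        // Place a free frame 10 m in front of the robot's current position
        val transform = TransformBuilder.create().fromXTranslation(10.0)
        val targetFrame = qiContext.mapping.makeFreeFrame()
        targetFrame.update(robotFrame, transform, 0L)
        GoToBuilder.with(qiContext)
            .withFrame(targetFrame.frame())
            .buildAsync()
    }
    .andThenCompose { goTo -> goTo.async().run() }
// The Rest callback can then call currentFuture?.requestCancellation()
```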

2.3 qiSDK’s lookAt function

After enabling Pepper to move forward, I had to allow the robot to turn. For that I used the lookAt function, whose implementation is pretty similar to the goTo: set a new origin for the frame, then use lookAt to turn Pepper toward that point. There are two issues with this function as well. The first is the same type of issue as with the goTo function, so I set the lookAt points behind Pepper, a bit to the right and to the left respectively, and cancelled the lookAt async action in the Rest gesture callback.

The other issue concerns the robot’s basic awareness. You may be familiar with the fact that Pepper, even when no app is running, turns towards you when you are close enough. In the case of our app, this is pretty annoying: when you want Pepper to face a specific direction while you are nearby, the movement will be cancelled because the robot wants to face you. This was easy to handle, as I simply had to deactivate the basic awareness while the controls were enabled. I had to create a basic awareness holder:

private lateinit var basicAwarenessHolder:Holder

And asynchronously hold and release it when I respectively activate and deactivate the controls (with a button or with the Double-tap pose).
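
The holder itself can be built once the robot focus is gained; a minimal sketch, assuming a valid qiContext (HolderBuilder and AutonomousAbilitiesType are the qiSDK’s):

```kotlin
// Sketch: build a holder for the BASIC_AWARENESS autonomous ability.
// Assumes qiContext was obtained in onRobotFocusGained().
basicAwarenessHolder = HolderBuilder.with(qiContext)
    .withAutonomousAbilities(AutonomousAbilitiesType.BASIC_AWARENESS)
    .build()
```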

As an example, my controlButton listener (controlsEnabled being a Boolean field tracking whether the controls are on):

controlButton.setOnClickListener {
    controlsEnabled = !controlsEnabled
    if (controlsEnabled) {
        basicAwarenessHolder.async().hold()
        controlButton.setText("Control: on")
    } else {
        basicAwarenessHolder.async().release()
        controlButton.setText("Control: off")
    }
}

In the end, I was very satisfied with the app I created. The gestures, when the armband was correctly worn, were well recognised, and the response time between my movement and Pepper’s was surprisingly low. I had a lot of fun moving my robot around the office!

This was my last project for the API Challenge, and it allowed me to test all the skills I acquired during my internship at SoftBank Robotics and to learn about async actions, Pepper’s motions, Bluetooth connections and callbacks. Even if you can’t get a Myo armband, you should try using a remote or a gamepad to control Pepper; it is useful for your Android Studio and qiSDK experience and pretty fun to do!