Autonomous Abilities


  • Keep the robot alive at all times,
  • Let application developers focus on their own specific content, without micro-programming the robot's day-to-day behavior.
// Build the holder for the ability.
Holder holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(AutonomousAbilitiesType.BACKGROUND_MOVEMENT)
        .build();

// Hold the ability asynchronously.
Future<Void> holdFuture = holder.async().hold();

// Release the ability asynchronously.
Future<Void> releaseFuture = holder.async().release();

Operating principle

Autonomous abilities are robot behaviors (movements, animations, tracking) occurring:

  • autonomously,
  • in the background,
  • taking the robot's resources only weakly, so that any explicitly started action overrides them.

List of Autonomous Abilities

Autonomous Ability   Type       When
BackgroundMovement   Passive    Idle time
BasicAwareness       Reactive   Reacting to any kind of stimuli
AutonomousBlinking   Reactive   Reacting to a human


The prioritization system works as follows:

By default, the robot will make background movements (a.k.a. idle movements) and blink. If the robot detects a stimulus, then he may move his head and body, overriding the background movements. Finally, if a high-level behavior such as a Say or an Animate is executed, it will take priority over both previous modules.
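When an application behavior needs the robot to stay under its sole control, the lower-priority abilities can be taken out of this loop by holding them. A minimal sketch, assuming a `qiContext` obtained as usual and the `AutonomousAbilitiesType` names from the table above:

```java
// Suspend the two abilities that move the robot autonomously, so that only
// explicitly started actions (Say, Animate, ...) make it move.
Holder holder = HolderBuilder.with(qiContext)
        .withAutonomousAbilities(
                AutonomousAbilitiesType.BACKGROUND_MOVEMENT,
                AutonomousAbilitiesType.BASIC_AWARENESS)
        .build();

// Hold asynchronously to avoid blocking the UI thread.
Future<Void> holdFuture = holder.async().hold();

// When the behavior is over, give the abilities back to the robot:
// holder.async().release();
```

Keep a reference to the Holder: if it is never released, the abilities stay suspended for the rest of the robot focus.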

Future-proof thanks to API levels

Thanks to the API level mechanism, new autonomous abilities that become available on the robot will not modify the current behaviour of your application: new abilities are activated only if you rebuild your application with the new QiSDK version.

Tips & Tricks

  • If I want the robot to keep a posture (e.g. at the end of an animation), I must hold BackgroundMovement and BasicAwareness so that the robot can remain perfectly still.

  • If I want to freeze screen location while speaking or listening, I should disable body language.

    For further details see: Disabling body language in a Say and Disabling body language in a Listen.

    And if I use a Chat with a QiChatbot? I should disable the body language option:

    • of the Chat to freeze while listening, and
    • of the QiChatbot to freeze while speaking.
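As a sketch of these options, assuming the QiSDK names from the pages referenced above (`withBodyLanguageOption` on `SayBuilder`, `setListeningBodyLanguage` on `Chat`, `setSpeakingBodyLanguage` on `QiChatbot`) and an already built `topic`:

```java
// Freeze the robot while speaking: disable body language in a Say.
Say say = SayBuilder.with(qiContext)
        .withText("I will not move while saying this.")
        .withBodyLanguageOption(BodyLanguageOption.DISABLED)
        .build();
say.run();

// With Chat + QiChatbot, disable the option on both sides:
QiChatbot qiChatbot = QiChatbotBuilder.with(qiContext)
        .withTopic(topic)
        .build();
Chat chat = ChatBuilder.with(qiContext)
        .withChatbot(qiChatbot)
        .build();

chat.setListeningBodyLanguage(BodyLanguageOption.DISABLED);     // still while listening
qiChatbot.setSpeakingBodyLanguage(BodyLanguageOption.DISABLED); // still while speaking
```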

See also