ALWavingDetection

NAOqi People Perception - Overview | API


What it does

ALWavingDetection allows you to detect a person trying to catch the robot’s attention in the field of view of the SoftBank Robotics robots.

Detection is more reliable if the robot has a 3d sensor.

How it works

Based on stimuli coming from ALPeoplePerception and ALMovementDetection, and on frames collected at regular intervals, the module detects whether a person is currently waving at the robot. Filtering algorithms keep only meaningful movements: for example, a person is considered to be waving at the robot only if he/she is looking at the robot and his/her whole body is not moving.

For robots with a 3d camera, it is possible to filter the detected wavings by the size and the distance of the movement, using the parameters MinSize and MaxDistance: a waving smaller than MinSize, or farther away than MaxDistance, is not detected. The threshold values for these two parameters can be changed with ALWavingDetectionProxy::setMinSize and ALWavingDetectionProxy::setMaxDistance.

For robots without a 3d camera, it is possible to change the detection threshold using ALWavingDetectionProxy::setThreshold. Currently, in 2d only close waving can be detected.
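As a sketch, the three setters above could be wrapped in a helper that picks the right parameters for the available sensor. The method names come from the API described here; the numeric defaults are purely illustrative assumptions, not documented values:

```python
def configure_waving_detection(waving_proxy, has_3d_sensor,
                               min_size=0.2, max_distance=2.5, threshold=10):
    """Tune ALWavingDetection for the available sensor.

    min_size, max_distance and threshold defaults are illustrative
    placeholders; pick values suited to your robot and environment.
    """
    if has_3d_sensor:
        # 3d camera: filter wavings by movement size and distance.
        waving_proxy.setMinSize(min_size)
        waving_proxy.setMaxDistance(max_distance)
    else:
        # 2d camera: a single detection threshold is available.
        waving_proxy.setThreshold(threshold)
```

On a real robot, `waving_proxy` would typically be obtained with `ALProxy("ALWavingDetection", ip, port)` from the NAOqi Python SDK.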

ALMemory event

Each time a waving is detected, the ALMemory event WavingDetection/Waving is raised. The ALValue WavingInfo is created and attached to the event.

The ALValue contains the information about the different clusters of “waving” pixels. It is organized as follows:


WavingInfo =
[
  TimeStamp,
  [ClusterInfo_1, ClusterInfo_2, ..., ClusterInfo_n]
]

TimeStamp: this field is the time stamp of the image that was used to perform the detection.


TimeStamp =
[
  TimeStamp_Seconds,
  TimeStamp_Microseconds
]
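The two-integer time stamp above can be folded into a single floating-point number of seconds, which is often more convenient for computing detection ages:

```python
def timestamp_to_seconds(timestamp):
    """Convert the [TimeStamp_Seconds, TimeStamp_Microseconds] pair
    from WavingInfo into a single float number of seconds."""
    seconds, microseconds = timestamp
    return seconds + microseconds * 1e-6
```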

ClusterInfo_i: each of these fields contains the description of a “waving” cluster. It has the following structure:


ClusterInfo_i =
[
  Confidence,
  WavingType,
  PositionOfWaving,
  HumanId
]
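A minimal sketch of unpacking this ALValue on the client side, assuming only the layout documented above (field names in the returned dictionaries are a naming choice of this example, not part of the API):

```python
def parse_waving_info(waving_info):
    """Unpack the WavingInfo ALValue into a time stamp and a list of
    per-cluster dictionaries, following the documented structure."""
    timestamp, clusters = waving_info
    result = []
    for cluster in clusters:
        confidence, waving_type, position, human_id = cluster
        result.append({
            "confidence": confidence,      # normalized float in [0, 1]
            "waving_type": waving_type,    # e.g. "CloseWaving", "WavingLeft"
            "position": position,          # [x, y] angular coordinates (rad)
            "human_id": human_id,          # person ID
        })
    return timestamp, result
```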

Each detected waving is described by the following fields:

  • Confidence contains the detection confidence (normalized float in [0, 1]).
  • WavingType contains the type of waving detected.

    Waving Type (String)  Sensor  Remark
    CloseWaving           2d      2d sensor only. PositionOfWaving and HumanId are not available.
    WavingLeft            3d      3d sensor only
    WavingRight           3d      3d sensor only
    WavingCenter          3d      3d sensor only

  • PositionOfWaving = [x, y] contains the angular coordinates (in radians) of the center of gravity of the cluster with respect to the center of the camera used.
  • HumanId contains the ID of the person (provided by ALPeoplePerception).
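In an event callback, a client will often keep only the most convincing cluster. A minimal sketch under the cluster layout documented above (the 0.5 confidence cutoff is an illustrative assumption, not a documented value):

```python
def best_waving(clusters, min_confidence=0.5):
    """Return the cluster with the highest Confidence at or above
    min_confidence, or None if no cluster qualifies.

    min_confidence is an illustrative default, not part of the API.
    """
    best = None
    for confidence, waving_type, position, human_id in clusters:
        if confidence >= min_confidence and (best is None or confidence > best[0]):
            best = (confidence, waving_type, position, human_id)
    return best
```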

Performances and limitations

The algorithm used for movement detection only works if the camera is not moving. Therefore, when the robot is moving, the detection is automatically disabled: the events are not raised.