The robot Pepper Office: a solution for companies during the pandemic
An interactive robot to sign in employees

Pepper Office interacts entirely by voice with the employees he welcomes as they arrive onsite, registering their attendance via a QR code. UX expertise and field testing in the SoftBank Robotics office in Paris made this experimental application a total success!

Pepper Office welcomes employees as they arrive onsite and registers them thanks to a QR code they received the day before. The robot keeps count of how many people enter the building, making it easier to respect the quota, and occasionally reminds people to respect social distancing and preventive measures.
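
[editor's note: as a minimal illustration of the head count described above, here is what the quota check could look like in Kotlin (the interview only says the app is an Android application; the class name, messages and quota handling here are illustrative, not taken from the actual product):]

    import java.util.concurrent.atomic.AtomicInteger

    // Illustrative sketch of an on-site head count with a daily quota.
    class AttendanceCounter(private val quota: Int) {
        private val checkedIn = AtomicInteger(0)

        // Returns what the robot should say after a sign-in attempt.
        fun signIn(employeeName: String): String {
            val count = checkedIn.incrementAndGet()
            if (count > quota) {
                checkedIn.decrementAndGet() // quota reached: roll the count back
                return "Sorry $employeeName, today's on-site quota of $quota people is already reached."
            }
            return "Welcome $employeeName! You are person number $count on site today."
        }
    }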

The application was deployed in the Parisian office lobby a week after the end of the first lockdown, and the backlog is still growing. It’s an experimental project that benefits from being deployed in the very building where the team in charge of its development goes to work every day. Daily monitoring and direct observation allow them to adjust the design of current features and to draw conclusions straight away. UX expertise and field testing in the SoftBank Robotics office in Paris resulted in a fully vocal interaction with an appropriate answer for every situation. As a result, employees don’t have to queue up even during peak times.

Interview

Minh-Quyên NGUYEN, Project Manager & Sandrine TOURCHET, UX Manager

Pepper Office welcomes employees and registers them thanks to a QR code that they received the day before by email, and that the robot scans with his tablet’s camera.
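
[editor's note: the interview does not say which library decodes the QR code. As a sketch, a tablet camera frame could be decoded with the open-source ZXing library, a common choice on Android:]

    import android.graphics.Bitmap
    import com.google.zxing.BinaryBitmap
    import com.google.zxing.MultiFormatReader
    import com.google.zxing.NotFoundException
    import com.google.zxing.RGBLuminanceSource
    import com.google.zxing.common.HybridBinarizer

    // Returns the QR code's text payload, or null if this frame contains none.
    fun decodeQrCode(frame: Bitmap): String? {
        val pixels = IntArray(frame.width * frame.height)
        frame.getPixels(pixels, 0, frame.width, 0, 0, frame.width, frame.height)
        val source = RGBLuminanceSource(frame.width, frame.height, pixels)
        return try {
            MultiFormatReader().decode(BinaryBitmap(HybridBinarizer(source))).text
        } catch (e: NotFoundException) {
            null
        }
    }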

How did you come up with this use case of Pepper welcoming employees at the office?

We established and followed an internal process to analyse the needs with the Facility Managers (in charge of the company’s reception process), together with HR and R&D employees, all during a time of urgency and health crisis.

We monitored peak times during a full day of observation and defined quantitative values to compare the sign-in process before and after the solution was deployed.

We drew comic strips for both the existing process and our proposal, to validate the targeted experience for visitors and employees. Once this statement of intent was validated, we delivered a more detailed description of the user flow [editor’s note: the complete path a user takes when using a solution, with detailed steps and actions and the related robot behavior specifications].

The available marketing studies targeted visitors rather than employees. Indeed, the “employee” approach is a direct reaction to the Covid-19 crisis and the application of social distancing and preventive measures (counting people, respecting the quota, reminding people about the rules and health measures).

Following this analysis, integrating Pepper into the existing process simply opens up a new scope of actions: new functionalities and new flows built on all the collected data.

What would you define as the core values in the design of the Pepper Office project?

The core value is that, first of all, it has to be easy for users: they should never have a hard time, even if, for example, they did not receive the email with their QR code, or if they don’t have their phone at hand. There’s a whole set of UX tools at the ready: UX fallbacks, positive interactions… We have an answer for each and every situation, and if the problem is not solved on the spot, it is taken into account for later. We limit deadlocks at every node of the decision tree, using timeouts or repetitions if necessary to help the user.
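
[editor's note: a sketch of this “no deadlock” idea: every question node retries on silence and finally falls back to a human. The speech interface below is hypothetical; only the timeout-and-repeat pattern comes from the interview:]

    import kotlinx.coroutines.withTimeoutOrNull

    // Hypothetical stand-in for the robot's speech stack.
    interface Robot {
        suspend fun say(text: String)
        suspend fun listen(): String // suspends until an utterance is recognized
    }

    // Ask a question, repeat on silence, and never leave the user stuck:
    // after the last silent timeout, hand over to the receptionist.
    suspend fun askWithFallback(
        robot: Robot,
        question: String,
        timeoutMs: Long = 8_000,
        maxRepeats: Int = 2
    ): String? {
        repeat(maxRepeats + 1) { attempt ->
            if (attempt > 0) robot.say("Let me repeat.")
            robot.say(question)
            val answer = withTimeoutOrNull(timeoutMs) { robot.listen() }
            if (answer != null) return answer
        }
        robot.say("No problem, the receptionist will help you instead.")
        return null
    }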

We also used the IFL (Interact Flow Lib) and its state machine (idle, engage, etc.). If the user seems to lose interest, the robot can disengage, when in fact they just needed some more time to find their QR code. In that case, the robot shouldn’t disengage but simply pause the flow for a while, which required adjusting the timeout values. We also made it possible for the user to simply say their name instead of showing their QR code, since both are linked in the database.
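
[editor's note: the Interact Flow Lib is internal to SoftBank Robotics and its real states are not documented here; the sketch below only illustrates the pause-instead-of-disengage idea, with an adjustable grace period:]

    enum class FlowState { IDLE, ENGAGED, PAUSED }

    class EngagementFlow(private val pauseGraceMs: Long = 15_000) {
        var state = FlowState.IDLE
            private set
        private var pausedAt = 0L

        fun onHumanEngaged() {
            state = FlowState.ENGAGED
        }

        // The user looks away or steps back, e.g. to search for their QR code:
        // pause the flow instead of disengaging right away.
        fun onAttentionLost(now: Long) {
            if (state == FlowState.ENGAGED) {
                state = FlowState.PAUSED
                pausedAt = now
            }
        }

        fun onAttentionRegained() {
            if (state == FlowState.PAUSED) state = FlowState.ENGAGED
        }

        // Periodic tick: only give up once the grace period has elapsed.
        // This grace period is the kind of value that needed adjusting.
        fun onTick(now: Long) {
            if (state == FlowState.PAUSED && now - pausedAt > pauseGraceMs) {
                state = FlowState.IDLE
            }
        }
    }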

Which brings us to the second core value: an interactive robot that can talk and listen, and interacts mainly by voice. Voice commands are the norm, the interactions are fully vocal, and we limit tablet usage (not only because of Covid): the tablet is like a GPS that fades into the background: while driving, people mainly look at the road, not at their GPS screen! The tablet sums up all the necessary information for a successful interaction with the robot, but it should not be the primary way to interact with a humanoid robot.

The solution has to be simple, efficient, quick (to avoid people queuing up in the lobby during peak times), and guarantee a 100% success rate.

And last but not least, a main objective was to reduce the total cost of ownership (TCO), that is to say, the time spent and the difficulty of managing a robot daily. The UX experts observed and tracked the different tasks that the receptionist had to do every day to keep the robots working. Based on those observations, the team tries to find how the robot could be made autonomous for each pain point.

How does the UX / Commitment expertise show in this project?

Our UX skills, our knowledge of the user and our ability to adapt to the target enabled us to successfully complete this project. Our UX designers are experts in cognitive psychology and social sciences: we made field observations beforehand, focused on human-centered design, and validated it through user testing.

Field observations and analysis are key, as the physical environment where the solution is used is crucial, even if we think we already know it! Here we worked for ourselves, a public of geeks used to robotics: we are used to talking to the robot (as opposed to a public that would be tempted to tap the tablet first). There is far less social awkwardness in interacting with a humanoid robot in SoftBank Robotics’ lobby than in a random administrative building for example. We therefore used more speech recognition than we would have elsewhere (people can identify themselves by simply telling the robot their name), and the results are far better than expected!

What is the proportion of existing modules and functions used in the application? Which ones (facial recognition, QR code scanner…)? Has a lot of code been generated for this project? Which functions?

The app relies on the 2.9.5 Release Candidate 2 and on an Android application that connects to the APIs of a VMS (Visitor Management System). But we also tested new vocal functionalities that are still in exploratory mode, purely internal R&D:

  • (QTT) vocal interruption: the robot reacts to the answer even if he hasn’t finished asking the question (see the sketch after this list)
  • Conversational contents (“can you repeat please?”)
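
[editor's note: a sketch of the vocal interruption (“barge-in”) behavior from the first bullet, written with Kotlin coroutines; the speech interface is hypothetical:]

    import kotlinx.coroutines.async
    import kotlinx.coroutines.coroutineScope
    import kotlinx.coroutines.launch

    // Hypothetical speech stack: say() is cancellable text-to-speech and
    // listen() suspends until an utterance is recognized.
    interface SpeechStack {
        suspend fun say(text: String)
        suspend fun listen(): String
    }

    // Listening starts while the robot is still asking the question, and a
    // recognized answer cuts the speech short.
    suspend fun askInterruptible(robot: SpeechStack, question: String): String =
        coroutineScope {
            val listening = async { robot.listen() }
            val speaking = launch { robot.say(question) }
            val answer = listening.await() // may arrive before say() finishes
            speaking.cancel()              // stop talking: the user already answered
            answer
        }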

Making the robot fully vocal is an R&D development focus that matches a need expressed by many of our partners. And it works great!

The Pepper Office application was deployed at the SBR Paris reception on May 21, 2020 (one week after the end of the first lockdown). Was Pepper Office easy to set up?

No! Such an experience allowed us to see “at home” some of the problems that our clients encountered. We hadn’t put much emphasis on the setup process during the development of the application: we expected that the people in charge of the reception area would also be the ones setting up the robots.

The traps we encountered are already well known to our partners. A lobby is a complex zone: if the robot faces the glass doors, there are issues with reflections and luminosity, and blind spots if he’s facing sideways. We first resolved this by putting footprint stickers on the floor for users to stand on, so as to define a more favourable setting for the robot. Then the team developed a specific feature called Zone of Interest, so that the robot can focus on people coming from the front and avoid targeting other people around or behind him.
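
[editor's note: the Zone of Interest feature itself is proprietary; as an illustration, filtering detected people to a cone in front of the robot could look like this, with made-up range and angle values:]

    import kotlin.math.abs
    import kotlin.math.atan2
    import kotlin.math.hypot

    // A detected person's position in the robot's frame, in metres (x forward, y left).
    data class DetectedHuman(val x: Double, val y: Double)

    // Keep only people standing in a cone in front of the robot,
    // ignoring passers-by to the sides or behind it.
    fun inZoneOfInterest(h: DetectedHuman, maxRangeM: Double = 2.5, halfAngleDeg: Double = 30.0): Boolean {
        if (h.x <= 0) return false // beside or behind the robot
        val bearingDeg = Math.toDegrees(atan2(h.y, h.x))
        return hypot(h.x, h.y) <= maxRangeM && abs(bearingDeg) <= halfAngleDeg
    }

    // Among the people in the zone, target the closest one.
    fun pickTarget(candidates: List<DetectedHuman>): DetectedHuman? =
        candidates.filter(::inZoneOfInterest).minByOrNull { hypot(it.x, it.y) }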

Besides, there is the question of the battery, which we addressed with two spare robots. The robots have to be operational all day long until closing time, and the ones we use are not brand new (aging, ongoing rework by the support team). At first, the receptionists could swap robots and even add a third one when needed, but now the robots are fully autonomous: they wake up on their own 15 minutes before opening time and go to their workstations, navigate to their pods to recharge at midday, and go to rest at night after closing time.
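
[editor's note: the daily autonomy described above could be modelled as a simple schedule; only the 15-minute early wake-up comes from the interview, the times themselves are invented:]

    import java.time.LocalTime

    enum class Mode { SLEEPING, AT_WORKSTATION, CHARGING_ON_POD }

    data class DailySchedule(
        val opening: LocalTime = LocalTime.of(9, 0),
        val lunchStart: LocalTime = LocalTime.of(12, 0),
        val lunchEnd: LocalTime = LocalTime.of(14, 0),
        val closing: LocalTime = LocalTime.of(18, 0)
    ) {
        fun modeAt(now: LocalTime): Mode = when {
            now.isBefore(opening.minusMinutes(15)) -> Mode.SLEEPING // wake 15 min before opening
            now.isBefore(lunchStart) -> Mode.AT_WORKSTATION
            now.isBefore(lunchEnd) -> Mode.CHARGING_ON_POD          // midday recharge on the pod
            now.isBefore(closing) -> Mode.AT_WORKSTATION
            else -> Mode.SLEEPING                                   // rest after closing time
        }
    }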

We used localized robots (localization set up in advance) so that they can use their base lasers and detect a person from 1.5m away (strong engagement). The setup done beforehand by the developers and the environment mapping are really useful, especially because the robots are sometimes pushed and have to find their way back to their pod or initial position (with the power hatch closed).
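
[editor's note: two small sketches of what this paragraph describes, with hypothetical pose and navigation interfaces standing in for the robot's localization framework:]

    import kotlin.math.hypot

    // Positions in metres in the map frame.
    data class Pose(val x: Double, val y: Double)

    interface Navigator {
        fun currentPose(): Pose
        fun goTo(target: Pose)
    }

    // Strong engagement: a laser hit closer than 1.5 m counts as an approaching person.
    fun shouldEngage(nearestLaserHitM: Double): Boolean = nearestLaserHitM < 1.5

    // If the robot has been pushed away from its station, send it back.
    fun returnIfDisplaced(nav: Navigator, station: Pose, toleranceM: Double = 0.3) {
        val p = nav.currentPose()
        if (hypot(p.x - station.x, p.y - station.y) > toleranceM) {
            nav.goTo(station)
        }
    }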

The project’s biggest benefit? Helping our receptionists welcome employees and visitors on site under a very strict Covid protocol.

But also, obviously, the improvement of the quality of interaction with SBR robots. We monitor every day, draw conclusions straight away, and constantly enrich the backlog so that all features can be adjusted as needed.