Mobile mixed-reality interfaces that enhance human-robot interaction in shared spaces

Jared A. Frank, Matthew Moorhead, Vikram Kapila

Research output: Contribution to journal › Article

Abstract

Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human-robot interactions (HRI), they are often implemented in remote applications on research-grade platforms that require significant training and limit operator mobility. This paper proposes a mobile mixed-reality interface approach to enhance HRI in shared spaces. As a user points a mobile device at the robot's workspace, a mixed-reality environment is rendered that provides a common frame of reference in which the user and robot can effectively communicate spatial information for object manipulation tasks; this improves the user's situational awareness while the user interacts with augmented graphics to intuitively command the robot. An evaluation with participants is conducted to examine the task performance and user experience associated with the proposed interface strategy in comparison to conventional approaches that utilize egocentric or exocentric views from cameras mounted on the robot or in the environment, respectively. Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot. Moreover, the proposed interface approach gives users the flexibility to direct robots from their own visual perspective (at the expense of some physical workload) and leverages the sensing capabilities of the tablet to expand the robot's perceptual range.
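
To make the core idea concrete, the following minimal sketch, which is not taken from the paper, illustrates the common frame of reference that such an interface establishes: a tap at pixel (u, v) on the tablet screen is back-projected through the camera and intersected with the workspace to obtain a manipulation target in the robot's frame. The helper tap_to_workspace and all numeric values (the intrinsics K, the pose R and t, and the planar workspace at z = 0) are assumptions made for illustration, not details published in the article.

    import numpy as np

    # Hedged sketch: map a tap at pixel (u, v) on the tablet screen to a 3D
    # target in the robot's workspace frame. Assumes a pinhole camera model,
    # a known camera pose (R, t) with X_cam = R @ X_ws + t, and a planar
    # workspace at z = 0. All numeric values below are placeholders.

    K = np.array([[1000.0,    0.0, 640.0],   # assumed intrinsics: fx, fy in px,
                  [   0.0, 1000.0, 360.0],   # principal point (cx, cy)
                  [   0.0,    0.0,   1.0]])

    R = np.diag([1.0, -1.0, -1.0])           # assumed pose: camera looking straight down,
    t = np.array([0.0, 0.0, 1.5])            # camera center 1.5 m above the z = 0 plane

    def tap_to_workspace(u, v, K, R, t):
        """Back-project the tapped pixel to a viewing ray and intersect it with z = 0."""
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray, camera frame
        ray_ws = R.T @ ray_cam                              # same ray, workspace frame
        cam_ws = -R.T @ t                                   # camera center, workspace frame
        s = -cam_ws[2] / ray_ws[2]                          # scale at which the ray meets z = 0
        return cam_ws + s * ray_ws                          # 3D point to send to the robot

    target = tap_to_workspace(700.0, 400.0, K, R, t)
    print("commanded target in workspace frame (m):", target)  # -> [0.09, -0.06, 0.0]

In such a scheme, R and t would have to be re-estimated continuously (e.g., from fiducial markers or visual features in the workspace) as the user moves about; that is what would let commands be issued from the user's own, changing viewpoint while the robot receives targets in its fixed workspace frame.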

Original language: English (US)
Article number: 20
Journal: Frontiers Robotics AI
Volume: 4
Issue number: JUN
DOIs: 10.3389/frobt.2017.00020
State: Published - Jun 1 2017

Fingerprint

  • Human robot interaction
  • Robots
  • Mobile devices
  • User interfaces
  • Cameras

Keywords

  • Interaction
  • Interface
  • Manipulation
  • Mixed-reality
  • Robotics
  • Tablet
  • Vision
  • Workspace

ASJC Scopus subject areas

  • Computer Science Applications
  • Artificial Intelligence

Cite this

Mobile mixed-reality interfaces that enhance human-robot interaction in shared spaces. / Frank, Jared A.; Moorhead, Matthew; Kapila, Vikram.

In: Frontiers Robotics AI, Vol. 4, No. JUN, 20, 01.06.2017.

Research output: Contribution to journal › Article

@article{818680a4c1de4baeb82bba75a4dd7e83,
title = "Mobile mixed-reality interfaces that enhance human-robot interaction in shared spaces",
abstract = "Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human-robot interactions (HRI), they are often implemented in remote applications on research-grade platforms that require significant training and limit operator mobility. This paper proposes a mobile mixed-reality interface approach to enhance HRI in shared spaces. As a user points a mobile device at the robot's workspace, a mixed-reality environment is rendered that provides a common frame of reference in which the user and robot can effectively communicate spatial information for object manipulation tasks; this improves the user's situational awareness while the user interacts with augmented graphics to intuitively command the robot. An evaluation with participants is conducted to examine the task performance and user experience associated with the proposed interface strategy in comparison to conventional approaches that utilize egocentric or exocentric views from cameras mounted on the robot or in the environment, respectively. Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot. Moreover, the proposed interface approach gives users the flexibility to direct robots from their own visual perspective (at the expense of some physical workload) and leverages the sensing capabilities of the tablet to expand the robot's perceptual range.",
keywords = "Interaction, Interface, Manipulation, Mixed-reality, Robotics, Tablet, Vision, Workspace",
author = "Frank, {Jared A.} and Matthew Moorhead and Vikram Kapila",
year = "2017",
month = "6",
day = "1",
doi = "10.3389/frobt.2017.00020",
language = "English (US)",
volume = "4",
journal = "Frontiers Robotics AI",
issn = "2296-9144",
publisher = "Frontiers Media S. A.",
number = "JUN",
}

TY - JOUR

T1 - Mobile mixed-reality interfaces that enhance human-robot interaction in shared spaces

AU - Frank, Jared A.

AU - Moorhead, Matthew

AU - Kapila, Vikram

PY - 2017/6/1

Y1 - 2017/6/1

N2 - Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human-robot interactions (HRI), they are often implemented in remote applications on research-grade platforms that require significant training and limit operator mobility. This paper proposes a mobile mixed-reality interface approach to enhance HRI in shared spaces. As a user points a mobile device at the robot's workspace, a mixed-reality environment is rendered that provides a common frame of reference in which the user and robot can effectively communicate spatial information for object manipulation tasks; this improves the user's situational awareness while the user interacts with augmented graphics to intuitively command the robot. An evaluation with participants is conducted to examine the task performance and user experience associated with the proposed interface strategy in comparison to conventional approaches that utilize egocentric or exocentric views from cameras mounted on the robot or in the environment, respectively. Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot. Moreover, the proposed interface approach gives users the flexibility to direct robots from their own visual perspective (at the expense of some physical workload) and leverages the sensing capabilities of the tablet to expand the robot's perceptual range.

AB - Although user interfaces with gesture-based input and augmented graphics have promoted intuitive human-robot interactions (HRI), they are often implemented in remote applications on research-grade platforms that require significant training and limit operator mobility. This paper proposes a mobile mixed-reality interface approach to enhance HRI in shared spaces. As a user points a mobile device at the robot's workspace, a mixed-reality environment is rendered that provides a common frame of reference in which the user and robot can effectively communicate spatial information for object manipulation tasks; this improves the user's situational awareness while the user interacts with augmented graphics to intuitively command the robot. An evaluation with participants is conducted to examine the task performance and user experience associated with the proposed interface strategy in comparison to conventional approaches that utilize egocentric or exocentric views from cameras mounted on the robot or in the environment, respectively. Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot. Moreover, the proposed interface approach gives users the flexibility to direct robots from their own visual perspective (at the expense of some physical workload) and leverages the sensing capabilities of the tablet to expand the robot's perceptual range.

KW - Interaction

KW - Interface

KW - Manipulation

KW - Mixed-reality

KW - Robotics

KW - Tablet

KW - Vision

KW - Workspace

UR - http://www.scopus.com/inward/record.url?scp=85050600487&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85050600487&partnerID=8YFLogxK

U2 - 10.3389/frobt.2017.00020

DO - 10.3389/frobt.2017.00020

M3 - Article

AN - SCOPUS:85050600487

VL - 4

JO - Frontiers Robotics AI

JF - Frontiers Robotics AI

SN - 2296-9144

IS - JUN

M1 - 20

ER -