Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet

Jared A. Frank, Vikram Kapila

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in teleoperation of robots, hands-on learning in classrooms, laboratories, and special education, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, medical, etc. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application executing on a tablet device, which renders an immersive AR-based graphical user interface to enable users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines are used to extract real-time, vision-based measurements of the robot's joint angles and end effector location from the live video captured by the rear-facing camera on the tablet. The obtained measurements are used to render AR content that provides users with additional visual feedback. Touch gesture recognition is implemented to allow users to naturally and intuitively command the robot by tapping and dragging their fingers at desired locations on the tablet screen. Experimental results show the performance and efficacy of the proposed system as it is operated in two different modes: one in which the user has direct control over the angles of the actuated links of the robot and one in which the user has direct control over the end effector location.
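The abstract's two control modes both rest on the kinematic relationship between joint angles and end effector location. As a minimal illustrative sketch (not the authors' implementation, and ignoring the underactuation of their four-link robot), the forward kinematics of a planar serial chain can be written as a running sum of link displacements; the function name `planar_fk` and the example link lengths are hypothetical:

```python
import math

def planar_fk(link_lengths, joint_angles):
    """Forward kinematics of a planar serial chain.

    joint_angles are relative joint rotations (radians); the absolute
    orientation of each link is the cumulative sum of the angles up to it.
    Returns the (x, y) position of the end effector.
    """
    x = y = 0.0
    theta = 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q                    # absolute orientation of this link
        x += length * math.cos(theta)  # add this link's displacement
        y += length * math.sin(theta)
    return x, y

# Two unit links, first joint at 90 degrees: the arm points straight up,
# so the end effector sits at approximately (0, 2).
print(planar_fk([1.0, 1.0], [math.pi / 2, 0.0]))
```

In the paper's joint-space mode the user commands the angles directly and a map like this predicts where the end effector will land; in the task-space mode the interface must invert this relationship to recover angles from a tapped screen location.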

Original language: English (US)
Title of host publication: 2016 Indian Control Conference, ICC 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 385-392
Number of pages: 8
ISBN (Print): 9781467379939
DOIs: https://doi.org/10.1109/INDIANCC.2016.7441163
State: Published - Mar 24 2016
Event: 2nd Indian Control Conference, ICC 2016 - Hyderabad, India
Duration: Jan 4 2016 - Jan 6 2016

Other

Other: 2nd Indian Control Conference, ICC 2016
Country: India
City: Hyderabad
Period: 1/4/16 - 1/6/16

Fingerprint

Augmented reality
Remote control
Kinematics
Robots
End effectors
User interfaces
Gesture recognition
Graphical user interfaces
Computer vision
Education
Cameras
Feedback

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

Frank, J. A., & Kapila, V. (2016). Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet. In 2016 Indian Control Conference, ICC 2016 - Proceedings (pp. 385-392). [7441163] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/INDIANCC.2016.7441163
