Mixed-reality learning environments: Integrating mobile interfaces with laboratory test-beds

Jared A. Frank, Vikram Kapila

Research output: Contribution to journal › Article

Abstract

Even as mobile devices have become increasingly powerful and popular among learners and instructors alike, research involving their comprehensive integration into educational laboratory activities remains largely unexplored. This paper discusses efforts to integrate vision-based measurement and control, augmented reality (AR), and multi-touch interaction on mobile devices in the development of Mixed-Reality Learning Environments (MRLE) that enhance interactions with laboratory test-beds for science and engineering education. A learner points her device at a laboratory test-bed fitted with visual markers while a mobile application supplies a live view of the experiment augmented with interactive media that aid in the visualization of concepts and promote learner engagement. As the learner manipulates the augmented media, her gestures are mapped to commands that alter the behavior of the test-bed on the fly. Running in the background of the mobile application are algorithms performing vision-based estimation and wireless control of the test-bed. In this way, the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices are leveraged to relieve the need for laboratory-grade equipment, improving the cost-effectiveness and portability of platforms to conduct hands-on laboratories. We hypothesize that students using the MRLE platform demonstrate improvement in their knowledge of dynamic systems and control concepts and have generally favorable experiences using the platform. To validate the hypotheses concerning the educational effectiveness and user experience of the MRLEs, an evaluation was conducted with two classes of undergraduate students using an illustrative platform incorporating a tablet computer and motor test-bed to teach concepts of dynamic systems and control. Results of the evaluation validate the hypotheses. The benefits and drawbacks of the MRLEs observed throughout the study are discussed with respect to the traditional hands-on, virtual, and remote laboratory formats.
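The gesture-to-command mapping described in the abstract can be sketched as follows. This is a hypothetical illustration only: the paper does not specify its mapping, and the function name, pixel-to-degree scaling, and saturation limit here are assumptions.

```python
# Hypothetical sketch: mapping a multi-touch drag gesture on the augmented
# live view to a saturated angular setpoint for the motor test-bed.

def gesture_to_setpoint(drag_px: float, px_per_deg: float = 4.0,
                        limit_deg: float = 180.0) -> float:
    """Convert a horizontal drag distance (pixels) into an angular
    setpoint (degrees), saturated to the test-bed's safe range."""
    setpoint = drag_px / px_per_deg
    return max(-limit_deg, min(limit_deg, setpoint))

# A 400-pixel drag maps to a 100-degree setpoint; large drags saturate.
print(gesture_to_setpoint(400))    # 100.0
print(gesture_to_setpoint(4000))   # 180.0
```

In the system described, a command like this would then be transmitted wirelessly to the test-bed on each control update, alongside the vision-based state estimates computed from the visual markers.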

Original language: English (US)
Pages (from-to): 88-104
Number of pages: 17
Journal: Computers and Education
Volume: 110
DOI: 10.1016/j.compedu.2017.02.009
State: Published - Jul 1, 2017

Keywords

  • Applications in subject areas
  • Architectures for educational technology system
  • Improving classroom teaching
  • Interactive learning environments
  • Virtual reality

ASJC Scopus subject areas

  • Computer Science (all)
  • Education

Cite this

Mixed-reality learning environments: Integrating mobile interfaces with laboratory test-beds. / Frank, Jared A.; Kapila, Vikram.

In: Computers and Education, Vol. 110, 01.07.2017, p. 88-104.

Research output: Contribution to journal › Article

@article{b7d979f4351c475197d3becc0f511c6b,
title = "Mixed-reality learning environments: Integrating mobile interfaces with laboratory test-beds",
abstract = "Even as mobile devices have become increasingly powerful and popular among learners and instructors alike, research involving their comprehensive integration into educational laboratory activities remains largely unexplored. This paper discusses efforts to integrate vision-based measurement and control, augmented reality (AR), and multi-touch interaction on mobile devices in the development of Mixed-Reality Learning Environments (MRLE) that enhance interactions with laboratory test-beds for science and engineering education. A learner points her device at a laboratory test-bed fitted with visual markers while a mobile application supplies a live view of the experiment augmented with interactive media that aid in the visualization of concepts and promote learner engagement. As the learner manipulates the augmented media, her gestures are mapped to commands that alter the behavior of the test-bed on the fly. Running in the background of the mobile application are algorithms performing vision-based estimation and wireless control of the test-bed. In this way, the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices are leveraged to relieve the need for laboratory-grade equipment, improving the cost-effectiveness and portability of platforms to conduct hands-on laboratories. We hypothesize that students using the MRLE platform demonstrate improvement in their knowledge of dynamic systems and control concepts and have generally favorable experiences using the platform. To validate the hypotheses concerning the educational effectiveness and user experience of the MRLEs, an evaluation was conducted with two classes of undergraduate students using an illustrative platform incorporating a tablet computer and motor test-bed to teach concepts of dynamic systems and control. Results of the evaluation validate the hypotheses. The benefits and drawbacks of the MRLEs observed throughout the study are discussed with respect to the traditional hands-on, virtual, and remote laboratory formats.",
keywords = "Applications in subject areas, Architectures for educational technology system, Improving classroom teaching, Interactive learning environments, Virtual reality",
author = "Frank, {Jared A.} and Vikram Kapila",
year = "2017",
month = "7",
day = "1",
doi = "10.1016/j.compedu.2017.02.009",
language = "English (US)",
volume = "110",
pages = "88--104",
journal = "Computers and Education",
issn = "0360-1315",
publisher = "Elsevier Limited",
}

TY - JOUR

T1 - Mixed-reality learning environments

T2 - Integrating mobile interfaces with laboratory test-beds

AU - Frank, Jared A.

AU - Kapila, Vikram

PY - 2017/7/1

Y1 - 2017/7/1

N2 - Even as mobile devices have become increasingly powerful and popular among learners and instructors alike, research involving their comprehensive integration into educational laboratory activities remains largely unexplored. This paper discusses efforts to integrate vision-based measurement and control, augmented reality (AR), and multi-touch interaction on mobile devices in the development of Mixed-Reality Learning Environments (MRLE) that enhance interactions with laboratory test-beds for science and engineering education. A learner points her device at a laboratory test-bed fitted with visual markers while a mobile application supplies a live view of the experiment augmented with interactive media that aid in the visualization of concepts and promote learner engagement. As the learner manipulates the augmented media, her gestures are mapped to commands that alter the behavior of the test-bed on the fly. Running in the background of the mobile application are algorithms performing vision-based estimation and wireless control of the test-bed. In this way, the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices are leveraged to relieve the need for laboratory-grade equipment, improving the cost-effectiveness and portability of platforms to conduct hands-on laboratories. We hypothesize that students using the MRLE platform demonstrate improvement in their knowledge of dynamic systems and control concepts and have generally favorable experiences using the platform. To validate the hypotheses concerning the educational effectiveness and user experience of the MRLEs, an evaluation was conducted with two classes of undergraduate students using an illustrative platform incorporating a tablet computer and motor test-bed to teach concepts of dynamic systems and control. Results of the evaluation validate the hypotheses. The benefits and drawbacks of the MRLEs observed throughout the study are discussed with respect to the traditional hands-on, virtual, and remote laboratory formats.

AB - Even as mobile devices have become increasingly powerful and popular among learners and instructors alike, research involving their comprehensive integration into educational laboratory activities remains largely unexplored. This paper discusses efforts to integrate vision-based measurement and control, augmented reality (AR), and multi-touch interaction on mobile devices in the development of Mixed-Reality Learning Environments (MRLE) that enhance interactions with laboratory test-beds for science and engineering education. A learner points her device at a laboratory test-bed fitted with visual markers while a mobile application supplies a live view of the experiment augmented with interactive media that aid in the visualization of concepts and promote learner engagement. As the learner manipulates the augmented media, her gestures are mapped to commands that alter the behavior of the test-bed on the fly. Running in the background of the mobile application are algorithms performing vision-based estimation and wireless control of the test-bed. In this way, the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices are leveraged to relieve the need for laboratory-grade equipment, improving the cost-effectiveness and portability of platforms to conduct hands-on laboratories. We hypothesize that students using the MRLE platform demonstrate improvement in their knowledge of dynamic systems and control concepts and have generally favorable experiences using the platform. To validate the hypotheses concerning the educational effectiveness and user experience of the MRLEs, an evaluation was conducted with two classes of undergraduate students using an illustrative platform incorporating a tablet computer and motor test-bed to teach concepts of dynamic systems and control. Results of the evaluation validate the hypotheses. The benefits and drawbacks of the MRLEs observed throughout the study are discussed with respect to the traditional hands-on, virtual, and remote laboratory formats.

KW - Applications in subject areas

KW - Architectures for educational technology system

KW - Improving classroom teaching

KW - Interactive learning environments

KW - Virtual reality

UR - http://www.scopus.com/inward/record.url?scp=85016059169&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85016059169&partnerID=8YFLogxK

U2 - 10.1016/j.compedu.2017.02.009

DO - 10.1016/j.compedu.2017.02.009

M3 - Article

VL - 110

SP - 88

EP - 104

JO - Computers and Education

JF - Computers and Education

SN - 0360-1315

ER -