Head rotation data extraction from virtual reality gameplay using non-individualized HRTFs

Juan Simon Calle, Agnieszka Roginska

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A game was created to analyze subjects' head rotation and provide accurate data about the process of localizing a sound on a 360-degree sphere during VR gameplay. In this game, subjects are asked to locate a series of sounds, rendered with generalized HRTFs, that are randomly placed on a sphere around their heads. The only instruction given to the subjects is to locate each sound as quickly and accurately as possible by looking at where the sound was and then pressing a trigger. The tool was tested with 16 subjects. On average, subjects took 3.7 ± 1.8 seconds to locate a sound, with a mean angular error of 15.4 degrees. Subjects began moving their heads approximately 0.2 seconds after sound onset, and the average rotation speed reached its maximum of approximately 102 degrees per second at 0.8 seconds.
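The angular accuracy reported in the abstract is presumably the angle between the subject's final head-gaze direction and the true source direction. The paper does not describe its implementation; the following is a minimal sketch of how such an error could be computed from two direction vectors, where the function name and example vectors are illustrative assumptions.

```python
import numpy as np

def angular_error_deg(head_dir, source_dir):
    """Great-circle angle (degrees) between two 3-D direction vectors.

    Illustrative helper, not the authors' implementation.
    """
    h = np.asarray(head_dir, dtype=float)
    s = np.asarray(source_dir, dtype=float)
    h /= np.linalg.norm(h)
    s /= np.linalg.norm(s)
    # Clamp the dot product so floating-point rounding cannot push it
    # outside [-1, 1] and make arccos raise a domain error.
    cos_angle = np.clip(np.dot(h, s), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Example: head facing +z, source 15 degrees off-axis in the x-z plane
offset = np.radians(15)
err = angular_error_deg([0, 0, 1], [np.sin(offset), 0, np.cos(offset)])
print(round(err, 1))  # 15.0
```

The same computation applied per trial and averaged over all trials and subjects would yield a summary statistic of the kind reported (15.4 degrees).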

Original language: English (US)
Title of host publication: 143rd Audio Engineering Society Convention 2017, AES 2017
Publisher: Audio Engineering Society
Pages: 718-726
Number of pages: 9
Volume: 2
ISBN (Electronic): 9781510870703
State: Published - Jan 1 2017
Event: 143rd Audio Engineering Society Convention 2017, AES 2017 - New York, United States
Duration: Oct 18 2017 - Oct 20 2017



ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Modeling and Simulation
  • Acoustics and Ultrasonics

Cite this

Calle, J. S., & Roginska, A. (2017). Head rotation data extraction from virtual reality gameplay using non-individualized HRTFs. In 143rd Audio Engineering Society Convention 2017, AES 2017 (Vol. 2, pp. 718-726). Audio Engineering Society.
