Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading

Suyang Dong, Chen Feng, Vineet R. Kamat

Research output: Contribution to journal › Article

Abstract

The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporal and spatial sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects on the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera is used to capture the depth map of the real world in real time. The distance information is processed in parallel using the OpenGL shading language (GLSL) and render to texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.
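The abstract describes writing sensed real-world depth into the graphics depth buffer so that virtual fragments behind real surfaces are discarded. In the paper this runs on the GPU via GLSL and render-to-texture; below is a minimal CPU sketch of the same per-pixel depth test using NumPy, with hypothetical 4×4 depth maps standing in for the TOF camera and the virtual render (all values and array names are illustrative, not from the paper).

```python
import numpy as np

# Hypothetical sensed depth map from a TOF camera, in meters:
# a wall 2 m away, with a real object 0.5 m away in the middle.
real_depth = np.full((4, 4), 2.0)
real_depth[1:3, 1:3] = 0.5

# Hypothetical depth of the rendered virtual scene: a virtual
# cube 1 m away in the upper-left region, no geometry elsewhere.
virtual_depth = np.full((4, 4), np.inf)
virtual_depth[0:3, 0:3] = 1.0

real_rgb = np.zeros((4, 4, 3))     # camera image (black placeholder)
virtual_rgb = np.ones((4, 4, 3))   # virtual render (white placeholder)

# Per-pixel depth test: a virtual fragment survives only where it is
# nearer than the sensed real surface (hidden-surface removal).
virtual_wins = virtual_depth < real_depth
composite = np.where(virtual_wins[..., None], virtual_rgb, real_rgb)
```

Where the real object (0.5 m) sits in front of the virtual cube (1 m), the real pixels win and the cube is correctly occluded; elsewhere the cube occludes the wall. The GPU version achieves the same effect by rendering the TOF depth map into the depth buffer before drawing the virtual geometry.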

Original language: English (US)
Pages (from-to): 607-621
Number of pages: 15
Journal: Journal of Computing in Civil Engineering
Volume: 27
Issue number: 6
DOI: 10.1061/(ASCE)CP.1943-5487.0000278
State: Published - Jan 1 2013

Keywords

  • Graphical shading
  • Homography mapping
  • Render to texture
  • Stereo mapping
  • Time-of-flight camera
  • Visual simulation

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Computer Science Applications

Cite this

Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading. / Dong, Suyang; Feng, Chen; Kamat, Vineet R.

In: Journal of Computing in Civil Engineering, Vol. 27, No. 6, 01.01.2013, p. 607-621.


@article{67fbe9ccab744e44a56e7b5ce6006e36,
title = "Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading",
abstract = "The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporal and spatial sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects on the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera is used to capture the depth map of the real world in real time. The distance information is processed in parallel using the OpenGL shading language (GLSL) and render to texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.",
keywords = "Graphical shading, Homography mapping, Render to texture, Stereo mapping, Time-of-flight camera, Visual simulation",
author = "Suyang Dong and Chen Feng and Kamat, {Vineet R.}",
year = "2013",
month = "1",
day = "1",
doi = "10.1061/(ASCE)CP.1943-5487.0000278",
language = "English (US)",
volume = "27",
pages = "607--621",
journal = "Journal of Computing in Civil Engineering",
issn = "0887-3801",
publisher = "American Society of Civil Engineers (ASCE)",
number = "6",
}

TY - JOUR

T1 - Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading

AU - Dong, Suyang

AU - Feng, Chen

AU - Kamat, Vineet R.

PY - 2013/1/1

Y1 - 2013/1/1

N2 - The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporal and spatial sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects on the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera is used to capture the depth map of the real world in real time. The distance information is processed in parallel using the OpenGL shading language (GLSL) and render to texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.

AB - The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporal and spatial sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects on the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera is used to capture the depth map of the real world in real time. The distance information is processed in parallel using the OpenGL shading language (GLSL) and render to texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.

KW - Graphical shading

KW - Homography mapping

KW - Render to texture

KW - Stereo mapping

KW - Time-of-flight camera

KW - Visual simulation

UR - http://www.scopus.com/inward/record.url?scp=84886565287&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84886565287&partnerID=8YFLogxK

U2 - 10.1061/(ASCE)CP.1943-5487.0000278

DO - 10.1061/(ASCE)CP.1943-5487.0000278

M3 - Article

VL - 27

SP - 607

EP - 621

JO - Journal of Computing in Civil Engineering

JF - Journal of Computing in Civil Engineering

SN - 0887-3801

IS - 6

ER -