DotWarp: Dynamic Object Timewarping for Video See-Through Augmented Reality

Peter Kim, Jason Orlosky, Kiyoshi Kiyokawa, Photchara Ratsamee, Tomohiro Mashita

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

A significant issue associated with the use of video see-through head-mounted displays (VST-HMDs) for augmented reality is the presence of latency between real-world images and the images displayed to the HMD. For a static scene, this latency poses no real problem; however, for dynamic scenes, which arise when the HMD user moves their head, when real-world objects move, or from a combination of the two, the accompanying delay may result in significant registration error. To address this issue, we present DotWarp, a novel latency reduction technique for VST-HMDs that does not rely on head motion and compensates for the delay arising from real-world object motion. The algorithm requires a two-camera setup and matches dynamic objects in both images by tracking on the faster image and warping the pixels of the slower image, with the fast and slow components being RGB and IR, respectively, in our system. First, moving objects are extracted from the faster camera scene using a motion-compensating background subtraction algorithm and tracked using a robust correlation tracker. Then, temporal correspondence between the two camera images is computed using sensor update information, and the objects' positions in the slower image are shifted to match the corresponding positions in the faster image. Finally, the gaps in the slower image left behind by the shifted objects are filled in with background pixel data from previous frames using homography from the background subtraction model. In this manner, the augmented image is more closely matched with the real-world image and perceived registration is significantly improved, with initial results showing an 81.64% reduction in registration error.
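The abstract describes a three-step pipeline: extract and track moving objects in the fast (RGB) stream, shift those objects in the delayed slow (IR) stream to their up-to-date positions, and fill the exposed gaps with cached background pixels. The following is a minimal sketch of that idea in Python with OpenCV, assuming a MOG2 background subtractor and a simple bounding-box shift; the paper's actual implementation, calibration, sensor-based temporal correspondence, and homography-based fill are not reproduced here, and every function name and parameter is an illustrative assumption rather than the authors' code.

```python
# Hedged sketch of a DotWarp-style dynamic-object timewarp (not the authors' implementation).
import cv2
import numpy as np

# Stand-in for the paper's motion-compensating background subtraction.
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=120, detectShadows=False)

def extract_moving_object(fast_frame):
    """Return the bounding box (x, y, w, h) of the largest moving blob
    in the fast (RGB) frame, or None if nothing is moving."""
    mask = bg_subtractor.apply(fast_frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))

def warp_slow_frame(slow_frame, cached_background, bbox_slow, bbox_fast):
    """Shift the tracked object in the delayed (slow) frame to its position
    in the fast frame, filling the hole it leaves with background pixels
    cached from earlier frames (a simplification of the homography fill)."""
    out = slow_frame.copy()
    xs, ys, w, h = bbox_slow
    xf, yf, _, _ = bbox_fast
    patch = slow_frame[ys:ys + h, xs:xs + w].copy()
    # Cover the vacated region with stored background content.
    out[ys:ys + h, xs:xs + w] = cached_background[ys:ys + h, xs:xs + w]
    # Paste the object at its up-to-date position, clipped to the image.
    H, W = out.shape[:2]
    xf, yf = max(0, min(xf, W - w)), max(0, min(yf, H - h))
    out[yf:yf + h, xf:xf + w] = patch
    return out
```

In practice, the object's position in the fast frame would come from a correlation tracker (e.g. OpenCV's CSRT tracker) rather than raw background subtraction alone, and the slow-frame position would be predicted from the measured latency between the two streams; this sketch only illustrates the shift-and-fill step.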

Original language: English (US)
Title of host publication: Adjunct Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017
Editors: Wolfgang Broll, Holger Regenbrecht, Gerd Bruder, Myriam Servieres, Maki Sugimoto, J Edward Swan
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 184-185
Number of pages: 2
ISBN (Electronic): 9780769563275
DOIs
State: Published - Oct 27 2017
Externally published: Yes
Event: 16th Adjunct IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017 - Nantes, France
Duration: Oct 9 2017 - Oct 13 2017

Publication series

Name: Adjunct Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017

Conference

Conference: 16th Adjunct IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2017
Country/Territory: France
City: Nantes
Period: 10/9/17 - 10/13/17

Keywords

  • Augmented reality
  • Delay compensation
  • Timewarping

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Media Technology
  • Computer Science Applications
