VisMerge: Light adaptive vision augmentation via spectral and temporal fusion of non-visible light

Jason Orlosky, Peter Kim, Kiyoshi Kiyokawa, Tomohiro Mashita, Photchara Ratsamee, Yuki Uranishi, Haruo Takemura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

17 Scopus citations

Abstract

Low-light situations pose a significant challenge to individuals working in fields such as firefighting, rescue, maintenance, and medicine. Tools like flashlights and infrared (IR) cameras have long been used to augment available light, but they must often be operated manually, provide a field of view that is decoupled from the operator's own view, and use color schemes that can occlude content from the original scene. To help address these issues, we present VisMerge, a framework that combines a thermal imaging head-mounted display (HMD) with algorithms that temporally and spectrally merge video streams of different light bands into the same field of view. For temporal synchronization, we first develop a variant of the time warping algorithm used in virtual reality (VR), redesigned to merge video see-through (VST) cameras with different latencies. Next, using computer vision and image compositing, we develop five new algorithms designed to merge non-uniform video streams from a standard RGB camera and a small form-factor IR camera. We then implement six other existing fusion methods and conduct a series of comparative experiments, including a system-level analysis of the augmented reality (AR) time warping algorithm, a pilot experiment testing perceptual consistency across all eleven merging algorithms, and an in-depth experiment testing the performance of the top algorithms in a VR (simulated AR) search task. Results showed that we can reduce temporal registration error due to inter-camera latency by an average of 87.04%, that the wavelet and inverse stipple algorithms were rated highest perceptually, that noise modulation performed best, and that freedom of user movement is significantly increased with the visualizations engaged.
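The temporal half of VisMerge is a latency-compensating reprojection in the spirit of VR time warping, applied to VST camera streams. Below is a minimal sketch of that idea, assuming a pure-rotation warp model with hypothetical pinhole intrinsics `K` and rotation matrices supplied by some head tracker; it illustrates the general technique, not the authors' implementation:

```python
# Sketch: rotational time warping to compensate inter-camera latency.
# Assumptions (not from the paper): pinhole intrinsics K, world-to-camera
# rotation matrices from a head tracker, and a pure-rotation warp model.
import numpy as np
import cv2

K = np.array([[500.0,   0.0, 320.0],   # hypothetical intrinsics
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def warp_stale_frame(frame, R_capture, R_now):
    """Reproject a frame captured under head rotation R_capture so it
    matches the current rotation R_now (both world-to-camera, 3x3)."""
    R_delta = R_now @ R_capture.T          # rotation accumulated since capture
    H = K @ R_delta @ np.linalg.inv(K)     # homography induced by that rotation
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```

For the spectral half, the abstract names a wavelet-based merge among the top-rated algorithms. The sketch below shows a generic wavelet-domain fusion with a max-absolute-coefficient rule (via PyWavelets); it illustrates the family of methods involved, not necessarily the paper's exact formulation, and it assumes the IR frame is already registered to the RGB view:

```python
# Sketch: generic wavelet-domain fusion of RGB luminance with a registered,
# single-channel IR frame, keeping the larger-magnitude coefficient per band.
import numpy as np
import cv2
import pywt

def wavelet_fuse(rgb, ir, wavelet="db2", level=3):
    luma = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY).astype(np.float32)
    ir = cv2.resize(ir, (luma.shape[1], luma.shape[0])).astype(np.float32)
    ca, cb = (pywt.wavedec2(x, wavelet, level=level) for x in (luma, ir))
    # Approximation band: keep the coefficient with larger magnitude.
    fused = [np.where(np.abs(ca[0]) > np.abs(cb[0]), ca[0], cb[0])]
    # Detail bands (horizontal, vertical, diagonal) at each level: same rule.
    for a_bands, b_bands in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(a) > np.abs(b), a, b)
                           for a, b in zip(a_bands, b_bands)))
    return np.clip(pywt.waverec2(fused, wavelet), 0, 255).astype(np.uint8)
```

In practice a reprojection step like the first sketch would run on whichever stream is staler before fusion, which is what lets a single fused view stay temporally registered despite the RGB and IR cameras having different latencies.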

Original language: English (US)
Title of host publication: Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2017
Editors: Wolfgang Broll, Holger Regenbrecht, J. Edward Swan
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 22-31
Number of pages: 10
ISBN (Electronic): 9781538629437
DOIs
State: Published - Nov 20 2017
Externally published: Yes
Event: 16th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2017 - Nantes, France
Duration: Oct 9 2017 - Oct 13 2017

Publication series

Name: Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2017

Conference

Conference: 16th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2017
Country/Territory: France
City: Nantes
Period: 10/9/17 - 10/13/17

Keywords

  • Augmented reality
  • Image fusion
  • Infrared
  • Timewarping
  • Vision augmentation

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Media Technology
  • Modeling and Simulation
