Eye Gaze-based Object Rotation for Head-mounted Displays

Chang Liu, Jason Orlosky, Alexander Plopski

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations

Abstract

Hands-free manipulation of 3D objects has long been a challenge for augmented and virtual reality (AR/VR). While many methods use eye gaze to assist with hand-based manipulations, interfaces cannot yet provide completely gaze-based 6 degree-of-freedom (DoF) manipulations in an efficient manner. To address this problem, we implemented three methods to handle rotations of virtual objects using gaze: RotBar, a method that maps line-of-sight eye gaze onto per-axis rotations; RotPlane, a method that makes use of orthogonal planes to achieve per-axis angular rotations; and RotBall, a method that combines a traditional arcball with an external ring to handle user-perspective roll manipulations. We validated the efficiency of each method by conducting a user study involving a series of orientation tasks along different axes with each method. Experimental results showed that users could accomplish single-axis orientation tasks significantly faster and more accurately with RotBar and RotPlane than with RotBall. For multi-axis orientation tasks, on the other hand, RotBall significantly outperformed RotBar and RotPlane in terms of both speed and accuracy.
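
For illustration, the RotBall variant builds on the classic arcball technique: two successive gaze points are projected onto a virtual sphere around the object, and the rotation is the quaternion that carries the first projected point onto the second. The Python sketch below shows that standard arcball mapping under the assumption that gaze samples arrive as normalized 2D coordinates; it is a minimal, generic illustration, not the authors' implementation, and it omits the paper's external ring for roll control.

    # Minimal arcball sketch (illustrative only, not the paper's implementation).
    # Assumes gaze samples arrive as normalized 2D points in [-1, 1]^2 on a
    # virtual plane centered on the object being rotated.
    import numpy as np

    def map_to_sphere(p, radius=1.0):
        """Project a 2D gaze point onto the arcball sphere (or its hyperbolic rim)."""
        x, y = p
        d2 = x * x + y * y
        r2 = radius * radius
        if d2 <= r2 / 2.0:
            z = np.sqrt(r2 - d2)           # point falls on the sphere itself
        else:
            z = (r2 / 2.0) / np.sqrt(d2)   # outside the sphere: hyperbolic sheet
        v = np.array([x, y, z])
        return v / np.linalg.norm(v)

    def arcball_quaternion(p_start, p_end):
        """Quaternion (w, x, y, z) rotating the start gaze point onto the end point."""
        v0, v1 = map_to_sphere(p_start), map_to_sphere(p_end)
        axis = np.cross(v0, v1)
        n = np.linalg.norm(axis)
        if n < 1e-9:                        # gaze barely moved: identity rotation
            return np.array([1.0, 0.0, 0.0, 0.0])
        angle = np.arccos(np.clip(np.dot(v0, v1), -1.0, 1.0))
        return np.concatenate(([np.cos(angle / 2.0)], axis / n * np.sin(angle / 2.0)))

    # Example: gaze drifts from the screen center toward the upper right.
    print(arcball_quaternion((0.0, 0.0), (0.3, 0.2)))

A per-axis scheme in the spirit of RotBar or RotPlane could reuse the same idea with the rotation axis fixed in advance, accumulating only the signed angle of gaze movement along that axis.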

Original language: English (US)
Title of host publication: Proceedings - SUI 2020
Subtitle of host publication: ACM Symposium on Spatial User Interaction
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450379434
DOIs
State: Published - Oct 31 2020
Externally published: Yes
Event: 6th ACM Symposium on Spatial User Interaction, SUI 2020 - Virtual, Online, Canada
Duration: Oct 31 2020 - Nov 1 2020

Publication series

Name: Proceedings - SUI 2020: ACM Symposium on Spatial User Interaction

Conference

Conference: 6th ACM Symposium on Spatial User Interaction, SUI 2020
Country/Territory: Canada
City: Virtual, Online
Period: 10/31/20 - 11/1/20

Keywords

  • eye gaze
  • head-mounted display
  • object rotation
  • user interface

ASJC Scopus subject areas

  • Human-Computer Interaction
