TY - JOUR
T1 - OrthoGaze
T2 - Gaze-based three-dimensional object manipulation using orthogonal planes
AU - Liu, Chang
AU - Plopski, Alexander
AU - Orlosky, Jason
N1 - Funding Information:
This work was funded in part by the United States Department of the Navy, Office of Naval Research, Grant #N62909-18-1-2036.
Publisher Copyright:
© 2020 Elsevier Ltd
PY - 2020/6
Y1 - 2020/6
N2 - In virtual and augmented reality, gaze-based methods have been explored for decades as effective user interfaces for hands-free interaction. Though several well-known gaze-based methods exist for simple interactions such as selection, no solutions exist for 3D manipulation tasks that require higher degrees of freedom (DoF). In this paper, we introduce OrthoGaze, a novel user interface that allows users to intuitively manipulate the three-dimensional position of a virtual object using only their eye or head gaze. Our approach makes use of three selectable, orthogonal planes, where each plane not only helps guide the user's gaze in an arbitrary virtual space, but also allows for 2-DoF manipulation of object position. To evaluate our method, we conducted two user studies involving aiming and docking tasks in virtual reality to examine the fundamental characteristics of sustained gaze aiming and to determine which type of gaze-based control performs best when combined with OrthoGaze. Results showed that eye gaze was more accurate than head gaze for sustained aiming. Additionally, eye and head gaze-based control for 3D manipulation achieved 78% and 96% of the performance of a hand-held controller, respectively. Subjective results also suggest that gaze-based manipulation causes more overall fatigue than controller-based manipulation. Based on the experimental results, we expect OrthoGaze to become an effective method for purely hands-free object manipulation in head-mounted displays.
AB - In virtual and augmented reality, gaze-based methods have been explored for decades as effective user interfaces for hands-free interaction. Though several well-known gaze-based methods exist for simple interactions such as selection, no solutions exist for 3D manipulation tasks that require higher degrees of freedom (DoF). In this paper, we introduce OrthoGaze, a novel user interface that allows users to intuitively manipulate the three-dimensional position of a virtual object using only their eye or head gaze. Our approach makes use of three selectable, orthogonal planes, where each plane not only helps guide the user's gaze in an arbitrary virtual space, but also allows for 2-DoF manipulation of object position. To evaluate our method, we conducted two user studies involving aiming and docking tasks in virtual reality to examine the fundamental characteristics of sustained gaze aiming and to determine which type of gaze-based control performs best when combined with OrthoGaze. Results showed that eye gaze was more accurate than head gaze for sustained aiming. Additionally, eye and head gaze-based control for 3D manipulation achieved 78% and 96% of the performance of a hand-held controller, respectively. Subjective results also suggest that gaze-based manipulation causes more overall fatigue than controller-based manipulation. Based on the experimental results, we expect OrthoGaze to become an effective method for purely hands-free object manipulation in head-mounted displays.
KW - Eye tracking
KW - Human-computer interaction
KW - Object manipulation
KW - User interface
UR - http://www.scopus.com/inward/record.url?scp=85084676499&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084676499&partnerID=8YFLogxK
U2 - 10.1016/j.cag.2020.04.005
DO - 10.1016/j.cag.2020.04.005
M3 - Article
AN - SCOPUS:85084676499
VL - 89
SP - 1
EP - 10
JO - Computers & Graphics
JF - Computers & Graphics
SN - 0097-8493
ER -