TY - GEN
T1 - Monocular focus estimation method for a freely-orienting eye using Purkinje-Sanson images
AU - Itoh, Yuta
AU - Orlosky, Jason
AU - Kiyokawa, Kiyoshi
AU - Amano, Toshiyuki
AU - Sugimoto, Maki
N1 - Funding Information:
This work was supported by CREST JST and JSPS Grant-in-Aid for Scientific Research (B) 15H02738.
Publisher Copyright:
© 2017 IEEE.
PY - 2017/4/4
Y1 - 2017/4/4
N2 - We present a method for focal distance estimation of a freely-orienting eye using Purkinje-Sanson (PS) images, which are reflections of light on the inner structures of the eye. Using an infrared camera with a rigidly-fixed LED, our method creates an estimation model based on 3D gaze and the distance between reflections in the PS images that occur on the corneal surface and the anterior surface of the eye lens. The distance between these two reflections changes with focus, so we associate that information with the user's focal distance. Unlike conventional methods that mainly rely on 2D pupil size, which is sensitive to scene lighting, and on the fourth PS image, our method detects the third PS image, which is more representative of accommodation. Our feasibility study on a single user with a focal range from 15 to 45 cm shows that our method achieves mean and median absolute errors of 3.15 and 1.93 cm for a 10-degree viewing angle. The study also shows that our method is tolerant to environmental lighting changes.
AB - We present a method for focal distance estimation of a freely-orienting eye using Purkinje-Sanson (PS) images, which are reflections of light on the inner structures of the eye. Using an infrared camera with a rigidly-fixed LED, our method creates an estimation model based on 3D gaze and the distance between reflections in the PS images that occur on the corneal surface and the anterior surface of the eye lens. The distance between these two reflections changes with focus, so we associate that information with the user's focal distance. Unlike conventional methods that mainly rely on 2D pupil size, which is sensitive to scene lighting, and on the fourth PS image, our method detects the third PS image, which is more representative of accommodation. Our feasibility study on a single user with a focal range from 15 to 45 cm shows that our method achieves mean and median absolute errors of 3.15 and 1.93 cm for a 10-degree viewing angle. The study also shows that our method is tolerant to environmental lighting changes.
KW - H.5.1 [Information interfaces and presentation]: Multimedia information systems - Artificial, augmented, and virtual realities
UR - http://www.scopus.com/inward/record.url?scp=85018401858&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85018401858&partnerID=8YFLogxK
U2 - 10.1109/VR.2017.7892252
DO - 10.1109/VR.2017.7892252
M3 - Conference contribution
AN - SCOPUS:85018401858
T3 - Proceedings - IEEE Virtual Reality
SP - 213
EP - 214
BT - 2017 IEEE Virtual Reality, VR 2017 - Proceedings
PB - IEEE Computer Society
T2 - 19th IEEE Virtual Reality, VR 2017
Y2 - 18 March 2017 through 22 March 2017
ER -