TY - GEN
T1 - Toward Parallel Consciousness
T2 - 2017 International Symposium on Ubiquitous Virtual Reality, ISUVR 2017
AU - Orlosky, Jason
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/7/21
Y1 - 2017/7/21
N2 - In recent years, see-through display technology has begun to reach a point where we will have the ability to continuously display virtual information in a variety of real-world situations. However, augmented reality (AR) interfaces are currently limited in their ability to interact with the wearer and environment to provide specific, safe, and useful information when needed. Moreover, many questions remain about how to make content more relevant, especially in dynamic applications like rescue or manufacturing. By overcoming these issues, visual perception and cognition can potentially be enhanced beyond innate human ability. This paper describes the notion of Parallel Consciousness, the idea that technology can function as an extension of human memory and cognition, and outlines a framework to implement such an interface using AR. This involves understanding both the environment and the user's mental and visual states to more effectively augment vision, and managing the retrieval of content to improve enhancements and assist both cognitive function and memory. To achieve these goals, we are exploring unique combinations of eye tracking and Artificial Intelligence (AI) to help monitor user attention and cognitive state. We hypothesize that by using these resulting states in conjunction with environmental analysis, we can better automate the retrieval and merging of virtual content into the user's view.
AB - In recent years, see-through display technology has begun to reach a point where we will have the ability to continuously display virtual information in a variety of real-world situations. However, augmented reality (AR) interfaces are currently limited in their ability to interact with the wearer and environment to provide specific, safe, and useful information when needed. Moreover, many questions remain about how to make content more relevant, especially in dynamic applications like rescue or manufacturing. By overcoming these issues, visual perception and cognition can potentially be enhanced beyond innate human ability. This paper describes the notion of Parallel Consciousness, the idea that technology can function as an extension of human memory and cognition, and outlines a framework to implement such an interface using AR. This involves understanding both the environment and the user's mental and visual states to more effectively augment vision, and managing the retrieval of content to improve enhancements and assist both cognitive function and memory. To achieve these goals, we are exploring unique combinations of eye tracking and Artificial Intelligence (AI) to help monitor user attention and cognitive state. We hypothesize that by using these resulting states in conjunction with environmental analysis, we can better automate the retrieval and merging of virtual content into the user's view.
KW - augmented reality
KW - cognitive state
KW - eye tracking
UR - http://www.scopus.com/inward/record.url?scp=85028527036&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85028527036&partnerID=8YFLogxK
DO - 10.1109/ISUVR.2017.19
M3 - Conference contribution
AN - SCOPUS:85028527036
T3 - Proceedings - 2017 International Symposium on Ubiquitous Virtual Reality, ISUVR 2017
SP - 34
EP - 37
BT - Proceedings - 2017 International Symposium on Ubiquitous Virtual Reality, ISUVR 2017
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 27 June 2017 through 29 June 2017
ER -