GEAR: A Gaze-Enabled Augmented Reality System

Gaze-enabled systems were proposed decades ago, but only recently have the computing power and connectivity of mobile devices made it possible to interactively understand a user and their environment with systems that are ergonomic and affordable. We believe that promising and generalizable solutions can now be achieved by developing a gaze-enabled Augmented Reality (AR) system (GEAR) that dynamically adapts the information load on a display to the changing situation of its user and surroundings.

For instance, in an industrial setting, safety is a central concern when a human operator interacts with a robot. In this context, we will develop a system that employs a mobile eye tracker as a diagnostic tool to understand the user's cognitive state. GEAR aims to direct its user's attention to relevant artifacts, which can be physical (e.g., an emergency button or a robot) or virtual (e.g., interfaces to control the robot's speed). Furthermore, GEAR will need to integrate information about the user, the robot, and the heterogeneous virtual and physical artifacts available to the system. We therefore plan to carry out a contextual assessment that integrates sensors available in the environment, semantic models and information about the user, and state and prediction models of the robot and its behavior. Finally, GEAR will also need to highlight safety-specific information within the operator's visual field using an AR display, as illustrated in the sketch below. Additional details can be found here.
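To give a flavor of the kind of gaze-contingent logic described above, the following minimal Python sketch shows how gaze fixations could be matched against known artifacts, and how the most safety-critical artifact the operator has not looked at recently could be selected for highlighting. All names (Artifact, select_highlight), thresholds, and data structures here are hypothetical illustrations, not GEAR's actual implementation.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Artifact:
    """A physical or virtual artifact the system can highlight (hypothetical model)."""
    name: str
    position: tuple  # (x, y) in normalized display coordinates
    safety_priority: int  # higher = more safety-critical
    last_attended: float = 0.0  # timestamp of the most recent fixation on it

GAZE_RADIUS = 0.05       # a fixation within this radius counts as "on" an artifact
ATTENTION_TIMEOUT = 5.0  # seconds before a critical artifact needs re-attention

def update_attention(artifacts, gaze_point, now):
    """Mark artifacts as attended when the current gaze fixation falls on them."""
    for a in artifacts:
        if math.dist(a.position, gaze_point) <= GAZE_RADIUS:
            a.last_attended = now

def select_highlight(artifacts, now):
    """Return the highest-priority artifact the user has not looked at recently."""
    neglected = [a for a in artifacts
                 if now - a.last_attended > ATTENTION_TIMEOUT]
    return max(neglected, key=lambda a: a.safety_priority, default=None)

# Example scene: a physical emergency button and a virtual speed-control panel.
scene = [
    Artifact("emergency_button", (0.8, 0.2), safety_priority=10),
    Artifact("speed_control_ui", (0.3, 0.7), safety_priority=5),
]
now = time.monotonic()
update_attention(scene, gaze_point=(0.31, 0.69), now=now)
target = select_highlight(scene, now)
if target:
    print(f"Highlight {target.name} in the AR display")
```

In a full system, the gaze point would stream from the mobile eye tracker, and the artifact list and priorities would come from the contextual assessment (environment sensors, semantic models, and robot state) rather than being hard-coded.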

This project is funded by the University of St. Gallen and started in May 2021.

Author: Dr. Kenan Bektas

Date: 20 May 2021