With the advent of transparent wearable displays, new challenges arise. Because these devices sit permanently in front of the eyes, one of the main challenges is to design interfaces that do not pull the user away from whatever they are doing in "the real world", for example by distracting them with sudden notifications from "the virtual world". For this, it would be useful for the user interface to know which world currently holds the user's attention: if the UI knows they are busy in the real world, it can avoid disturbing them.
During my internship at Nokia, I looked into the information that the wearer's eye movements can provide about where their attention is located. I designed and implemented several techniques to do so, none of which gives absolute certainty on its own, but which, combined, can form a robust system to detect whether the wearer is looking at the screen or through it.
For example, when we look at an object in the real world while moving our head, our eyes perform a smooth compensatory motion to keep the object in sight, called the VOR (Vestibulo-Ocular Reflex). However, I noticed that when we look at something on the screen and move our head, the eyes stay still and do not need to perform the VOR, since the screen moves with the head. Thus, if we track head motion and then observe what the eyes are doing, we can get a good estimate of where the attention is located.
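As a rough illustration of this cue, here is a minimal sketch in Python. It is not the actual implementation from the paper: the function name, the velocity inputs, and the threshold values are all assumptions made for the example.

```python
# Hypothetical sketch: classify attention from head and eye angular
# velocities. Thresholds and function names are illustrative only.

def classify_attention(head_vel, eye_vel,
                       head_moving_thresh=20.0,  # deg/s, assumed value
                       vor_gain_thresh=0.5):     # assumed value
    """Return 'real-world', 'screen', or 'unknown' for one sample.

    head_vel: head angular velocity in deg/s (e.g. from an IMU)
    eye_vel:  eye-in-head angular velocity in deg/s (from an eye tracker)
    """
    if abs(head_vel) < head_moving_thresh:
        # Head is (nearly) still: this cue gives no information.
        return "unknown"
    # During the VOR, the eye counter-rotates against the head with a
    # gain near 1, so -eye_vel / head_vel is close to 1. When fixating
    # the head-locked screen, the eye stays still and the ratio is near 0.
    gain = -eye_vel / head_vel
    return "real-world" if gain > vor_gain_thresh else "screen"

# Head turning at 60 deg/s, eyes counter-rotating: a real-world target.
print(classify_attention(60.0, -55.0))  # -> real-world
# Head turning, eyes still: the target moves with the head (the screen).
print(classify_attention(60.0, 2.0))    # -> screen
```

A real system would of course smooth these signals over time and combine this cue with the others before deciding anything.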
Once the UI knows where the user is looking, the eyes can still help to manage the information in a subtle and discreet way. For example, it can take advantage of the fact that, after a blink, we do not notice large changes that happened suddenly in our visual field. This is known as change blindness. If the UI needs to update its contents or display a notification, it can wait until the user blinks, and fade the notification in a little more every time a blink happens.
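The blink-gated fade-in could be sketched as follows. Again, this is a toy illustration, not the paper's implementation: the class, the event hookup, and the per-blink opacity step are assumptions.

```python
# Hypothetical sketch of blink-gated fading: a pending notification only
# gains opacity at the moment of a blink, exploiting change blindness so
# the discrete jumps in opacity go unnoticed.

class BlinkGatedNotification:
    def __init__(self, text, step=0.25):
        self.text = text
        self.opacity = 0.0  # invisible until the first blink
        self.step = step    # opacity added per blink (assumed value)

    def on_blink(self):
        """Called by the eye tracker whenever a blink is detected."""
        if self.opacity < 1.0:
            # Apply the change while the eyes are closed, so the sudden
            # jump is masked by the blink.
            self.opacity = min(1.0, self.opacity + self.step)

note = BlinkGatedNotification("New message")
for _ in range(3):
    note.on_blink()
print(note.opacity)  # -> 0.75, fully visible after one more blink
```

The same gating idea applies to any large UI update, not just notifications: batch the change and release it during a blink.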
There are more ideas and techniques described in the paper!