Using augmented reality to detect eye direction

We internally prototyped an app that plays video until you stop looking at it, built with ARKit and Swift using the iPhone's TrueDepth IR camera.

The idea was to see how long people could keep watching the content versus their surroundings, and to cue users in to their lack of focus.

If a user's gaze drifted too far from the screen, variable blurring was applied using a ZoomBlur effect. Depth was also measured, blurring the screen if the user's face was too far away.
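A minimal sketch of how this can work with ARKit's face tracking: the face anchor's `lookAtPoint` gives an estimated gaze point in face-anchor space, and the anchor's transform gives the face's distance from the camera. The class name, thresholds, and the `applyZoomBlur` helper below are illustrative assumptions, not the original implementation.

```swift
import ARKit
import CoreImage

final class GazeBlurController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint is in face-anchor space; its x/y offset
        // approximates how far the gaze deviates from straight ahead.
        let gaze = face.lookAtPoint
        let deviation = sqrt(gaze.x * gaze.x + gaze.y * gaze.y)

        // Distance of the face from the camera, taken from the
        // anchor transform's translation column (in meters).
        let distance = simd_length(simd_make_float3(face.transform.columns.3))

        // Illustrative thresholds: blur ramps up as gaze drifts
        // off-screen or the face moves too far away.
        let gazeBlur = max(0, deviation - 0.1) * 40
        let depthBlur = max(0, distance - 0.5) * 20
        applyZoomBlur(amount: CGFloat(gazeBlur + depthBlur))
    }

    private func applyZoomBlur(amount: CGFloat) {
        // CIZoomBlur is a built-in Core Image filter; an amount of 0 means no blur.
        // In the real app, the filter output would be composited over the video layer.
        let filter = CIFilter(name: "CIZoomBlur")
        filter?.setValue(amount, forKey: kCIInputAmountKey)
        // ... render the filter output into the video view
    }
}
```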

Project released: 2019
My responsibilities included prototyping and development.