ScreenJump: An AR-facilitated User-centric Interaction System for Fine-grained Resource Manipulation Across Displays

Abstract

There is an increasing demand for remote manipulation of digital resources across different computers. In this paper, we propose ScreenJump, an AR-facilitated cross-device interaction system that enables user-centric, fine-grained resource manipulation across computer displays. Each computer encodes its identity into screen blinks that can be detected by cameras but are invisible to human eyes. An AR headset then uses its cameras to localize the surrounding displays and connect with the corresponding computers. The relative positions of fine-grained resources (e.g., pictures, text paragraphs, UI elements) within each display are computed and shared with the AR headset, so users can select and manipulate these resources with in-air gestures. We describe the system design and implementation in detail, along with three application use cases that could benefit from ScreenJump.
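To make the identity-encoding idea concrete, here is a minimal illustrative sketch (not the authors' actual protocol) of how a display ID could be modulated into subtle per-frame brightness offsets. Each bit is Manchester-coded as a complementary pair of frames, so the time-averaged brightness is unchanged and the blink stays imperceptible at high refresh rates while remaining detectable by a camera. The `DELTA` value, bit width, and coding scheme are all assumptions for illustration.

```python
# Hypothetical sketch of screen-blink identity encoding.
# Assumption: each bit becomes a pair of complementary brightness offsets,
# so the average luminance over any bit period is zero (invisible to eyes).

DELTA = 2  # assumed brightness offset in 8-bit levels; small enough to hide

def encode_id(display_id: int, n_bits: int = 8) -> list[int]:
    """Turn a display ID into a sequence of per-frame brightness offsets."""
    bits = [(display_id >> i) & 1 for i in reversed(range(n_bits))]
    offsets = []
    for bit in bits:
        # bit 1 -> (+DELTA, -DELTA); bit 0 -> (-DELTA, +DELTA)
        offsets += [+DELTA, -DELTA] if bit else [-DELTA, +DELTA]
    return offsets

def decode_id(offsets: list[int]) -> int:
    """Camera side: recover the ID from observed frame-pair differences."""
    value = 0
    for i in range(0, len(offsets), 2):
        bit = 1 if offsets[i] > offsets[i + 1] else 0
        value = (value << 1) | bit
    return value

# Round trip: the decoder recovers the encoded identity.
assert decode_id(encode_id(0xA7)) == 0xA7
```

A real system would additionally need frame synchronization and robustness to camera exposure and rolling-shutter effects, which this sketch omits.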

Publication
In Adjunct Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.