To appear in CHI 2012
More information about the project is available in this blog post.
Portable projectors in mobile devices provide a promising way to overcome screen-space limitations on handhelds, navigate information, or augment reality. One of their appeals is the simplicity of interaction: Aiming at an appropriate surface projects the image, and changing posture and direction adjusts the image's position and orientation. This behavior is purely based on optics, allowing us to intuitively grasp it based on our own experience with the physical world. However, strict adherence to the laws of physics also has its drawbacks: The intensity of light varies with the projector's distance to the surface, and the projected image is tightly coupled to the projector's movement.
In this paper, we apply the metaphor of optical projection to digital surfaces in the environment. We use a handheld device, tracked in 6 DOF, to support Virtual Projection (VP) on one or more displays. The simulated nature of VP allows us to address some of the limitations of optical projection, avoiding unwanted distortions, jitter, and intensity variations, and eliminating the need to continually point the projector at the surface on which it is projecting. This also frees the frustum so that it can be used for selecting areas, either for navigation or for applying filters.
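The optical-projection metaphor above can be illustrated with a minimal geometric sketch (a hypothetical example, not the paper's implementation): each corner ray of the tracked device's frustum is intersected with the display plane to find the projected quad, here assuming the display lies in the z = 0 plane.

```python
# Hypothetical sketch of the optical-projection metaphor: intersect the
# tracked device's frustum corner rays with a display plane (assumed to
# be z = 0) to obtain the projected quad.

def ray_plane_z0(origin, direction):
    """Intersect a ray with the plane z = 0; return (x, y) or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:
        return None  # ray is parallel to the display plane
    t = -oz / dz
    if t <= 0:
        return None  # display is behind the device
    return (ox + t * dx, oy + t * dy)

def project_frustum(origin, corner_dirs):
    """Map the four frustum corner rays onto the display plane."""
    return [ray_plane_z0(origin, d) for d in corner_dirs]

# Device held 1 m in front of the display, aimed straight at it with a
# symmetric frustum: the projected quad is a centered square.
quad = project_frustum(
    (0.0, 0.0, 1.0),
    [(-0.5, -0.5, -1.0), (0.5, -0.5, -1.0),
     (0.5, 0.5, -1.0), (-0.5, 0.5, -1.0)],
)
```

The decoupling described above corresponds to simply freezing `quad` when the user releases: the stored quad no longer follows the device, which eliminates jitter and frees the frustum for selection.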
Our work makes several contributions: (1) We explore the implications of VP as an interaction technique and show how decoupling the projection from the projector and adjusting transformations can improve interaction. (2) We describe relevant characteristics of VP. (3) We present an implemented software framework for creating VP applications for consumer smartphones that does not require external tracking, and show exemplary use cases. (4) We report on a user study comparing VP with both absolute and relative techniques for content placement using a handheld device. Our findings suggest that VP is especially suitable for complex (i.e., translate-scale-rotate) projections.
VP walkthrough (click to enlarge): (a) Shaking the device to create a view. (b) Interacting with a non-projected view. (c-e) Creating a projection by aiming at the secondary display, long pressing, and releasing. (f) Synchronized interaction. (g) The projection frustum can be used for filtering or navigating. (h) Projections can be deleted by aiming, long pressing, and dragging out of the display.
Baur, D., Boring, S. and Feiner, S.
Virtual Projection: Exploring Optical Projection as a Metaphor for Multi-Device Interaction
To appear in Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2012), Austin, TX, USA, May 5-10. ACM Press, 10 pages.
CHI 2012 paper (coming soon).