Computer Science Graduate Seminar
Wednesday, May 26, 2021, 2:00pm
Interaction Techniques for Mid-Air Pen Input in Handheld Augmented Reality
- Philipp Wacker, M.Sc. – Chair for Computer Science 10
- Zoom: https://rwth.zoom.us/j/92362727123?pwd=MjhBT2VaczI0RDE1c2VNQkxGckw3UT09
Augmented Reality changes the way we interact with virtual information. Currently, virtual information is shown on 2D screens, separated from the real world. With Augmented Reality, virtual content can be embedded directly in the real world. This opens up the area of situated modeling, in which virtual models are designed in the context of the real world, for example to print them out with a 3D printer. In an initial study, we show that sketching on physical objects improves stroke accuracy compared to strokes on virtual objects, and that features guiding a stroke, either through a concave or convex shape or through a visual guide, further improve accuracy, especially on physical objects.
The most widely available form of Augmented Reality (AR) is Handheld AR, which embeds the virtual information in the camera view of everyday smartphones or tablets. However, continuously specifying a 3D position (needed, e.g., for drawing in mid-air) is not directly possible in today’s systems. We build the ARPen system to enable situated modeling in Handheld AR, requiring only a 3D-printed pen and a consumer smartphone. Yet many essential interactions remain unclear for such a bimanual system. We design and evaluate selection and manipulation techniques to adjust the pose of a mid-air object, as well as menu techniques to control the properties of objects in the scene. We show that ray-casting techniques, especially through the tip of the pen, generally perform well. However, interacting on the touchscreen, or even combining touchscreen and mid-air input, also achieves promising results. To overcome the difficulty of perceiving the depth of virtual objects in Handheld AR, we design depth visualizations that show the position of the pen tip in relation to other objects in the scene. We find that a heatmap visualization, which colors every object in the scene depending on its distance to the pen tip, achieves the best results and was preferred by study participants.
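The heatmap idea described above can be illustrated as a simple distance-to-color mapping. The sketch below is a hypothetical illustration, not the ARPen implementation; the 0.3 m range and the red-to-blue gradient are assumed values chosen for the example.

```python
def heatmap_color(distance, max_distance=0.3):
    """Map a pen-tip-to-object distance (in meters) to an RGB color.

    Objects near the pen tip are tinted red, distant ones blue;
    distances beyond max_distance are clamped. Both the range and
    the color choice are illustrative assumptions, not values from
    the ARPen system.
    """
    t = min(max(distance / max_distance, 0.0), 1.0)  # normalize to [0, 1]
    # Linear interpolation from red (near) to blue (far).
    return (1.0 - t, 0.0, t)
```

In a scene, such a function would be evaluated per object (or per vertex) every frame, so that the coloring continuously signals how far the pen tip is from each virtual object.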
We release the ARPen system as an open-source toolbox, enabling researchers to implement and evaluate interaction techniques for Handheld AR with a mid-air pen. Our findings on essential interaction techniques provide a starting point for the development and evaluation of specialized application scenarios.
The computer science faculty invites everyone interested to attend.