Preprint Touch-less Interactive Augmented Reality Game on Vision Based Wearable Device (1504.06359v5)

Published 23 Apr 2015 in cs.HC

Abstract: This is the preprint version of our paper in Personal and Ubiquitous Computing. There is increasing interest in creating pervasive games based on emerging interaction technologies. To develop touch-less, interactive, augmented reality games on a vision-based wearable device, a touch-less motion interaction technology is designed and evaluated in this work. Users interact with the augmented reality games through dynamic hand/foot gestures in front of the camera, which trigger interaction events with the virtual objects in the scene. Three primitive augmented reality games with eleven dynamic gestures are developed on top of the proposed touch-less interaction technology as a proof of concept. Finally, a comparative evaluation is presented to demonstrate the social acceptability and usability of the touch-less approach, running on a hybrid wearable framework or on Google Glass, along with an assessment of workload, users' emotions, and satisfaction.

Citations (321)

Summary

  • The paper introduces a novel vision-based gesture interaction method using contour matching and TLD frameworks to enable touch-less AR gameplay.
  • It demonstrates robust real-time performance at over 10 frames per second with 99.76% gesture recognition accuracy, addressing the occlusion and fat-finger problems.
  • User studies on devices like Google Glass confirm enhanced interaction satisfaction, suggesting broader applications beyond gaming.

Overview of Touch-less Interactive Augmented Reality Games on Vision-Based Wearable Devices

This paper explores the development of touch-less interactive augmented reality (AR) games on vision-based wearable devices. The authors propose a touch-less motion interaction technology that lets users engage with virtual objects through dynamic hand and foot gestures detected by a camera. By eliminating the need for direct touch input, the method addresses prevailing issues in AR interaction, such as the occlusion and fat-finger problems, and improves interaction precision.

Key Contributions

Three primitive AR games are developed to demonstrate the efficacy of the touch-less interaction technology. The system comprises two integral components:

  1. Vision-based Wearable Hardware: Two types of wearable devices are used: a hybrid framework that mounts a smartphone on the wrist or knee, and Google Glass.
  2. Touch-less Interaction Software: The software uses gesture recognition algorithms, specifically contour-based template matching (CTM) and the tracking-learning-detection (TLD) framework, to achieve accurate real-time gesture interaction.

The CTM algorithm uses a dynamic programming approach to localize hand and foot contours and employs lightweight skin detection to support tracking. The TLD framework strengthens the tracking process, making it suitable for mobile devices.
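
To make this concrete, the sketch below shows a skin-segmentation and contour-matching step in Python with OpenCV. The YCrCb thresholds, area filter, and Hu-moment shape distance are illustrative assumptions standing in for the paper's CTM algorithm, and the TLD tracking stage is omitted.

```python
# Minimal sketch of skin segmentation plus contour matching, assuming an
# OpenCV-based pipeline. Thresholds, area filter, and shape distance are
# illustrative stand-ins for the paper's CTM algorithm, not its exact method.
import cv2
import numpy as np

# Hypothetical YCrCb skin bounds; the paper's lightweight skin detector
# may use a different color space or learned thresholds.
SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)

def match_hand_contour(frame_bgr, template_contour, max_distance=0.3):
    """Return the skin-colored contour closest in shape to the template, or None."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_dist = None, max_distance
    for contour in contours:
        if cv2.contourArea(contour) < 1000:   # skip small skin-colored blobs
            continue
        # Hu-moment shape distance as a stand-in for the paper's
        # dynamic-programming contour matching.
        dist = cv2.matchShapes(contour, template_contour,
                               cv2.CONTOURS_MATCH_I1, 0.0)
        if dist < best_dist:
            best, best_dist = contour, dist
    return best
```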

Technical Evaluation and Performance

The authors report that the system achieves real-time performance at more than 10 frames per second, even under fast motion and image blur. Gesture recognition reaches a success rate of 99.76% across the tested video datasets, highlighting the system's robustness. In terms of power, touch-less interaction consumes about 20% more battery than touch-based interaction in the trial scenarios.

However, a battery saving of about 12% is observed when the device runs without active gesture computation, suggesting that power can be conserved when interaction demands are low.
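
As a rough illustration of how the throughput figure could be reproduced on recorded clips, the sketch below times the hypothetical match_hand_contour() helper from the previous snippet; the authors' actual benchmark setup on the wearable hardware is not described here.

```python
# Rough throughput check over a recorded clip, reusing the hypothetical
# match_hand_contour() helper above; not the paper's benchmark protocol.
import time
import cv2

def measure_fps(video_path, template_contour, max_frames=300):
    """Return average frames per second of the detection step on a clip."""
    cap = cv2.VideoCapture(video_path)
    processed = 0
    start = time.perf_counter()
    while processed < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        match_hand_contour(frame, template_contour)
        processed += 1
    cap.release()
    elapsed = time.perf_counter() - start
    return processed / elapsed if elapsed > 0 else 0.0
```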

Implications and User Study

The paper provides substantial insight into the implications of touch-less interaction for user engagement with wearable devices. A comprehensive user study investigates social acceptability, usability, workload, emotional response, and satisfaction. The results indicate a strong user preference for the convenience and novelty of the touch-less interaction style, particularly on smart glasses such as Google Glass, suggesting potential shifts in how augmented reality can be integrated into daily user experiences.

Future Directions

The research outlines a promising path for augmented reality applications beyond gaming. Future work could explore full-body gesture interaction, integrate additional sensor technologies to enrich interaction, and apply the interaction model to broader fields such as clinical assistance and geographic data visualization, making AR systems more intuitive and more seamlessly integrated into users' everyday contexts.

Conclusion

In conclusion, the paper presents a carefully designed and evaluated system that advances touch-less interaction for augmented reality games on wearable devices. Although currently limited to hand and foot gestures, the approach has broad implications for how users interact with digital environments across many applications. The evaluation shows that touch-less interaction on wearables such as Google Glass is not only feasible but also improves the user experience, setting the stage for further technological advances and applications.