Avatarm: an Avatar With Manipulation Capabilities for the Physical Metaverse (2303.15187v2)
Abstract: The metaverse is an immersive shared space that remote users access through virtual and augmented reality interfaces, enabling their avatars to interact with each other and with the surrounding environment. Although digital objects can be manipulated, physical objects cannot be touched, grasped, or moved within the metaverse due to the lack of a suitable interface. This work proposes a solution to overcome this limitation by introducing the concept of a Physical Metaverse enabled by a new interface named "Avatarm". The Avatarm consists of an avatar enhanced with a robotic arm that performs physical manipulation tasks while remaining entirely hidden in the metaverse. Users thus have the illusion that the avatar is directly manipulating objects, without the mediation of a robot. The Avatarm is a first step towards a new metaverse, the "Physical Metaverse", where users can physically interact with each other and with the environment.
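To make the architecture implied by the abstract concrete, the sketch below shows one possible control/render cycle: the same tracked hand pose simultaneously drives the (physically manipulating, visually hidden) robot arm and the rendered avatar hand. This is a minimal illustration under assumptions; the function names, the Cartesian pose representation, and the direct one-to-one mapping are not from the paper and stand in for the actual tracking, robot control, and diminished-reality rendering components.

```python
# Minimal sketch of the Avatarm data flow (illustrative assumptions only:
# the paper does not specify this API or a direct Cartesian mapping).
from dataclasses import dataclass


@dataclass
class Pose:
    """Cartesian position of a tracked hand or of the robot end effector (metres)."""
    x: float
    y: float
    z: float


def track_user_hand() -> Pose:
    """Placeholder for the VR interface that tracks the remote user's hand."""
    return Pose(0.30, 0.10, 0.25)


def command_robot_arm(target: Pose) -> None:
    """Placeholder for the robot controller: the arm physically reaches `target`."""
    print(f"robot arm -> ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")


def render_avatar_hand(target: Pose) -> None:
    """Render only the avatar's hand at `target`; the robot is visually removed
    (diminished reality), so the user sees the avatar grasping the object."""
    print(f"avatar hand rendered at ({target.x:.2f}, {target.y:.2f}, {target.z:.2f})")


def step() -> None:
    """One control/render cycle: a single hand pose drives both the hidden
    robot (physical manipulation) and the visible avatar (virtual rendering)."""
    hand = track_user_hand()
    command_robot_arm(hand)   # physical side: the robot performs the manipulation
    render_avatar_hand(hand)  # virtual side: only the avatar is shown to users


if __name__ == "__main__":
    step()
```

The key design point this sketch highlights is the split between the physical channel (robot arm) and the visual channel (avatar rendering), which is what lets the robot stay hidden while objects are actually moved.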