Interactive Learning of Physical Object Properties Through Robot Manipulation and Database of Object Measurements (2404.07344v2)

Published 10 Apr 2024 in cs.RO, cs.AI, cs.IT, and math.IT

Abstract: This work presents a framework for automatically extracting physical object properties, such as material composition, mass, volume, and stiffness, through robot manipulation and a database of object measurements. The framework involves exploratory action selection to maximize learning about objects on a table. A Bayesian network models conditional dependencies between object properties, incorporating prior probability distributions and the uncertainty associated with measurement actions. The algorithm selects optimal exploratory actions based on expected information gain and updates object properties through Bayesian inference. Experimental evaluation demonstrates effective action selection compared to a baseline, and correct termination of the experiments when there is nothing more to be learned. The algorithm behaved intelligently when presented with trick objects whose material properties conflict with their appearance. The robot pipeline integrates with a logging module and an online database of objects, containing over 24,000 measurements of 63 objects with different grippers. All code and data are publicly available, facilitating automatic digitization of objects and their physical properties through exploratory manipulations.

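The abstract sketches the core loop: maintain a belief over object properties, score each candidate exploratory action by its expected information gain, execute the best one, update the belief via Bayes' rule, and terminate when no action promises further gain. Below is a minimal Python sketch of that loop for a single discrete property; the property values, action names, likelihood tables, and the `min_gain` threshold are all illustrative assumptions for this sketch, not the paper's actual models (which use a full Bayesian network over several properties).

```python
import numpy as np

# Hypothesis space for one object property. Values, actions, and likelihoods
# below are illustrative assumptions, not the paper's measurement models.
MATERIALS = ["metal", "ceramic", "soft plastic"]

def entropy(p: np.ndarray) -> float:
    """Shannon entropy (bits) of a discrete distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def expected_information_gain(belief: np.ndarray, likelihood: np.ndarray) -> float:
    """Expected entropy reduction from one measurement action.

    likelihood[o, m] = P(observation o | property value m).
    """
    h_prior = entropy(belief)
    eig = 0.0
    for obs_lik in likelihood:               # iterate over discrete outcomes o
        p_obs = float(obs_lik @ belief)      # P(o) = sum_m P(o|m) P(m)
        if p_obs > 0:
            posterior = obs_lik * belief / p_obs  # Bayes update given outcome o
            eig += p_obs * (h_prior - entropy(posterior))
    return eig

def select_action(belief, action_models, min_gain=0.05):
    """Greedily pick the action with the highest expected information gain;
    return None (terminate) when no action is expected to yield more than
    `min_gain` bits -- i.e., there is nothing more to be learned."""
    best_action, best_gain = None, min_gain
    for action, likelihood in action_models.items():
        gain = expected_information_gain(belief, likelihood)
        if gain > best_gain:
            best_action, best_gain = action, gain
    return best_action

# Illustrative run: a squeeze action discriminates stiffness well, while a
# visual check mostly reflects appearance (and can be fooled by trick objects).
belief = np.array([1/3, 1/3, 1/3])           # uniform prior over MATERIALS
action_models = {
    "squeeze": np.array([[0.9, 0.7, 0.1],    # outcome "stiff"
                         [0.1, 0.3, 0.9]]),  # outcome "compliant"
    "look":    np.array([[0.6, 0.3, 0.3],    # outcome "looks metallic"
                         [0.4, 0.7, 0.7]]),  # outcome "looks non-metallic"
}
print(select_action(belief, action_models))  # -> "squeeze" (higher expected gain)
```

In the paper's setting the belief would be a joint Bayesian network over material, mass, volume, and stiffness rather than a single categorical variable, but the gain-driven selection and termination logic is the same in spirit.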
