Pose-free object classification from surface contact features in sequences of robotic grasps (2403.19840v1)
Abstract: In this work, we propose two cost-efficient methods for object identification, using a multi-fingered robotic hand equipped with proprioceptive sensing. Both methods are trained on known objects and rely on a limited set of features, obtained during a few grasps of an object. Unlike most methods in the literature, ours do not rely on knowledge of the relative pose between object and hand, which greatly expands the domain of application. However, if that knowledge is available, we propose an additional active exploration step that reduces the overall number of grasps required for good recognition of the object. One method uses the contact positions and normals; the other uses the contact positions alone. We test the proposed methods in the GraspIt! simulator and show that haptic-based object classification is possible in pose-free conditions. We evaluate the parameters that produce the most accurate results and require the fewest grasps for classification.
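The key idea behind pose-free recognition is to compute features from the contacts that are invariant to any rigid transform of the object, so no object pose estimate is ever needed. The sketch below is illustrative only, not the paper's method: it builds a descriptor from pairwise contact-point distances and angles between contact normals (both rigid-transform invariants), histograms them so grasps with different contact counts are comparable, and classifies with a nearest-neighbour vote over a few grasps. The histogram ranges and the nearest-neighbour rule are assumptions chosen for the example.

```python
import numpy as np

def pose_free_features(positions, normals, bins=8, max_dist=0.5):
    """Illustrative pose-invariant descriptor for one grasp's contacts.

    positions: (k, 3) contact points; normals: (k, 3) unit contact normals.
    Pairwise distances and normal angles do not change under any rigid
    transform of the object, so the descriptor needs no pose estimate.
    """
    k = len(positions)
    dists, angles = [], []
    for i in range(k):
        for j in range(i + 1, k):
            dists.append(np.linalg.norm(positions[i] - positions[j]))
            cos = np.clip(normals[i] @ normals[j], -1.0, 1.0)
            angles.append(np.arccos(cos))
    # Histogram both invariant sets so grasps with different numbers of
    # contacts yield fixed-length, comparable feature vectors.
    hd, _ = np.histogram(dists, bins=bins, range=(0.0, max_dist), density=True)
    ha, _ = np.histogram(angles, bins=bins, range=(0.0, np.pi), density=True)
    return np.concatenate([hd, ha])

def classify(grasp_feats, training_set):
    """Majority vote of nearest-neighbour labels over several grasps.

    training_set: dict mapping object label -> list of feature vectors.
    """
    votes = {}
    for f in grasp_feats:
        label = min(training_set, key=lambda lbl: min(
            np.linalg.norm(f - t) for t in training_set[lbl]))
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

Because the features are rigid-transform invariant, the same object grasped in an unknown pose produces (up to contact sampling noise) the same descriptor, which is what makes classification possible without hand-object pose knowledge.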