Multimodal Zero-Shot Learning for Tactile Texture Recognition (2306.12705v1)

Published 22 Jun 2023 in cs.RO

Abstract: Tactile sensing plays an irreplaceable role in robotic material recognition. It enables robots to distinguish material properties such as local geometry and texture, especially for materials like textiles. However, most tactile recognition methods can only classify known materials for which tactile training data have been collected, and cannot classify unknown materials that have never been touched. To solve this problem, we propose a tactile zero-shot learning framework to recognise unknown materials when they are touched for the first time, without requiring tactile training samples. The visual modality, providing tactile cues from sight, and semantic attributes, giving high-level characteristics, are combined to bridge the gap between touched and untouched classes. A generative model is learnt to synthesise tactile features from the corresponding visual images and semantic embeddings, and a classifier can then be trained on the synthesised tactile features of untouched materials for zero-shot recognition. Extensive experiments demonstrate that our proposed multimodal generative model achieves a high recognition accuracy of 83.06% in classifying materials that were not touched before. The robotic experiment demo and the dataset are available at https://sites.google.com/view/multimodalzsl.
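The abstract describes a two-stage pipeline: a generative model synthesises tactile features conditioned on visual images and semantic embeddings, and a classifier is then trained on synthesised features of untouched classes so that real tactile readings from new materials can be recognised. The sketch below illustrates that idea in PyTorch. It is a minimal illustration only: the module names, MLP generator, feature dimensions, and training hyperparameters are all assumptions made for the example, not the paper's implementation.

```python
import torch
import torch.nn as nn

class TactileFeatureGenerator(nn.Module):
    """Synthesises a tactile feature vector conditioned on a visual feature
    and a semantic (attribute) embedding. The architecture and dimensions
    here are illustrative assumptions, not the paper's design."""
    def __init__(self, vis_dim=2048, sem_dim=300, noise_dim=64, tac_dim=512):
        super().__init__()
        self.noise_dim = noise_dim
        self.net = nn.Sequential(
            nn.Linear(vis_dim + sem_dim + noise_dim, 1024),
            nn.LeakyReLU(0.2),
            nn.Linear(1024, tac_dim),
        )

    def forward(self, vis_feat, sem_emb):
        # Random noise lets one class description yield many varied samples.
        z = torch.randn(vis_feat.size(0), self.noise_dim, device=vis_feat.device)
        return self.net(torch.cat([vis_feat, sem_emb, z], dim=1))

def train_zero_shot_classifier(generator, vis_feats, sem_embs, labels,
                               num_classes, samples_per_class=100, epochs=20):
    """Zero-shot stage: draw synthetic tactile features for *untouched*
    classes from their visual/semantic descriptions, then fit a linear
    softmax classifier on them. `labels` is a 1-D LongTensor of class ids."""
    generator.eval()
    with torch.no_grad():
        # Repeat each class description so we can sample many features per class.
        vis = vis_feats.repeat_interleave(samples_per_class, dim=0)
        sem = sem_embs.repeat_interleave(samples_per_class, dim=0)
        y = labels.repeat_interleave(samples_per_class)
        fake_tactile = generator(vis, sem)

    clf = nn.Linear(fake_tactile.size(1), num_classes)
    opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(clf(fake_tactile), y)
        loss.backward()
        opt.step()
    # At test time, the classifier is applied to *real* tactile features
    # captured the first time an unknown material is touched.
    return clf
```

In this sketch the generator would first be trained adversarially (or as a VAE/GAN hybrid) on touched classes so that synthesised features match real tactile features; that training loop is omitted here for brevity.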
