
Lattice-based shape tracking and servoing of elastic objects (2209.01832v3)

Published 5 Sep 2022 in cs.RO

Abstract: In this paper, we propose a general unified tracking-servoing approach for controlling the shape of elastic deformable objects using robotic arms. Our approach works by forming a lattice around the object, binding the object to the lattice, and tracking and servoing the lattice instead of the object. This makes our approach have full 3D control over deformable objects of any general form (linear, thin-shell, volumetric). Furthermore, it decouples the runtime complexity of the approach from the objects' geometric complexity. Our approach is based on the As-Rigid-As-Possible (ARAP) deformation model. It requires no mechanical parameter of the object to be known and can drive the object toward desired shapes through large deformations. The inputs to our approach are the point cloud of the object's surface in its rest shape and the point cloud captured by a 3D camera in each frame. Overall, our approach is more broadly applicable than existing approaches. We validate the efficiency of our approach through numerous experiments with deformable objects of various shapes and materials (paper, rubber, plastic, foam). Experiment videos are available on the project website: https://sites.google.com/view/tracking-servoing-approach.


Summary

  • The paper introduces a unified tracking-servoing approach using a 3D lattice to simplify deformation control for elastic objects.
  • It builds on the As-Rigid-As-Possible (ARAP) deformation model, so manipulation works across diverse object forms without knowledge of the object's mechanical parameters.
  • Experimental trials demonstrate practical efficiency at 20-30 FPS, validating its applicability for advanced robotic manipulation tasks.

Insights into "Lattice-based Shape Tracking and Servoing of Elastic Objects"

The paper, "Lattice-based Shape Tracking and Servoing of Elastic Objects," introduces a robust solution for handling the manipulation of deformable elastic objects in robotics. This solution is characterized by an innovative use of lattices to control the shape of such objects, offering practicality in various robotic applications where controlling flexible materials is pivotal.

Key Contributions

Unified Tracking-Servoing Approach: The method integrates tracking and servoing into a single pipeline, using a 3D lattice to manage the deformation control of elastic objects. The lattice serves as an intermediary structure around the object: the object is bound to the lattice once, after which the object's complex geometry is abstracted away and the computational requirements are reduced.
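To make the binding step concrete, the following is a minimal sketch, not the authors' implementation; the function names, the trilinear-interpolation binding, and the spacing parameter are illustrative assumptions. It shows how surface points can be attached to a regular lattice so that moving the lattice nodes moves the object points.

```python
import numpy as np

def build_lattice(points, spacing):
    """Axis-aligned lattice of nodes enclosing a point cloud.
    Returns node positions (N, 3), the grid origin, and the grid shape."""
    origin = points.min(axis=0) - spacing
    extent = points.max(axis=0) + spacing - origin
    shape = np.ceil(extent / spacing).astype(int) + 1
    axes = [origin[d] + spacing * np.arange(shape[d]) for d in range(3)]
    nodes = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)
    return nodes, origin, shape

def bind_points(points, origin, spacing, shape):
    """Trilinear binding: for each point, the 8 enclosing lattice-node
    indices and the interpolation weights that reproduce the point."""
    local = (points - origin) / spacing
    base = np.floor(local).astype(int)           # lower corner of the enclosing cell
    frac = local - base                          # position inside the cell, in [0, 1)
    strides = np.array([shape[1] * shape[2], shape[2], 1])
    corners = np.array([[i, j, k] for i in (0, 1) for j in (0, 1) for k in (0, 1)])
    indices = (base[:, None, :] + corners[None, :, :]) @ strides   # (P, 8) node indices
    weights = np.ones((points.shape[0], 8))
    for c, off in enumerate(corners):
        for d in range(3):
            weights[:, c] *= frac[:, d] if off[d] else 1.0 - frac[:, d]
    return indices, weights   # point_i is reproduced as sum_c weights[i, c] * nodes[indices[i, c]]

def deform_points(nodes, indices, weights):
    """Recover object point positions from (possibly deformed) lattice nodes."""
    return (nodes[indices] * weights[..., None]).sum(axis=1)
```

With such a binding in place, per-frame tracking and servoing only need to estimate and command the lattice node positions; the object's points follow through deform_points, which is what decouples the runtime cost from the object's geometric complexity.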

Applicability Across Object Forms: A significant strength of this approach lies in its versatility. It supports elastic objects of any general form, whether they are linear, thin-shell, or volumetric, and achieves full control of the object’s deformation in three-dimensional space.

Model Independence from Object Mechanics: The approach requires no prior knowledge of the object's mechanical parameters. Instead, it relies on the well-established As-Rigid-As-Possible (ARAP) deformation model, which describes deformation geometrically and therefore operates without material-specific attributes such as stiffness.
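For reference, the energy minimized by the standard ARAP model can be written as below; the notation is the usual one from the ARAP literature (rest positions, deformed positions, per-node neighborhoods and edge weights), not necessarily the paper's exact formulation over lattice cells.

```latex
% Standard ARAP deformation energy (usual ARAP-literature notation; the paper
% formulates it over lattice cells). p_i: rest positions, p'_i: deformed
% positions, N(i): neighbors of node i, w_ij: edge weights, R_i: best-fitting
% local rotation for node i.
E(\mathbf{p}') = \sum_{i} \sum_{j \in \mathcal{N}(i)} w_{ij}
  \bigl\| (\mathbf{p}'_i - \mathbf{p}'_j) - \mathbf{R}_i (\mathbf{p}_i - \mathbf{p}_j) \bigr\|^2,
\qquad
\mathbf{R}_i = \operatorname*{arg\,min}_{\mathbf{R} \in SO(3)}
  \sum_{j \in \mathcal{N}(i)} w_{ij}
  \bigl\| (\mathbf{p}'_i - \mathbf{p}'_j) - \mathbf{R} (\mathbf{p}_i - \mathbf{p}_j) \bigr\|^2 .
```

Because the energy only penalizes deviation from local rigidity, no stiffness or other material constants appear in it, which is what allows the method to treat paper, rubber, plastic, and foam uniformly.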

Analytical Deformation Jacobian: The paper presents a novel analytical expression for the deformation Jacobian of the ARAP model. This innovation eliminates the need for numerical approximations, enhancing computational efficiency and precision during shape servoing.
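To make the Jacobian's role concrete, here is the generic resolved-rate form in which a deformation Jacobian is typically used for shape servoing; this is a hedged sketch of the control structure, not the paper's exact analytical expression.

```latex
% Generic shape-servoing law built on a deformation Jacobian (illustrative,
% not the paper's exact derivation). s: current lattice shape feature,
% s*: desired feature, r: positions of the robot-held lattice nodes,
% J = ds/dr: deformation Jacobian, J^+: Moore-Penrose pseudoinverse,
% lambda > 0: control gain.
\dot{\mathbf{r}} = -\lambda\, \mathbf{J}^{+} \left( \mathbf{s} - \mathbf{s}^{*} \right),
\qquad
\mathbf{J} = \frac{\partial \mathbf{s}}{\partial \mathbf{r}} .
```

Having the Jacobian in closed form means it can be re-evaluated at every frame at negligible cost, removing the need for finite-difference or estimation-based approximations.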

Practical Efficiency: Because tracking and servoing operate on the lattice nodes rather than on the object's full geometry, the runtime complexity of the approach is decoupled from the object's geometric complexity. This yields a significant increase in processing speed, with the experiments running at 20-30 FPS without parallel processing.

Experimental Validation

The approach is validated through diverse experiments involving objects made of paper, rubber, plastic, and foam. The experiments cover realistic tasks such as large deformations and the manipulation of objects with complex shapes. The results highlight the system's ability to drive objects through both partial and complete shape changes, including cases that challenge existing robotic manipulation techniques.

Implications and Future Directions

The implications of this research are manifold. In practical terms, it offers a flexible, robust solution for industries reliant on manipulating deformable materials, such as manufacturing and surgical robotics. Theoretically, the use of a lattice structure as a binding intermediary suggests a new paradigm in shape control that could inspire similar approaches across different domains of robotics and AI.

Looking forward, this methodology opens several avenues for future research. Expanding the system to handle non-convex geometries or incorporating environmental interactions remains a natural progression. Enabling shape servoing through singular configurations or scenarios demanding additional intermediary steps presents further challenges. Additionally, applications in motion transfer and learning-based models could benefit from the insights and tools developed in this paper.

The paper makes significant strides towards a comprehensive, unified solution for elastic object manipulation, with impressive speed and control versatility that suit a broad range of potential applications.
