PMT-MAE: Dual-Branch Self-Supervised Learning with Distillation for Efficient Point Cloud Classification (2409.02007v2)

Published 3 Sep 2024 in cs.CV

Abstract: Advances in self-supervised learning are essential for enhancing feature extraction and understanding in point cloud processing. This paper introduces PMT-MAE (Point MLP-Transformer Masked Autoencoder), a novel self-supervised learning framework for point cloud classification. PMT-MAE features a dual-branch architecture that integrates Transformer and MLP components to capture rich features. The Transformer branch leverages global self-attention for intricate feature interactions, while the parallel MLP branch processes tokens through shared fully connected layers, offering a complementary feature transformation pathway. A fusion mechanism then combines these features, enhancing the model's capacity to learn comprehensive 3D representations. Guided by the sophisticated teacher model Point-M2AE, PMT-MAE employs a distillation strategy that includes feature distillation during pre-training and logit distillation during fine-tuning, ensuring effective knowledge transfer. On the ModelNet40 classification task, PMT-MAE achieves an accuracy of 93.6% without employing a voting strategy, surpassing the baseline Point-MAE (93.2%) and the teacher Point-M2AE (93.4%) and underscoring its ability to learn discriminative 3D point cloud representations. Additionally, the framework is highly efficient, requiring only 40 epochs for both pre-training and fine-tuning. PMT-MAE's effectiveness and efficiency make it well suited to scenarios with limited computational resources, positioning it as a promising solution for practical point cloud analysis.
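
The abstract describes the dual-branch encoder and the two-stage distillation only at a high level. The sketch below is a minimal PyTorch illustration of how such a block and the two distillation losses could be structured; the module names, dimensions, concatenation-based fusion, and temperature are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch of a PMT-MAE-style dual-branch block and distillation
# losses; all sizes, names, and the fusion rule are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualBranchBlock(nn.Module):
    """One encoder block: a self-attention (Transformer) branch and a
    shared-MLP branch process the same tokens, then the outputs are fused."""

    def __init__(self, dim: int = 384, num_heads: int = 6):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        # Transformer branch: global self-attention over point tokens.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # MLP branch: shared fully connected layers applied token-wise.
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * 2), nn.GELU(), nn.Linear(dim * 2, dim)
        )
        # Fusion: concatenate branch outputs and project back to `dim`
        # (the paper's exact fusion mechanism may differ).
        self.fuse = nn.Linear(dim * 2, dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.norm(tokens)
        attn_out, _ = self.attn(x, x, x)   # (B, N, dim)
        mlp_out = self.mlp(x)              # (B, N, dim)
        fused = self.fuse(torch.cat([attn_out, mlp_out], dim=-1))
        return tokens + fused              # residual connection

def feature_distillation_loss(student_feat, teacher_feat):
    """Pre-training: match student features to frozen teacher (Point-M2AE) features."""
    return F.mse_loss(student_feat, teacher_feat.detach())

def logit_distillation_loss(student_logits, teacher_logits, T: float = 4.0):
    """Fine-tuning: KL divergence between temperature-softened class distributions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits.detach() / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

In this reading, the masked-autoencoder pre-training objective would be combined with the feature-distillation term, and the classification cross-entropy with the logit-distillation term during fine-tuning; the actual loss weighting is not specified in the abstract.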

