
Cross-Level Cross-Scale Cross-Attention Network for Point Cloud Representation (2104.13053v1)

Published 27 Apr 2021 in cs.CV and cs.MM

Abstract: The self-attention mechanism has recently achieved impressive advances in the NLP and image-processing domains, and its permutation-invariance property makes it ideally suited to point cloud processing. Inspired by this success, we propose an end-to-end architecture, dubbed the Cross-Level Cross-Scale Cross-Attention Network (CLCSCANet), for point cloud representation learning. First, a point-wise feature pyramid module hierarchically extracts features at different scales and resolutions. Then, a cross-level cross-attention module models long-range inter-level and intra-level dependencies. Finally, a cross-scale cross-attention module captures interactions between and within scales to enhance the representation. Comprehensive experimental evaluation shows that our network obtains competitive performance against state-of-the-art approaches on challenging 3D object classification and point cloud segmentation tasks.
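The abstract's core ideas can be illustrated with a minimal sketch: cross-attention lets per-point features at one pyramid level (queries) attend to features at another level (keys/values), and because attention sums over the key/value set, the result is invariant to permutations of the input points. This is a hedged illustration using plain scaled dot-product attention in NumPy, not the paper's actual CLCSCANet implementation; all names and shapes here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats):
    """Scaled dot-product cross-attention between two point feature sets.

    q_feats:  (N_q, D)  per-point features acting as queries
    kv_feats: (N_kv, D) per-point features acting as keys and values
    Returns:  (N_q, D)  queries enriched with information from kv_feats.
    (Learned Q/K/V projections are omitted for brevity.)
    """
    d_k = q_feats.shape[-1]
    scores = q_feats @ kv_feats.T / np.sqrt(d_k)  # (N_q, N_kv) similarities
    attn = softmax(scores, axis=-1)               # rows sum to 1
    return attn @ kv_feats                        # weighted sum over kv set

# Toy example: 128 fine-level points attend to 32 coarse-level points.
rng = np.random.default_rng(0)
fine = rng.standard_normal((128, 64))
coarse = rng.standard_normal((32, 64))
out = cross_attention(fine, coarse)
print(out.shape)  # (128, 64)
```

Because the output is a weighted sum over the whole key/value set, shuffling the order of `coarse` points leaves `out` unchanged, which is the permutation-invariance property the abstract highlights.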

Authors (4)
  1. Xian-Feng Han (7 papers)
  2. Zhang-Yue He (1 paper)
  3. Jia Chen (85 papers)
  4. Guo-Qiang Xiao (4 papers)
Citations (2)
