Multi-scale Geometry-aware Self-Attention for 3D Point Cloud Classification

Cui Heng, Zhang Haitao, Yang Jian, Du Baochang, Jia Yuhang

2023 IEEE 3rd International Conference on Electronic Technology, Communication and Information (ICETCI), 2023

Abstract
As an important form of geometric data, point clouds contain rich geometric information whose effective extraction is integral to addressing various point cloud processing problems. However, point cloud objects typically possess non-Euclidean spatial structures at multiple scales that are disordered, complex, and change dynamically and unpredictably. Most current self-attention modules rely on dot-product multiplication and dimension alignment of query-key-value features, which cannot capture the multi-scale non-Euclidean structures of point cloud objects. This paper presents a self-attention plug-in module, Multi-scale Geometry-aware Self-Attention (MGSA), and its variants to address these challenges. By utilizing multi-scale local and global geometric information, MGSA effectively processes point cloud data. The experimental results show that incorporating the self-attention mechanism significantly enhances the ability to capture multi-scale geometry. Furthermore, MGSA achieves consistently strong results on well-known point cloud benchmarks, demonstrating its robust competitiveness.
Keywords
attention mechanism,3D point cloud,Transformer,point cloud classification,deep neural network,self-attention mechanism
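The abstract contrasts MGSA with standard dot-product self-attention over query-key-value features. As a point of reference, here is a minimal NumPy sketch of that baseline formulation applied to per-point features. The identity Q/K/V projections and the toy input are illustrative assumptions for brevity; this is the generic mechanism the paper critiques, not the MGSA module itself.

```python
import numpy as np

def dot_product_self_attention(x):
    """Baseline QKV self-attention over a set of point features.

    x: (N, d) array of per-point features. Q, K, V are identity
    projections here purely for illustration; real modules use
    learned linear maps.
    """
    n, d = x.shape
    q, k, v = x, x, x
    scores = q @ k.T / np.sqrt(d)                # (N, N) pairwise similarities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over all points
    return attn @ v                              # (N, d) attended features

points = np.random.default_rng(0).normal(size=(8, 3))  # toy 8-point cloud
out = dot_product_self_attention(points)
print(out.shape)  # (8, 3)
```

Note that the attention weights depend only on feature similarity, with no explicit notion of local neighborhoods or scale, which is the limitation the multi-scale geometry-aware design targets.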