A Grassmannian Manifold Self-Attention Network for Signal Classification

Rui Wang, Chen Hu, Ziheng Chen, Xiao-Jun Wu, Xiaoning Song

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 5099-5107. https://doi.org/10.24963/ijcai.2024/564

In the artificial intelligence community, significant progress has been made in encoding sequential data with deep learning techniques. Nevertheless, effectively mining useful information from the channel dimension remains a major challenge, as these features have a submanifold structure. The linear subspace, the basic element of the Grassmannian manifold, has proven to be an effective manifold-valued feature descriptor for statistical representation. Moreover, the Euclidean self-attention mechanism has shown great success in capturing long-range relationships in data. Motivated by these observations, we extend the self-attention mechanism to the Grassmannian manifold. Our framework can effectively characterize the spatiotemporal fluctuations of sequential data encoded on the Grassmannian manifold. Extensive experimental results on three benchmark datasets (a drone recognition dataset and two EEG signal classification datasets) demonstrate the superiority of our method over the state-of-the-art. The code and supplementary material for this work can be found at https://github.com/ChenHu-ML/GDLNet.
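The paper's actual attention construction is defined in the full text; purely as an illustration of the general idea — attention weights derived from a distance on the Grassmannian rather than from Euclidean dot products — the sketch below scores pairs of subspaces with the projection metric and normalizes with a softmax. All function names, the choice of metric, and the temperature parameter are assumptions for this example, not the authors' formulation.

```python
import numpy as np

def grassmann_proj_distance(U, V):
    """Projection-metric distance between span(U) and span(V).

    U, V are (n, p) matrices with orthonormal columns;
    d(U, V) = ||U U^T - V V^T||_F / sqrt(2).
    """
    return np.linalg.norm(U @ U.T - V @ V.T, ord="fro") / np.sqrt(2)

def grassmann_attention_weights(queries, keys, beta=1.0):
    """Softmax attention weights from negative squared Grassmannian distances.

    beta acts as an (assumed) inverse-temperature scaling.
    """
    scores = np.array([[-beta * grassmann_proj_distance(q, k) ** 2
                        for k in keys] for q in queries])
    e = np.exp(scores - scores.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)

# Toy usage: a "sequence" of four random 2-dimensional subspaces of R^5,
# each represented by an orthonormal basis obtained via QR decomposition.
rng = np.random.default_rng(0)
def rand_subspace(n=5, p=2):
    Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
    return Q[:, :p]

seq = [rand_subspace() for _ in range(4)]
W = grassmann_attention_weights(seq, seq)  # (4, 4) row-stochastic matrix
```

Because the distance from a subspace to itself is zero, each row of `W` places its largest weight on the corresponding diagonal entry; smaller distances between subspaces translate to larger attention weights, mirroring the role of dot-product similarity in Euclidean self-attention.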
Keywords:
Machine Learning: ML: Attention models
Machine Learning: ML: Classification
Machine Learning: ML: Geometric learning