TIM: An Efficient Temporal Interaction Module for Spiking Transformer
Sicheng Shen, Dongcheng Zhao, Guobin Shen, Yi Zeng
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 3133-3141.
https://doi.org/10.24963/ijcai.2024/347
Spiking Neural Networks (SNNs), as the third generation of neural networks, have gained prominence for their biological plausibility and computational efficiency, especially in processing diverse datasets. The integration of attention mechanisms, inspired by advances in neural network architectures, has led to the development of Spiking Transformers, which have shown promise in enhancing SNNs' capabilities on both static and neuromorphic datasets. Despite this progress, a discernible gap remains: the Spiking Self-Attention (SSA) mechanism does not fully exploit the temporal processing potential of SNNs. To address this, we introduce the Temporal Interaction Module (TIM), a novel convolution-based enhancement designed to augment temporal data processing within SNN architectures. TIM integrates into existing SNN frameworks seamlessly and efficiently, requiring minimal additional parameters while significantly improving their handling of temporal information. Through rigorous experimentation, TIM has demonstrated its effectiveness in exploiting temporal information, achieving state-of-the-art performance across various neuromorphic datasets. The code is available at https://github.com/BrainCog-X/Brain-Cog/tree/main/examples/TIM.
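As a rough illustration of the idea, the sketch below shows a convolution-based temporal interaction of the kind the abstract describes: at each timestep, the query from spiking self-attention is blended with a 1D-convolved memory of the previous timestep's state, so temporal information propagates across timesteps at low parameter cost. The module name `TemporalInteraction`, the learnable mixing weight `alpha`, the kernel size, and the tensor layout are illustrative assumptions, not the paper's exact formulation; consult the linked repository for the authors' implementation.

```python
# Hypothetical sketch of a TIM-style temporal interaction (not the authors' exact code).
import torch
import torch.nn as nn


class TemporalInteraction(nn.Module):
    """Blends the current query with a convolved memory of past timesteps.

    Assumed shapes (for illustration only):
        q_seq: (timesteps, batch, channels, tokens) -- queries from spiking
               self-attention, one slice per simulation timestep.
    """

    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        # Depthwise 1D convolution over the token dimension; adds few parameters.
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=kernel_size // 2, groups=channels)
        # Learnable scalar mixing weight between past memory and the current query.
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, q_seq: torch.Tensor) -> torch.Tensor:
        outputs = [q_seq[0]]          # first timestep has no past to mix in
        state = q_seq[0]
        for t in range(1, q_seq.shape[0]):
            a = torch.sigmoid(self.alpha)                  # keep the mix in (0, 1)
            state = a * self.conv(state) + (1 - a) * q_seq[t]
            outputs.append(state)
        return torch.stack(outputs)   # same shape as q_seq


if __name__ == "__main__":
    tim = TemporalInteraction(channels=64)
    q = torch.rand(4, 2, 64, 196)     # (T, B, C, N) dummy query sequence
    print(tim(q).shape)               # torch.Size([4, 2, 64, 196])
```

In this sketch, the recurrent mixing lets each timestep's query see a compressed history of earlier timesteps while the depthwise convolution keeps the added parameter count minimal, which is consistent with the abstract's claim of a lightweight, plug-in module.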
Keywords:
Humans and AI: HAI: Cognitive modeling
Computer Vision: CV: Recognition (object detection, categorization)
Computer Vision: CV: Representation learning
Humans and AI: HAI: Cognitive systems