# QKFormer: Hierarchical Spiking Transformer using Q-K Attention
## Paper Info
| Field | Content |
|---|---|
| Title | QKFormer: Hierarchical Spiking Transformer using Q-K Attention |
| Authors | Chenlin Zhou et al. |
| Venue | NeurIPS 2024 |
| Year | 2024 |
| Link | arXiv |
## Summary
A paper that applies Q-K attention to the Spikformer architecture.
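The distinguishing mechanism is Q-K attention, which drops the usual Q·Kᵀ·V product in favor of a cheaper interaction between spiking Q and K only. A minimal NumPy sketch of the token-wise variant, assuming binary spike inputs and a Heaviside step standing in for a spiking neuron (the function name and threshold are illustrative, not taken from the paper):

```python
import numpy as np

def qk_token_attention(Q, K, threshold=1.0):
    """Sketch of token-wise Q-K attention on binary spike tensors.

    Q, K: arrays of shape (N_tokens, D) with 0/1 entries.
    """
    # Per-token importance: sum Q's spikes over the channel dimension.
    token_sum = Q.sum(axis=1, keepdims=True)          # (N_tokens, 1)
    # Heaviside step as a stand-in spiking neuron: binarize the sums.
    A = (token_sum >= threshold).astype(Q.dtype)      # binary token mask
    # Apply the mask to K, broadcasting over channels.
    return A * K                                      # (N_tokens, D)
```

Because A is binary and no Q·Kᵀ matrix is formed, the cost is linear in the number of tokens rather than quadratic.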
## Problem Statement
## Key Idea
## Methods
### Overall Architecture
### Key Equations
## Results
## Discussion
## Insights
## References
-