Hydra Attention: Efficient Attention with Many Heads

arXiv (2022)

Cited by 16
Keywords: Vision Transformers, Attention, Token efficiency
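
For context on the title, a minimal sketch of the idea the paper is known for: using as many attention heads as feature dimensions, which, together with a decomposable (cosine-similarity) kernel, lets attention factorize into a single global key-value summary and run in time linear in the number of tokens. This is a hedged illustration, not the paper's reference code; the function name, shapes, and the epsilon guard are assumptions.

```python
import numpy as np

def hydra_attention(Q, K, V, eps=1e-6):
    """Linear-time attention with heads equal to the feature dimension.

    Q, K, V: (N, D) arrays of N tokens with D features. With H = D
    single-dimension heads and a cosine-similarity kernel, attention
    collapses to one global D-vector, so cost is O(N*D), not O(N^2*D).
    """
    # phi(x): L2-normalize each token's features (cosine-similarity kernel).
    q = Q / (np.linalg.norm(Q, axis=-1, keepdims=True) + eps)
    k = K / (np.linalg.norm(K, axis=-1, keepdims=True) + eps)
    # Global key-value summary: sum over tokens of phi(K) * V, shape (D,).
    kv = (k * V).sum(axis=0)
    # Each output token is its normalized query gated by the shared summary.
    return q * kv  # (N, D)

# Illustrative usage with random tokens (e.g. N=196 patches, D=384 features).
N, D = 196, 384
rng = np.random.default_rng(0)
out = hydra_attention(rng.normal(size=(N, D)),
                      rng.normal(size=(N, D)),
                      rng.normal(size=(N, D)))
print(out.shape)  # (196, 384)
```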