SATS: Self-attention transfer for continual semantic segmentation
Pattern Recognition(2023)
Abstract
• Propose distilling self-attention maps for continual semantic segmentation.
• Propose a class-specific region pooling scheme for relational knowledge transfer.
• First study to apply Transformers to continual semantic segmentation.
• The proposed method can be flexibly combined with existing strategies.
• Extensive evaluation on multiple benchmarks and settings with state-of-the-art performance.
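The first highlight, distilling self-attention maps from the old model to the new one, can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the paper's exact formulation: it assumes single-head dot-product attention and an MSE distillation loss, and the names `Wq`, `Wk`, `self_attention_map`, and `attention_transfer_loss` are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_map(tokens, Wq, Wk):
    # tokens: (N, d) token features; Wq, Wk: (d, dk) projections.
    # Returns the (N, N) attention map A = softmax(Q K^T / sqrt(dk)).
    q = tokens @ Wq
    k = tokens @ Wk
    dk = q.shape[-1]
    return softmax(q @ k.T / np.sqrt(dk), axis=-1)

def attention_transfer_loss(A_old, A_new):
    # Toy distillation loss: MSE between the old (frozen) model's
    # attention map and the new model's attention map.
    return float(np.mean((A_old - A_new) ** 2))

# Toy usage: random features standing in for old/new model activations.
rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 8))
Wq, Wk = rng.standard_normal((8, 8)), rng.standard_normal((8, 8))
A_old = self_attention_map(tokens, Wq, Wk)
loss = attention_transfer_loss(A_old, A_old)  # identical maps give zero loss
```

Minimizing such a loss while training on new classes encourages the new model to preserve the pixel/token relationships the old model encoded, which is the intuition behind attention-based knowledge transfer.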
Keywords
Continual learning, Semantic segmentation, Self-attention transfer, Class-specific region pooling