
Narrowing Attention in Capsule Networks

2022 26th International Conference on Pattern Recognition (ICPR)

Abstract
Despite their recent success, capsule networks (CapsNets) are still very computationally intensive and fail to achieve state-of-the-art performance on advanced datasets. As a consequence, CapsNets are usually combined with additional conventional feature extraction layers to solve complex tasks. Based on the hypothesis that more efficient and distinct routing can alleviate these drawbacks, we propose a novel CapsNet algorithm, which utilises narrowed attention to determine the coupling coefficients between lower- and higher-level capsules. In particular, we employ tiny subnetworks with sigmoid activation functions to enforce concise routing decisions, thus reducing the tendency of CapsNets to explain the entire image rather than focusing on the essential information for a given task. This non-iterative routing strategy is computationally fast and memory efficient, results in interpretable coupling decisions and can be easily integrated into existing models due to its strong alignment with capsule theory. In addition, these solely capsule-based models are robust to a wide range of image transformations, have stable convergence characteristics and can be further improved by capsule-specific yet straightforward applications of dropout and batch normalisation. In a series of experiments, we demonstrate that narrowed attention routing enables the training of deep capsule networks without the need for additional feature extraction layers, while outperforming existing CapsNet architectures on a variety of well-known benchmark datasets.
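To make the routing idea concrete, the following is a minimal NumPy sketch of a single non-iterative routing step in the spirit described above: lower-level capsule poses are transformed into predictions for each higher-level capsule, and a tiny subnetwork with a sigmoid activation produces the coupling coefficients in one pass. The function name, the single-linear-layer form of the attention subnetwork, and all shapes are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def narrowed_attention_routing(u, W, a, b=0.0):
    """One non-iterative routing pass (illustrative sketch).

    u : (n_low, d_in)                  lower-level capsule pose vectors
    W : (n_low, n_high, d_in, d_out)   per-pair transformation matrices
    a : (d_out,)                       weights of a tiny attention subnetwork
                                       (assumed single linear layer + sigmoid)
    b : scalar                         its bias
    """
    # Prediction vectors: u_hat[i, j] = u[i] @ W[i, j]
    u_hat = np.einsum('id,ijde->ije', u, W)      # (n_low, n_high, d_out)

    # Tiny subnetwork with sigmoid yields a coupling coefficient per
    # (lower, higher) capsule pair in a single, non-iterative step.
    c = sigmoid(u_hat @ a + b)                   # (n_low, n_high), in (0, 1)

    # Higher-level capsules: coupling-weighted sum of the predictions.
    v = np.einsum('ij,ije->je', c, u_hat)        # (n_high, d_out)
    return v, c

# Usage with random data (shapes chosen arbitrarily for illustration)
rng = np.random.default_rng(0)
u = rng.standard_normal((4, 8))            # 4 lower capsules, dim 8
W = rng.standard_normal((4, 3, 8, 16))     # 3 higher capsules, dim 16
a = rng.standard_normal(16)
v, c = narrowed_attention_routing(u, W, a)
```

Because the coupling coefficients `c` come from a sigmoid rather than an iterative agreement loop, they are cheap to compute, directly differentiable, and can be read off per capsule pair, which is what makes the decisions interpretable.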
Keywords
additional feature extraction layers, advanced datasets, capsule networks, capsule theory, capsule-based models, capsule-specific, concise routing decisions, consequence CapsNets, coupling coefficients, deep capsule networks, distinct routing, efficient routing, existing CapsNet architectures, feature extraction layers, higher-level capsules, image transformations, interpretable coupling decisions, lower-level capsules, memory efficient, narrowed attention routing, non-iterative routing strategy, novel CapsNet algorithm, sigmoid activation functions, stable convergence characteristics, tiny subnetworks, well-known benchmark datasets