Affinitention nets: kernel perspective on attention architectures for set classification with applications to medical text and images

CHIL (2021)

Abstract
Set classification is the task of predicting a single label from a set comprising multiple instances. The examples we consider are pathology slides represented by sets of patches and medical text data represented by sets of word embeddings. State-of-the-art methods, such as the transformer network, typically use attention mechanisms to learn representations of set data, by modeling interactions between instances of the set. These methods, however, have complex heuristic architectures comprising multiple heads and layers. The complexity of attention architectures hampers their training when only a small number of labeled sets is available, as is often the case in medical applications. To address this problem, we present a kernel-based representation learning framework that links learning affinity kernels to learning representations from attention architectures. We show that learning a combination of the sum and the product of kernels is equivalent to learning representations from multi-head multi-layer attention architectures. From our framework, we devise a simplified attention architecture which we term affinitention (affinity-attention) nets. We demonstrate the application of affinitention nets to the classification of the Set-Cifar10 dataset, thyroid malignancy prediction from pathology slides, as well as patient text-message triage. We show that affinitention nets provide competitive results compared to heuristic attention architectures and outperform other competing methods.
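The abstract's central claim is that attention over a set can be read as an affinity kernel between instances, with a sum of kernels playing the role of multiple heads and a product (composition) of kernels playing the role of stacked layers. A minimal NumPy sketch of this view, with hypothetical random projections standing in for learned parameters (the function names and dimensions below are illustrative, not the paper's implementation):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def affinity_kernel(X, Wq, Wk):
    # Row-normalized affinity between instances of the set:
    # K[i, j] measures how much instance j influences instance i.
    return softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(Wq.shape[1]))

rng = np.random.default_rng(0)
n, d, h = 5, 8, 4          # instances per set, feature dim, head dim
X = rng.normal(size=(n, d))

# Two hypothetical "heads", each with its own (here random) projections.
K1 = affinity_kernel(X, rng.normal(size=(d, h)), rng.normal(size=(d, h)))
K2 = affinity_kernel(X, rng.normal(size=(d, h)), rng.normal(size=(d, h)))

K_sum = 0.5 * (K1 + K2)    # sum of kernels: analogue of multi-head attention
K_prod = K1 @ K2           # product of kernels: analogue of stacked layers

# Pool the attended instances into a single set-level representation,
# which a classifier head would then map to the set label.
set_repr = (K_sum @ X).mean(axis=0)
```

Because each row of a softmax affinity kernel sums to one, both the averaged and the composed kernels remain row-stochastic, so stacking or mixing them keeps the attended representation a convex combination of instances.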