Explaining the absorption features of deep learning hyperspectral classification models

Arthur Vandenhoeke, Lennert Antson, Guillem Ballesteros, Jonathan Crabbe, Michal Shimoni

IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (2023)

Abstract
Over the past decade, Deep Learning (DL) models have proven to be efficient at classifying remotely sensed Earth Observation (EO) hyperspectral imaging (HSI) data. These models achieve state-of-the-art performance across various benchmark data sets by extracting abstract spatial-spectral features using 2D and 3D convolutions. However, the black-box nature of DL models hinders explanation, limits trust, and underscores the need for deeper insights beyond raw performance metrics. In this contribution, we implement a simple yet powerful mechanism for explaining the absorption features learned by DL models using an axiomatic attribution approach called Integrated Gradients, and we showcase how such an approach can be used to evaluate the relevance of a network's decisions and to compare network sensitivities when trained on single- and dual-sensor data.
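For illustration, below is a minimal sketch of the Integrated Gradients computation referenced in the abstract, assuming a hypothetical differentiable PyTorch classifier and hyperspectral input patch; the function name, shapes, and baseline choice are assumptions for this sketch, not the authors' implementation.

```python
import torch

def integrated_gradients(model, x, baseline, target_class, steps=50):
    """Approximate Integrated Gradients attributions for one input.

    model: differentiable classifier returning per-class scores (assumed).
    x: input tensor, e.g. a hyperspectral patch of shape (bands, H, W).
    baseline: reference input of the same shape (e.g. all zeros).
    """
    # Interpolation coefficients along the straight path from baseline to input.
    alphas = torch.linspace(0.0, 1.0, steps)
    grads = []
    for alpha in alphas:
        point = (baseline + alpha * (x - baseline)).clone().detach().requires_grad_(True)
        score = model(point.unsqueeze(0))[0, target_class]
        score.backward()
        grads.append(point.grad.detach())
    # Riemann-sum approximation of the path integral of gradients.
    avg_grad = torch.stack(grads).mean(dim=0)
    # Scale by the input-baseline difference; attributions then sum (approximately)
    # to the difference in model scores between x and the baseline.
    return (x - baseline) * avg_grad
```

Summing the resulting attribution map over the spatial dimensions yields a per-band relevance profile, which is one way absorption-feature sensitivity can be inspected.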
Keywords
Earth observation,explainable AI,integrated gradients,neural networks