BDANet: Multiscale Convolutional Neural Network With Cross-Directional Attention for Building Damage Assessment From Satellite Images

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING (2022)

Cited by 30 | Viewed 85
Abstract
Fast and effective responses are required when a natural disaster (e.g., an earthquake or hurricane) strikes. Building damage assessment from satellite imagery is critical before relief efforts are deployed. Given a pair of predisaster and postdisaster satellite images, building damage assessment aims to predict the extent of damage to buildings. Owing to their powerful feature representation ability, deep neural networks have been successfully applied to building damage assessment. However, most existing works simply concatenate the predisaster and postdisaster images as the input of a deep neural network without considering their correlations. In this article, we propose a novel two-stage convolutional neural network for building damage assessment, called BDANet. In the first stage, a U-Net is used to extract the locations of buildings. The network weights from the first stage are then shared in the second stage for building damage assessment. In the second stage, a two-branch multiscale U-Net is employed as the backbone, where the predisaster and postdisaster images are fed into the network separately. A cross-directional attention module is proposed to explore the correlations between predisaster and postdisaster images. Moreover, CutMix data augmentation is exploited to tackle the challenge of difficult classes. The proposed method achieves state-of-the-art performance on the large-scale xBD dataset. The code is available at https://github.com/ShaneShen/BDANet-Building-Damage-Assessment.
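The abstract mentions CutMix data augmentation for handling difficult (rare) damage classes but gives no implementation details. Below is a minimal, generic NumPy sketch of the standard CutMix operation, not the authors' exact code: a rectangular patch from one image is pasted into another, and the labels are mixed in proportion to the pasted area. The function name `cutmix` and the `alpha` parameter of the Beta distribution are conventional assumptions from the general CutMix technique, not taken from the paper.

```python
import numpy as np

def cutmix(img_a, img_b, label_a, label_b, alpha=1.0, rng=None):
    """Generic CutMix sketch (not BDANet's exact implementation).

    Pastes a random rectangular patch of img_b into img_a and mixes
    the labels in proportion to the pasted area. Images: (H, W, C).
    """
    if rng is None:
        rng = np.random.default_rng()
    h, w = img_a.shape[:2]
    lam = rng.beta(alpha, alpha)  # mixing ratio sampled from Beta(alpha, alpha)
    # Choose patch dimensions so its area fraction is roughly (1 - lam).
    cut_h = int(h * np.sqrt(1.0 - lam))
    cut_w = int(w * np.sqrt(1.0 - lam))
    cy, cx = rng.integers(0, h), rng.integers(0, w)  # random patch center
    y1, y2 = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, h)
    x1, x2 = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, w)
    mixed = img_a.copy()
    mixed[y1:y2, x1:x2] = img_b[y1:y2, x1:x2]
    # Recompute lambda from the exact pasted area after boundary clipping.
    lam_adj = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)
    mixed_label = lam_adj * label_a + (1.0 - lam_adj) * label_b
    return mixed, mixed_label
```

In practice the same patch coordinates would be applied to both the predisaster and postdisaster images of a pair so the two branches stay aligned; this detail is an assumption, as the abstract does not specify it.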
Keywords
Buildings, Image segmentation, Satellites, Feature extraction, Task analysis, Neural networks, Remote sensing, Building damage assessment, convolutional neural network (CNN), cross-directional attention (CDA), CutMix, multiscale feature fusion (MFF), satellite image