Improving autoencoder by mutual information maximization and shuffle attention for novelty detection

Applied Intelligence (2023)

Abstract
In an open, dynamic environment, a challenging task in object detection is to determine whether samples belong to a known class. Novelty detection can be exploited to identify classes that did not appear during training, that is, unknown classes. Current methods mainly adopt an autoencoder (AE) to model inlier samples, generate reconstructions of the specified categories, and distinguish them from outlier samples by the reconstruction error. However, the AE generalizes well enough to reconstruct images outside the distribution of the training data, which makes it difficult for the model to differentiate inlier samples from outlier samples. To this end, we propose a novelty detection model based on a shuffle attention mechanism and mutual information maximization (MIM) to modify the behavior of the traditional AE when reconstructing inlier and outlier samples. First, rotated inlier samples are reconstructed and classified to enhance the mutual information between the latent codes and the inlier samples, thus constraining the representation of the latent space. Subsequently, an efficient shuffle attention mechanism is introduced to let the model focus more on the inlier representation with negligible extra computation. Experimental results on four public datasets verify the performance of the proposed method for novelty detection.
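The abstract outlines an AE-based pipeline: rotated inliers are reconstructed and their rotation is classified as a proxy for maximizing mutual information between latent codes and inlier samples, an attention module with channel shuffling is inserted into the encoder, and the reconstruction error serves as the novelty score. The sketch below illustrates that pipeline only in broad strokes; it is not the authors' implementation. The network sizes, the 32x32 input resolution, the loss weight `lam`, and the simplified gating module (a stand-in for the shuffle attention of SA-Net) are all assumptions, and the adversarial-learning component mentioned in the keywords is omitted.

```python
# Minimal PyTorch sketch of an AE with a rotation-classification head and a
# simplified channel-shuffle attention block; hyperparameters are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ShuffleAttention(nn.Module):
    """Simplified stand-in: channel gating followed by a channel shuffle."""
    def __init__(self, channels, groups=4):
        super().__init__()
        self.groups = groups
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        x = x * self.gate(x)                        # channel re-weighting
        x = x.view(b, self.groups, c // self.groups, h, w)
        return x.transpose(1, 2).reshape(b, c, h, w)  # shuffle channels across groups


class MIMAutoencoder(nn.Module):
    """Autoencoder whose latent code also feeds a 4-way rotation classifier."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            ShuffleAttention(32),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
        self.rot_head = nn.Linear(latent_dim, 4)  # predicts 0/90/180/270 degrees

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.rot_head(z)


def training_loss(model, images, lam=0.1):
    """Reconstruct rotated inliers and classify their rotation (the rotation
    task acts as a proxy for maximizing latent/inlier mutual information)."""
    rot_labels = torch.randint(0, 4, (images.size(0),), device=images.device)
    rotated = torch.stack([torch.rot90(img, int(k), dims=(1, 2))
                           for img, k in zip(images, rot_labels)])
    recon, rot_logits = model(rotated)
    return F.mse_loss(recon, rotated) + lam * F.cross_entropy(rot_logits, rot_labels)


def novelty_score(model, images):
    """Per-sample reconstruction error: larger values suggest outliers."""
    with torch.no_grad():
        recon, _ = model(images)
    return ((recon - images) ** 2).flatten(1).mean(dim=1)
```

At test time, thresholding `novelty_score` on held-out inliers would separate known from unknown classes; the threshold choice and the datasets are not specified here and would follow the paper's evaluation protocol.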
Key words
Novelty detection, Mutual information, Shuffle attention, Adversarial learning