Exploring the Influence of Dimensionality Reduction on Anomaly Detection Performance in Multivariate Time Series
CoRR (2024)
Abstract
This paper presents an extensive empirical study on the integration of
dimensionality reduction techniques with advanced unsupervised time series
anomaly detection models, focusing on the MUTANT and Anomaly-Transformer
models. The study involves a comprehensive evaluation across three different
datasets: MSL, SMAP, and SWaT. Each dataset poses unique challenges, allowing
for a robust assessment of the models' capabilities in varied contexts. The
dimensionality reduction techniques examined include PCA, UMAP, Random
Projection, and t-SNE, each offering distinct advantages in simplifying
high-dimensional data. Our findings reveal that dimensionality reduction not
only aids in reducing computational complexity but also significantly enhances
anomaly detection performance in certain scenarios. Moreover, a remarkable
reduction in training times was observed, with reductions by approximately
300% and 650% when dimensionality was halved and minimized to the lowest
dimensions, respectively. This efficiency gain underscores the dual benefit of
dimensionality reduction in both performance enhancement and operational
efficiency. The MUTANT model exhibits notable adaptability, especially with
UMAP reduction, while the Anomaly-Transformer demonstrates versatility across
various reduction techniques. These insights provide a deeper understanding of
the synergistic effects of dimensionality reduction and anomaly detection,
contributing valuable perspectives to the field of time series analysis. The
study underscores the importance of selecting appropriate dimensionality
reduction strategies based on specific model requirements and dataset
characteristics, paving the way for more efficient, accurate, and scalable
solutions in anomaly detection.
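The pipeline the abstract describes, reducing the feature dimension of a multivariate time series before passing it to a detector, can be sketched as follows. This is a minimal illustration using scikit-learn's PCA on synthetic data; the shapes and component count are assumptions for illustration, not values from the paper.

```python
# Sketch: PCA-based dimensionality reduction applied to a multivariate
# time series before anomaly detection. Data is synthetic; the sensor
# count and target dimension are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 25))   # 1000 timesteps, 25 sensor channels

# Roughly "halving" the dimensionality, as in one of the paper's settings.
pca = PCA(n_components=12)
X_reduced = pca.fit_transform(X)  # shape: (1000, 12)

# X_reduced would then be fed to a model such as MUTANT or
# Anomaly-Transformer in place of the raw 25-dimensional series.
print(X_reduced.shape)
print(f"variance retained: {pca.explained_variance_ratio_.sum():.2f}")
```

The same pattern applies to UMAP, Random Projection, or t-SNE by swapping the reducer, though t-SNE lacks a `transform` method for unseen data, which limits it to offline use.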