TSLANet: Rethinking Transformers for Time Series Representation Learning
arXiv (2024)
Abstract
Time series data, characterized by its intrinsic long and short-range
dependencies, poses a unique challenge across analytical applications. While
Transformer-based models excel at capturing long-range dependencies, they face
limitations in noise sensitivity, computational efficiency, and overfitting
with smaller datasets. In response, we introduce a novel Time Series
Lightweight Adaptive Network (TSLANet) as a universal convolutional model for
diverse time series tasks. Specifically, we propose an Adaptive Spectral Block,
harnessing Fourier analysis to enhance feature representation and to capture
both long-term and short-term interactions while mitigating noise via adaptive
thresholding. Additionally, we introduce an Interactive Convolution Block and
leverage self-supervised learning to refine the capacity of TSLANet for
decoding complex temporal patterns and improve its robustness on different
datasets. Our comprehensive experiments demonstrate that TSLANet outperforms
state-of-the-art models in various tasks spanning classification, forecasting,
and anomaly detection, showcasing its resilience and adaptability across a
spectrum of noise levels and data sizes. The code is available at
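The core idea behind the Adaptive Spectral Block described above can be illustrated with a minimal NumPy sketch: move the signal into the frequency domain, suppress low-energy components that likely correspond to noise, and transform back. Note this is a simplified illustration under assumptions, not the paper's implementation: in TSLANet the spectral weights and threshold are learned, whereas here `threshold_ratio` is a fixed hypothetical hyperparameter.

```python
import numpy as np

def adaptive_spectral_filter(x, threshold_ratio=0.1):
    """Denoise a 1-D signal via frequency-domain thresholding.

    Sketch of the idea only: compute the real FFT, zero out
    coefficients whose magnitude falls below a fraction of the peak
    magnitude, and invert the transform. `threshold_ratio` stands in
    for the adaptive (learned) threshold used in the actual model.
    """
    spec = np.fft.rfft(x)
    mag = np.abs(spec)
    # Keep only components above a fraction of the strongest component.
    mask = mag >= threshold_ratio * mag.max()
    return np.fft.irfft(spec * mask, n=len(x))

# Example: a noisy sinusoid is pulled back toward its clean counterpart.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(256)
denoised = adaptive_spectral_filter(noisy, threshold_ratio=0.2)
```

Because white noise spreads its energy thinly across all frequency bins while the sinusoid concentrates its energy in one bin, the thresholding removes most of the noise energy, which mirrors the noise-mitigation role the abstract ascribes to adaptive thresholding.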