TransFusion: Generating Long, High Fidelity Time Series using Diffusion Models with Transformers
arXiv (2023)
Abstract
The generation of high-quality, long-sequence time-series data is essential
due to its wide range of applications. In the past, standalone recurrent and
convolutional neural network-based Generative Adversarial Networks (GANs)
were used to synthesize time-series data. However, they are inadequate for
generating long time-series sequences due to architectural limitations.
Furthermore, GANs are well known for their training instability and
mode-collapse problems. To address this, we propose TransFusion, a
diffusion- and transformer-based generative model for generating
high-quality, long-sequence time-series data. We stretch the sequence length
to 384 and generate high-quality synthetic data. We also introduce two
evaluation metrics to assess the quality of the synthetic data as well as its
predictive characteristics. We evaluate TransFusion with a wide variety of
visual and empirical metrics, and it outperforms the previous
state-of-the-art by a significant margin.
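The abstract does not spell out implementation details, but the diffusion family it builds on corrupts data with a fixed forward noising process and trains a network (here, a transformer) to reverse it. As a rough, hedged illustration only, the sketch below applies a standard DDPM-style forward process (linear beta schedule, a common default; the paper's exact schedule and model are not specified here) to a toy time series of the length the paper targets, 384 steps:

```python
import numpy as np

# Toy time series: 384 time steps, 5 features (shapes chosen to match the
# paper's stated sequence length; the feature count is illustrative).
rng = np.random.default_rng(0)
x0 = np.sin(np.linspace(0, 8 * np.pi, 384))[:, None] * np.ones((1, 5))

# Linear beta schedule -- a common DDPM default, assumed here for illustration.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)  # cumulative signal-retention factor

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(a_bar_t) * x_0, (1 - a_bar_t) * I)."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise

x_mid = q_sample(x0, 500, rng)    # partially noised sequence
x_end = q_sample(x0, T - 1, rng)  # close to pure Gaussian noise
print(x0.shape, x_mid.shape, x_end.shape)
```

During training, a denoising network would receive `x_t` and the step index `t` and learn to predict the added noise; generation then runs this process in reverse from Gaussian noise. A transformer denoiser is what lets the model attend across all 384 positions at once, which is the architectural point the abstract makes against RNN/CNN-based GANs.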