Generating Synthetic Time Series Data for Cyber-Physical Systems (2404.08601v1)

Published 12 Apr 2024 in cs.LG

Abstract: Data augmentation is an important facilitator of deep learning applications in the time series domain. A gap is identified in the literature: the transformer, the dominant sequence model, has seen only sparse exploration for data augmentation in time series. An architecture hybridizing several successful priors is put forth and tested using a powerful time-domain similarity metric. Results highlight the difficulty of this domain and suggest several valuable directions for future work.
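The abstract itself does not detail the hybrid architecture, but its core building block, the transformer, rests on scaled dot-product self-attention applied across time steps. As background only (not the paper's method), the following minimal NumPy sketch shows that operation on a multichannel time series window; the identity Q/K/V projections are an illustrative assumption, since a real model learns these projections.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over a time series window
    x of shape (T, d): T time steps, d channels. For illustration the
    query/key/value projections are the identity; a trained transformer
    learns linear projections W_q, W_k, W_v instead."""
    T, d = x.shape
    q = k = v = x                                   # illustrative identity projections
    scores = q @ k.T / np.sqrt(d)                   # (T, T) pairwise time-step similarities
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ v                              # (T, d) context-mixed output

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 4))  # a window of 16 time steps, 4 channels
y = self_attention(x)
print(y.shape)  # (16, 4): same shape as the input window
```

Each output time step is a convex combination of all input time steps, which is what lets transformer-based generators model long-range temporal structure without recurrence.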
