
2024 | Original Paper | Book Chapter

Learning Disentangled Task-Related Representation for Time Series

Authors: Liping Hou, Lemeng Pan, Yicheng Guo, Cheng Li, Lihao Zhang

Published in: Advances in Knowledge Discovery and Data Mining

Publisher: Springer Nature Singapore


Abstract

Multivariate time series representation learning employs unsupervised tasks to extract meaningful representations from time series data, enabling their use in diverse downstream tasks. However, despite promising advances in contrastive learning-based representation learning, the study of task-related feature learning is still in its early stages: current unified representation-learning frameworks cannot effectively disentangle task-related features. To address this limitation, we propose DisT, a novel contrastive learning-based method for efficient task-related feature learning in time series representation. DisT disentangles task-related features by combining feature-network structure learning with contrastive sample-pair selection. Specifically, DisT incorporates a feature-decoupling module that prioritizes global features for time series classification while emphasizing periodic and seasonal features for forecasting. Additionally, DisT leverages a contrastive loss and a task-related feature loss to adaptively select data-augmentation methods, preserving task-relevant shared information between positive samples across different datasets and tasks. Experimental results on various multivariate time series datasets, covering both classification and forecasting tasks, show that DisT achieves state-of-the-art performance.
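The abstract names two generic building blocks: a seasonal/trend feature split and a contrastive objective over positive sample pairs. As a rough, self-contained illustration of those building blocks only — not the authors' DisT implementation; the moving-average decomposition, function names, and the InfoNCE-style loss are illustrative choices of this summary — a minimal NumPy sketch might look like:

```python
import numpy as np


def decompose_trend_seasonal(x, period):
    """Split a 1-D series into a trend (centered moving average) and a
    seasonal/residual component, in the spirit of classical seasonal-trend
    decomposition. Edge values are biased because 'same'-mode convolution
    zero-pads; acceptable for a sketch."""
    kernel = np.ones(period) / period
    trend = np.convolve(x, kernel, mode="same")
    seasonal = x - trend          # by construction, trend + seasonal == x
    return trend, seasonal


def info_nce(z1, z2, temperature=0.1):
    """InfoNCE-style contrastive loss between two views of a batch of
    embeddings, shape (N, D). Row i of z1 and row i of z2 form the positive
    pair; all other rows serve as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                 # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))               # positives on diagonal
```

In this framing, a forecasting-oriented encoder would feed the `seasonal`/`trend` components into the contrastive objective, while a classification-oriented encoder would contrast global (whole-series) embeddings; how DisT actually weights and selects these is described in the paper itself.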


Metadata
Title
Learning Disentangled Task-Related Representation for Time Series
Authors
Liping Hou
Lemeng Pan
Yicheng Guo
Cheng Li
Lihao Zhang
Copyright year
2024
Publisher
Springer Nature Singapore
DOI
https://doi.org/10.1007/978-981-97-2266-2_18
