MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series

Authors

  • Qianwen Meng, Shandong University; Joint SDU-NTU Centre for Artificial Intelligence Research (C-FAIR)
  • Hangwei Qian, Lund University
  • Yong Liu, Nanyang Technological University
  • Lizhen Cui, Shandong University; Joint SDU-NTU Centre for Artificial Intelligence Research (C-FAIR)
  • Yonghui Xu, Shandong University; Joint SDU-NTU Centre for Artificial Intelligence Research (C-FAIR)
  • Zhiqi Shen, Nanyang Technological University

DOI:

https://doi.org/10.1609/aaai.v37i8.26098

Keywords:

ML: Unsupervised & Self-Supervised Learning, ML: Clustering, ML: Representation Learning, ML: Time-Series/Data Streams

Abstract

Learning semantically rich representations from raw unlabeled time series data is critical for downstream tasks such as classification and forecasting. Contrastive learning has recently shown promising representation learning capability in the absence of expert annotations. However, existing contrastive approaches generally treat each instance independently, which produces false negative pairs: instances that share the same semantics yet are pushed apart as negatives. To tackle this problem, we propose MHCCL, a Masked Hierarchical Cluster-wise Contrastive Learning model, which exploits semantic information obtained from a hierarchical structure of multiple latent partitions of multivariate time series. Motivated by the observation that fine-grained clustering preserves higher purity while coarse-grained clustering reflects higher-level semantics, we propose a novel downward masking strategy that filters out false negatives and supplements positives by incorporating multi-granularity information from the clustering hierarchy. In addition, a novel upward masking strategy removes outliers from the clusters at each partition to refine the prototypes, which both speeds up the hierarchical clustering process and improves clustering quality. We conduct experimental evaluations on seven widely used multivariate time series datasets. The results demonstrate the superiority of MHCCL over state-of-the-art approaches for unsupervised time series representation learning.
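To make the two masking strategies concrete, below is a minimal illustrative sketch, not the authors' implementation: it assumes L2-normalized embeddings, builds a clustering hierarchy with SciPy's agglomerative clustering, and applies upward masking (dropping a cluster's farthest members before recomputing its prototype) and downward masking (excluding "negatives" that share a cluster at any granularity). All function names, parameters, and thresholds are hypothetical.

```python
# Hypothetical sketch of MHCCL-style masking, assuming L2-normalized embeddings
# and a SciPy agglomerative hierarchy. Not the paper's actual algorithm or code.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster


def build_partitions(z, clusters_per_level=(50, 20, 5)):
    """Cut one agglomerative hierarchy at several granularities (fine to coarse)."""
    tree = linkage(z, method="ward")
    return [fcluster(tree, t=k, criterion="maxclust") for k in clusters_per_level]


def upward_mask_prototypes(z, labels, keep_ratio=0.9):
    """Refine each cluster's prototype by masking out its farthest members (outliers)."""
    prototypes = {}
    for c in np.unique(labels):
        members = z[labels == c]
        proto = members.mean(axis=0)
        dist = np.linalg.norm(members - proto, axis=1)
        keep = dist <= np.quantile(dist, keep_ratio)  # drop outliers before re-averaging
        prototypes[c] = members[keep].mean(axis=0)
    return prototypes


def downward_negative_mask(labels_fine, labels_coarse):
    """Mark pairs as valid negatives only if they share no cluster at either
    granularity; pairs co-clustered anywhere likely share semantics."""
    same_fine = labels_fine[:, None] == labels_fine[None, :]
    same_coarse = labels_coarse[:, None] == labels_coarse[None, :]
    valid_negative = ~(same_fine | same_coarse)
    np.fill_diagonal(valid_negative, False)  # an instance is never its own negative
    return valid_negative


# Toy usage: 200 random embeddings of dimension 32.
rng = np.random.default_rng(0)
z = rng.normal(size=(200, 32))
z /= np.linalg.norm(z, axis=1, keepdims=True)
fine, mid, coarse = build_partitions(z)
neg_mask = downward_negative_mask(fine, coarse)      # feed into a contrastive loss
prototypes = upward_mask_prototypes(z, fine)         # refined cluster prototypes
```

In a contrastive objective, `neg_mask` would zero out the contribution of masked pairs in the denominator, while the refined prototypes would serve as cluster-level positives; the paper's actual masking rules and loss are more elaborate than this sketch.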

Published

2023-06-26

How to Cite

Meng, Q., Qian, H., Liu, Y., Cui, L., Xu, Y., & Shen, Z. (2023). MHCCL: Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9153-9161. https://doi.org/10.1609/aaai.v37i8.26098

Section

AAAI Technical Track on Machine Learning III