Cluster Distillation: Semi-supervised Time Series Classification through Clustering-based Self-supervision

Goyo, Manuel Alejandro; Ñanculef, Ricardo

Abstract

Time series have always raised great interest among scientists due to their multiple applications in real-world problems. In particular, time series classification using deep learning methods has recently attracted much attention and demonstrated remarkable performance. Unfortunately, most of the techniques studied so far assume that a fully labeled dataset is available for training, a condition that limits the application of these methods in practice. In this paper, we present Cluster Distillation: a technique that leverages all the available data (labeled or unlabeled) for training a deep time series classifier. The method relies on a self-supervised mechanism that generates surrogate labels to guide learning when external supervisory signals are lacking. We create that mechanism by introducing clustering into a Knowledge Distillation framework, in which a first neural net (the Teacher) transfers its beliefs about cluster memberships to a second neural net (the Student), which then performs semi-supervised classification. Preliminary experiments on ten widely used datasets show that training a convolutional neural net (CNN) with the proposed technique leads to promising results, outperforming state-of-the-art methods in several relevant cases. The implementation is available at: ClusterDistillation
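The abstract describes combining a supervised signal on labeled samples with a distillation signal from the Teacher's cluster beliefs on all samples. The paper does not spell out the loss here, so the following is only a minimal sketch of one plausible formulation: a cross-entropy term on the labeled subset plus a KL-divergence term pulling the Student's predictions toward the Teacher's soft cluster assignments. The function name, the mixing weight `alpha`, and the exact weighting are assumptions for illustration, not the authors' published objective.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cluster_distillation_loss(student_logits, teacher_cluster_probs,
                              labels, labeled_mask, alpha=0.5):
    """Hypothetical combined loss for semi-supervised training:
    - cross-entropy on the labeled subset (external supervision), and
    - KL(teacher || student) on all samples, using the Teacher's
      cluster-membership beliefs as surrogate labels.
    `alpha` (assumed) balances the two terms."""
    eps = 1e-12
    p_student = softmax(student_logits)

    # Supervised term: cross-entropy only where true labels exist.
    ce = 0.0
    if labeled_mask.any():
        ce = -np.mean(np.log(p_student[labeled_mask,
                                       labels[labeled_mask]] + eps))

    # Distillation term: KL divergence from the Teacher's cluster beliefs
    # to the Student's predictive distribution, over every sample.
    kl = np.mean(np.sum(teacher_cluster_probs *
                        (np.log(teacher_cluster_probs + eps) -
                         np.log(p_student + eps)), axis=1))

    return alpha * ce + (1 - alpha) * kl
```

When the Teacher and Student agree and all labeled samples are classified confidently, both terms shrink toward zero; unlabeled samples still contribute gradient through the distillation term, which is what lets the method exploit the full dataset.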

More information

Title according to SCOPUS: SCOPUS_ID:85146350881
Journal title: 2018 37TH INTERNATIONAL CONFERENCE OF THE CHILEAN COMPUTER SCIENCE SOCIETY (SCCC)
Volume: 2022-November
Publisher: IEEE
Publication date: 2022
Start page: 1
End page: 8
DOI: 10.1109/SCCC57464.2022.10000276

Notes: SCOPUS