A novel Capsule Neural Network based model for drowsiness detection using electroencephalography signals
Abstract
The early detection of drowsiness has become vital to ensure the correct and safe performance of tasks in several industries. Because the transition between alertness and drowsiness is a gradual mental state, automated drowsiness detection is a complex problem to tackle. Electroencephalography (EEG) signals record variations in the electrical potential of an individual's brain, with each channel providing specific information about the subject's mental state. However, due to the nature of this type of signal, its acquisition is generally complex, so it is hard to gather the large volumes of data needed to apply Deep Learning techniques optimally for processing and classification. Capsule Neural Networks (CapsNets), however, are a recent Deep Learning algorithm designed to work with reduced amounts of data. They are also robust at modeling hierarchical relationships in the data, an essential characteristic for working with biomedical signals. Therefore, this paper presents a Deep Learning-based method for drowsiness detection with a CapsNet that takes as input a concatenation of spectrogram images of the EEG signal channels. The proposed CapsNet model is compared with a Convolutional Neural Network (CNN) and outperforms it, obtaining an average accuracy of 86.44% and sensitivity of 87.57%, against an average accuracy of 75.86% and sensitivity of 79.47% for the CNN, showing that CapsNet is better suited to this kind of dataset and task.
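The abstract describes building the network input by concatenating per-channel spectrogram images of an EEG recording. Below is a minimal sketch of that preprocessing step, assuming SciPy is available; the sampling rate, window parameters, log scaling, and the function name eeg_to_concatenated_spectrogram are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import spectrogram

def eeg_to_concatenated_spectrogram(eeg, fs=256, nperseg=128, noverlap=64):
    """Turn a multi-channel EEG epoch into a single 2-D image by
    concatenating per-channel spectrograms along the frequency axis.

    eeg : array of shape (n_channels, n_samples)
    fs  : sampling rate in Hz (256 Hz is an assumed value, not the paper's)
    """
    images = []
    for channel in eeg:
        # Short-time Fourier transform power for one channel
        _, _, sxx = spectrogram(channel, fs=fs, nperseg=nperseg, noverlap=noverlap)
        # Log scaling keeps low-power EEG rhythms visible in the image
        images.append(np.log1p(sxx))
    # Stack the channel spectrograms vertically into one image
    image = np.concatenate(images, axis=0)
    # Normalize to [0, 1] so the network sees a consistent input range
    image -= image.min()
    image /= image.max() + 1e-12
    return image

# Example: a 4-channel, 2-second synthetic epoch at 256 Hz
epoch = np.random.randn(4, 512)
img = eeg_to_concatenated_spectrogram(epoch, fs=256)
print(img.shape)  # (n_channels * n_freq_bins, n_time_bins)
```

The resulting image can then be fed to an image classifier such as the CapsNet or CNN compared in the paper.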
More information
Title according to WOS: A novel Capsule Neural Network based model for drowsiness detection using electroencephalography signals
Title according to SCOPUS: A novel Capsule Neural Network based model for drowsiness detection using electroencephalography signals
Journal title: EXPERT SYSTEMS WITH APPLICATIONS
Volume: 201
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Publication date: 2022
DOI: 10.1016/J.ESWA.2022.116977
Notes: ISI, SCOPUS