Spatial deep feature augmentation technique for FER using genetic algorithm

Nida, Nudrat; Yousaf, Muhammad Haroon; Irtaza, Aun; Javed, Sajid; Velastin, Sergio A.

Abstract

The generation of a large human-labelled facial expression dataset is challenging due to ambiguity in labelling facial expression classes and the cost of annotation. However, facial expression recognition (FER) systems demand discriminative feature representations and require many training samples to establish strong decision boundaries. Recently, FER approaches have used data augmentation techniques to increase the number of training samples for model generation. However, these augmented samples are derived from existing training data and therefore have limitations for developing an accurate FER system. To achieve meaningful facial expression representations, we introduce an augmentation technique based on deep learning and genetic algorithms for FER. The proposed approach builds on the hypothesis that augmenting the feature set is better than augmenting the visual data for FER. Evaluating this hypothesis, we found that genetically evolving discriminative facial expression features is significant in developing a robust FER approach. In this approach, facial expression samples are generated from RGB video data, treating human faces in frames as regions of interest. Face-detected frames are further processed to extract key-frames at particular intervals. These key-frames are then passed through a deep convolutional network for feature generation. A genetic algorithm's fitness function is used to select optimal, genetically evolved deep facial expression receptive fields that represent virtual facial expressions. The extended facial expression information is evaluated with an extreme learning machine classifier. The proposed technique has been evaluated on five diverse datasets: JAFFE, CK+, FER2013, AffectNet, and our application-specific Instructor Facial Expression (IFEV) dataset. Experimental results and analysis show promising accuracy and demonstrate the significance of the proposed technique on all these datasets.
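The pipeline sketched in the abstract (deep features → genetic evolution of feature subsets under a fitness function → classification) can be illustrated with a minimal, hypothetical example. Here a genetic algorithm evolves binary masks over a feature vector, with a simple nearest-centroid validation accuracy standing in for the fitness function; the dimensions, hyperparameters, and fitness choice are illustrative assumptions, not the authors' implementation (which uses CNN features and an extreme learning machine classifier).

```python
# Illustrative sketch only: GA-based selection of discriminative feature
# subsets, NOT the paper's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, X_tr, y_tr, X_va, y_va):
    """Accuracy of a nearest-centroid classifier restricted to masked features
    (a simple stand-in for the paper's fitness function)."""
    if mask.sum() == 0:
        return 0.0
    Xm_tr, Xm_va = X_tr[:, mask], X_va[:, mask]
    classes = np.unique(y_tr)
    centroids = np.stack([Xm_tr[y_tr == c].mean(axis=0) for c in classes])
    dists = ((Xm_va[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    preds = classes[dists.argmin(axis=1)]
    return float((preds == y_va).mean())

def evolve_masks(X_tr, y_tr, X_va, y_va, pop=20, gens=15, p_mut=0.02):
    """Evolve binary feature masks by truncation selection, one-point
    crossover, and bit-flip mutation; return the best mask and its fitness."""
    n_feat = X_tr.shape[1]
    population = rng.random((pop, n_feat)) < 0.5  # random initial masks
    for _ in range(gens):
        scores = np.array([fitness(m, X_tr, y_tr, X_va, y_va)
                           for m in population])
        parents = population[scores.argsort()[::-1][: pop // 2]]
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)            # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_feat) < p_mut      # bit-flip mutation
            children.append(child)
        population = np.vstack([parents, children])
    scores = np.array([fitness(m, X_tr, y_tr, X_va, y_va) for m in population])
    return population[scores.argmax()], float(scores.max())

# Synthetic stand-in for deep facial-expression features: 40 dimensions,
# of which only the first 5 carry class signal.
X = rng.normal(size=(200, 40))
y = rng.integers(0, 2, size=200)
X[y == 1, :5] += 2.0
best_mask, best_acc = evolve_masks(X[:150], y[:150], X[150:], y[150:])
```

In this sketch the GA tends to concentrate the surviving mask bits on the informative dimensions, which is the core idea behind evolving "receptive fields" in feature space rather than augmenting raw images.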

More information

Title according to WOS: ID WOS:001167400600009 Not found in local WOS DB
Journal title: NEURAL COMPUTING & APPLICATIONS
Volume: 36
Issue: 9
Publisher: SPRINGER LONDON LTD
Publication date: 2024
Start page: 4563
End page: 4581
DOI: 10.1007/s00521-023-09245-x
Notes: ISI