Sex-Prediction from Periocular Images across Multiple Sensors and Spectra

Juan Tapia Farias; Christian Rathgeb; Christoph Busch

Keywords: Biometrics, Deep Learning, Gender Classification

Abstract

In this paper, we provide a comprehensive analysis of periocular-based sex-prediction (commonly referred to as gender classification) using state-of-the-art machine learning techniques. In order to reflect a more challenging scenario where periocular images are likely to be obtained from an unknown source, i.e., sensor, convolutional neural networks are trained on fused sets composed of several near-infrared (NIR) and visible wavelength (VW) image databases. In a cross-sensor scenario within each spectrum, an average classification accuracy of approximately 85% is achieved. When sex-prediction is performed across spectra, an average classification accuracy of about 82% is obtained. Finally, multi-spectral sex-prediction yields a classification accuracy of 83% on average. Compared to previous works, the obtained results provide a more realistic estimation of the feasibility of predicting a subject's sex from the periocular region.
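The approach described in the abstract (a convolutional neural network trained on fused sets of NIR and VW periocular images for binary sex classification) can be illustrated with a minimal sketch in PyTorch. The architecture, the 120x160 grayscale input size, the label coding, and all hyperparameters below are illustrative assumptions and do not reproduce the network used in the paper.

# Minimal sketch, assuming PyTorch and a hypothetical fused training set of
# NIR and VW periocular crops resized to 120x160 grayscale. Architecture and
# hyperparameters are assumptions, not the paper's actual configuration.
import torch
import torch.nn as nn

class PeriocularSexCNN(nn.Module):
    """Small CNN mapping a periocular image to a two-class sex score."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 120x160 -> 60x80
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 60x80 -> 30x40
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global pooling -> 128-d vector
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)

# Training on a "fused" set: batches are drawn from several NIR or VW
# databases so the network does not specialize to a single known sensor.
model = PeriocularSexCNN()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy batch standing in for fused multi-database periocular images.
images = torch.randn(8, 1, 120, 160)   # (batch, channels, height, width)
labels = torch.randint(0, 2, (8,))     # 0/1 sex labels (example coding)

logits = model(images)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

Cross-sensor and cross-spectral evaluation would then amount to training such a model on images from one set of sensors or one spectrum and testing on another, which is the scenario the reported 82-85% accuracies refer to.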

More information

Publication date: 2019
Start/End date: November 14-19
Pages: 529-535
Funding/Sponsor: Fondecyt
URL: https://ieeexplore.ieee.org/document/8706207
Notes: DOI 10.1109/SITIS.2018.00086