Publication:
Gender stereotyping impact in facial expression recognition

Date

2023

Publisher

Springer
Open access
Conference contribution
Accepted version

Project identifier

AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2020-118014RB-I00/ES/recolecta
Gobierno de Navarra//0011-1411-2020-000079

Abstract

Facial Expression Recognition (FER) uses images of faces to identify the emotional state of users, enabling closer interaction between humans and autonomous systems. Unfortunately, as the images naturally integrate some demographic information, such as the apparent age, gender, and race of the subject, these systems are prone to demographic bias issues. In recent years, machine learning-based models have become the most popular approach to FER. These models require training on large datasets of facial expression images, and their generalization capabilities are strongly related to the characteristics of the dataset. In publicly available FER datasets, apparent gender representation is usually mostly balanced overall, but the representation within individual labels is not, embedding social stereotypes into the datasets and generating a potential for harm. Although this type of bias has been overlooked so far, it is important to understand the impact it may have in the context of FER. To do so, we use a popular FER dataset, FER+, to generate derivative datasets with different amounts of stereotypical bias by altering the gender proportions of certain labels. We then measure the discrepancy between the performance of the models trained on these datasets for the apparent gender groups. We observe a discrepancy in the recognition of certain emotions between genders of up to 29% under the worst bias conditions. Our results also suggest a safety range for stereotypical bias in a dataset that does not appear to produce stereotypical bias in the resulting model. Our findings support the need for a thorough bias analysis of public datasets in problems like FER, where a global balance of demographic representation can still hide other types of bias that harm certain demographic groups.
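The methodology summarized above (resampling one label's apparent-gender proportions to induce stereotypical bias, then comparing per-group recognition rates) can be sketched roughly as follows. This is an illustrative toy sketch, not the authors' actual pipeline: the function names and the `(emotion, gender)` pair representation are assumptions made for the example.

```python
import random

def induce_stereotypical_bias(samples, label, male_fraction, seed=0):
    """Subsample one emotion label until `male_fraction` of its examples
    are apparently male, leaving all other labels untouched.

    `samples` is a list of (emotion, gender) pairs; assumes 0 < male_fraction < 1.
    Illustrative sketch only, not the authors' code.
    """
    rng = random.Random(seed)
    target = [s for s in samples if s[0] == label]
    rest = [s for s in samples if s[0] != label]
    male = [s for s in target if s[1] == "male"]
    female = [s for s in target if s[1] == "female"]
    current = len(male) / max(len(male) + len(female), 1)
    if current > male_fraction:
        # Too many male examples for the target ratio: subsample them.
        keep = min(len(male), round(len(female) * male_fraction / (1 - male_fraction)))
        male = rng.sample(male, keep)
    else:
        # Too many female examples: subsample them instead.
        keep = min(len(female), round(len(male) * (1 - male_fraction) / male_fraction))
        female = rng.sample(female, keep)
    return rest + male + female

def recall_gap(y_true, y_pred, genders, label):
    """Absolute difference in per-label recall between the two gender groups,
    a simple proxy for the per-emotion discrepancy the paper measures."""
    def recall(group):
        idx = [i, ] = None  # placeholder removed below
    def recall(group):
        idx = [i for i, t in enumerate(y_true) if t == label and genders[i] == group]
        return sum(y_pred[i] == label for i in idx) / len(idx) if idx else 0.0
    return abs(recall("male") - recall("female"))
```

For example, starting from 80 male and 20 female "happy" faces, requesting a 0.5 male fraction subsamples the male side down to 20, producing a gender-balanced "happy" label while leaving other labels untouched.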

Keywords

Facial expression recognition, Gender stereotyping, Stereotypical bias

Department

Estadística, Informática y Matemáticas / Estatistika, Informatika eta Matematika / Institute of Smart Cities - ISC

Citation

Dominguez-Catena, I., Paternain, D., & Galar, M. (2023). Gender stereotyping impact in facial expression recognition. In I. Koprinska, P. Mignone, R. Guidotti, S. Jaroszewicz, H. Fröning, F. Gullo, P. M. Ferreira, D. Roqueiro, G. Ceddia, S. Nowaczyk, J. Gama, R. Ribeiro, R. Gavaldà, E. Masciari, Z. Ras, E. Ritacco, F. Naretto, A. Theissler, P. Biecek, … S. Pashami (Eds.), Machine Learning and Principles and Practice of Knowledge Discovery in Databases (Vol. 1752, pp. 9-22). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-23618-1_1

Rights

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG.

Documents in Academica-e are protected by copyright, with all rights reserved, unless otherwise indicated.