Bias assessment for experts in discrimination, not in computer science.
Abstract
Approaches to bias assessment usually require such technical skills that, by design, they leave discrimination experts out. In this paper we present EDIA, a tool that enables experts in discrimination to explore social biases in word embeddings and masked language models. Experts can then characterize those biases so that their presence can be assessed more systematically, and actions can be planned to address them. They can work interactively to assess the effects of different characterizations of bias in a given word embedding or language model, which helps turn informal intuitions into concrete resources for systematic testing.
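The kind of exploration the abstract describes can be illustrated with a minimal sketch: comparing a target word's embedding against two attribute sets (e.g., gendered words) via cosine similarity, in the spirit of association-based bias tests. The function names and the toy 3-dimensional vectors below are hypothetical stand-ins for a real embedding model, not part of EDIA itself.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def bias_score(word_vec, attr_a, attr_b):
    """Mean similarity to attribute set A minus mean similarity to set B.

    A positive score means the word leans toward the A pole.
    """
    return (np.mean([cosine(word_vec, a) for a in attr_a])
            - np.mean([cosine(word_vec, b) for b in attr_b]))

# Toy 3-d vectors standing in for real word embeddings (hypothetical values).
emb = {
    "nurse": np.array([0.9, 0.1, 0.2]),
    "she":   np.array([1.0, 0.0, 0.1]),
    "he":    np.array([0.0, 1.0, 0.1]),
}

score = bias_score(emb["nurse"], [emb["she"]], [emb["he"]])
print(score > 0)  # in this toy space, "nurse" leans toward the "she" pole
```

In a tool like EDIA, a discrimination expert would supply the target and attribute word lists from their own domain knowledge, and the system would report such association scores over the model's actual embeddings.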
More information
Journal title: EACL 2023 Cross-cultural considerations in NLP (in press).
Publication date: 2023