An Empirical Study on Group Fairness Metrics of Judicial Data

Bibliographic Details
Main Authors: Yanjun Li, Huan Huang, Xinwei Guo, Yuyu Yuan
Format: Article
Language: English
Published: IEEE, 2021
Online Access: https://doaj.org/article/323a755c0f0e469d89410f64af96cce6
Description
Summary: Group fairness means that different groups have an equal probability of receiving a given predicted outcome. It is an important fairness definition, conducive to maintaining social harmony and stability. Fairness is a vital issue when an artificial intelligence software system is used to make judicial decisions, and either the data or the algorithm alone may lead to unfair results. Determining the fairness of a dataset is therefore a prerequisite for studying the fairness of algorithms. This paper focuses on the dataset, studying group fairness from both micro and macro views. We propose a framework for determining the sensitive attributes of a dataset and metrics for measuring the degree of fairness of those attributes. We conducted experiments and a statistical analysis of judicial data to demonstrate the framework and metrics. Both can be applied to datasets in other domains, providing persuasive evidence of the effectiveness and applicability of algorithmic fairness research and opening a new avenue for research on dataset fairness.
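The record does not reproduce the paper's metrics, but the definition of group fairness quoted above (each group has an equal probability of receiving a given prediction, i.e. P(Y=1 | A=a) = P(Y=1 | A=b) for every pair of groups a, b under a sensitive attribute A) can be illustrated with a standard statistical parity check. The following Python sketch is an illustration only, not the authors' method; the statistical_parity_difference helper, the column names, and the toy data are all hypothetical.

    import pandas as pd

    def statistical_parity_difference(df, group_col, outcome_col, favorable=1):
        # Rate of the favorable outcome within each group defined by group_col.
        rates = df.groupby(group_col)[outcome_col].apply(
            lambda s: (s == favorable).mean()
        )
        # 0.0 means all groups receive the favorable outcome at identical
        # rates; larger values mean a larger gap between the best-off and
        # worst-off group.
        return rates.max() - rates.min()

    # Hypothetical toy records: a sensitive attribute ("gender") and a binary
    # judicial outcome (1 = favorable decision, 0 = unfavorable).
    data = pd.DataFrame({
        "gender":  ["F", "F", "F", "F", "M", "M", "M", "M"],
        "outcome": [ 1,   0,   1,   0,   1,   1,   1,   0 ],
    })

    print(statistical_parity_difference(data, "gender", "outcome"))
    # F rate = 2/4 = 0.50, M rate = 3/4 = 0.75, so the printed gap is 0.25.

A result near zero would indicate that the groups receive the favorable outcome at similar rates; how large a gap counts as unfair remains a policy judgment rather than something the metric itself decides.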