The unseen Black faces of AI algorithms

19 October 2022
AI bias & discrimination

Overview

Four years after the release of an auditing tool that identified the misclassification of dark-skinned faces, the study continues to shape research, regulation and commercial practice.

Joy Buolamwini and Timnit Gebru prompted a ground-breaking body of critical work that exposed the bias, discrimination and oppressive nature of facial-analysis algorithms. Their study remains an influential reference point for addressing developments in this technology and the threats it poses. Buolamwini and Gebru undertook a systematic audit of commercial facial-analysis systems to highlight the ways in which these systems perform differently depending on the skin colour and gender of the person in the image.

This work became known as the Gender Shades audit and led Buolamwini and Gebru to compile their own set of images, creating a more diverse and inclusive dataset that continues to impact and reshape data practices to date.