Are All Equal in the Eyes of Artificial Intelligence?

By Chenai Chair | Feb. 24, 2020 | Fellowships & Awards

Johannesburg, Gauteng province: Minibus taxis jam Klein Street in the city centre. Photo: Chris Kirchhoff

Chenai Chair is a 2019-2020 Mozilla Fellow and Research Manager: Gender and Digital Rights at World Wide Web Foundation


Reframing South Africa's Fourth Industrial Revolution (4thIR) conversation for data rights

#DataMustFall has been one of South Africa's most prominent movements in recent years. At its core is the demand that mobile network operators reduce the overpriced mobile data people need for basic internet access. Just over half of the population uses the internet, and 47% of people in South Africa own smartphones.

Now, a new data issue requires as much public attention as #DataMustFall: data rights. Recently, one of the country's major banks experienced a data breach in which personal information such as names, home addresses, telephone numbers and email addresses was accessed. 1.7 million accounts were impacted. Personal identifying information, our data, was compromised. This raises the question: What control do we have over our data as personal information is collected, stored and processed in the age of Artificial Intelligence (AI)?

What control do we have over our data as personal information is collected, stored and processed in the age of Artificial Intelligence (AI)?

~

Data rights should be at the forefront of public conversations now, and also the focus of the South African government's Fourth Industrial Revolution (4thIR) work. The 4thIR agenda is about leveraging data-driven technological development, such as automation, machine learning and AI, to advance development and growth. Minister Stella Ndabeni-Abrahams has stated the need to upskill people so they can participate meaningfully, and to help them understand that jobs will be lost and new ones created in the 4thIR.

Action steps include the 4thIR commission, set up in 2019, which is responsible for understanding and tackling the impact of the changing technological landscape on the economy and employment; the establishment of the Centre for Artificial Intelligence Research, which brings together five academic institutions to develop AI and deep learning solutions; and South Africa's own IndabaX on deep learning. Coding and robotics training will also be introduced for learners from age six, according to President Cyril Ramaphosa's 2020 State of the Nation Address. The current framing focuses on economic growth, which is presumed to unlock employment opportunities.

However, this framing does not address the societal inequalities that persist and emerge in the era of AI.

South Africa exists in the context of a triple threat: inequality, poverty and unemployment. It therefore requires a more critical approach to the 4thIR. Critics have pointed out that the current framework risks remaining in the domain of the elite and overlooking digital inequality. Given gendered inequality, women may be the last to capture the benefits of the 4thIR. The replication of existing inequalities, new social injustices and unequal power dynamics will shape how differently people experience these new technologies. Big questions remain unanswered: How do we govern our data in a way that enables society to flourish? How do we address the harms likely to arise from our differences? How do we ensure data justice in a way that meets South Africa's needs and remains relevant in the global conversation? After all, not all are equal in the eyes of AI.

As a Mozilla Fellow, I approach the question of data governance in a context-based manner that centres people, especially marginalised communities. I focus on data rights in terms of the adequacy of regulations meant to ensure the right to privacy and data protection, viewed from a gender perspective. I frame my assessment through a data justice approach with an intersectional feminist lens. This approach allows one to focus on existing societal structures and identify what the issues are, who faces them and how they may be redressed. Socio-economic status, gender, ethnicity and place of birth all influence how our data is treated in different contexts, and the decisions made from that data. Loss of privacy, discrimination by gender or health status, data breaches, and harms due to machine or algorithmic bias are some of the injustices in data-driven societies. This approach adds to the economic framing by focusing on people's choice to be visible or not, to engage with the technology or not, and to be protected from discrimination.

As part of my research project, the following questions will be guiding me:

  1. What are the emerging privacy and data protection issues of AI, from a data justice and feminist perspective?
  2. What would a gender-responsive data protection and privacy law entail? Can regional principles (e.g. SADC model law or Malabo Convention) provide guidance, or will principles drawn from global North contexts (e.g. the GDPR) carry more weight in an African context?
  3. How can civil society better compel the public to engage with data rights issues?

I will consult widely with different communities, focusing on civil society engagements and policymakers. The short-term goal is to provide policy recommendations and spur more critical public engagement about data rights through a gendered lens. Collaborative action is needed, as injustices occur at both the individual and collective levels. In the long run, this project will encourage more research and gender-focused policy work on data governance in the African context. And in the future, when our data rights are tampered with, may we see a new #DataMust campaign driven by society to shape data rights and hold those with power accountable.