A qualitative study by HER Internet reveals how marginalized groups in Uganda are both helped and harmed by AI systems
(UGANDA | JULY 16, 2024) -- The algorithms that power social media platforms across Uganda can connect and empower the LGBTQ+ community — but at the same time, they can further marginalize and discriminate against this community, according to new research by HER Internet.
HER Internet is a feminist and digital rights organization based in Kampala, Uganda. Its investigation is part of Mozilla's Africa Mradi research series on the impact of AI in Eastern and Southern Africa. The series raises critical questions about inequality, technology, and communities that must be confronted by the regions' governments, media, civil society, activists, and funders.
HER Internet's report, titled "Navigating Algorithms: The Case of Structurally Silenced Communities in Uganda," examines the intersection of social media platforms, users, and public policy in the context of Uganda's LGBTQ+ community. It explores how individuals and groups use these tools to organize and support one another, building community and amplifying advocacy campaigns. The researchers also investigate how these tools are manipulated to spread misinformation, hate speech, and biased narratives linking the LGBTQ+ community to crime and mental illness. This dynamic can in turn influence legislation like the recently resurfaced Anti-Homosexuality Amendment Act, the researchers note.
The report specifically focuses on four platforms: Facebook, Instagram, TikTok, and Twitter.
Says Sandra Kwikiriza, founder and executive director of HER Internet: “The internet and social media algorithms in particular can be a positive influence for Ugandans. But they are also a weapon that is wielded against the nation’s LGBTQ+ community.”
Says Juliet Nanfuka, researcher at HER Internet: “Our research shows how social media algorithms can be used to spread viral, hateful content that carves deeper channels into Ugandan popular culture, both online and off.”
The research entailed in-depth interviews and focus group discussions with participants from the districts of Gulu, Mbale, Mbarara, and Kampala. A total of 65 respondents were interviewed and identified as either female, male, transgender men, transgender women, non-binary, or non-conforming. The interviews were complemented by desk research and an analysis of social media platforms.
Key Findings:
Social media algorithms can provide community and allyship. Algorithms can facilitate the formation and growth of LGBTQ+ communities by connecting individuals with shared interests, identities, and experiences. Algorithms may prioritize LGBTQ+ advocacy campaigns and groups, making it easier for LGBTQ+ individuals to find supportive communities and resources, especially for those in areas where LGBTQ+ physical communities may be limited.
Social media algorithms can fuel harassment and disinformation. Algorithms have been exploited to target and harass LGBTQ+ individuals through hate speech, discriminatory content, and harmful stereotypes. One participant said social media platforms are no longer as safe as they used to be, and platforms appear to have become less restrictive. LGBTQ+ people can be subjected to online extortion, online harassment, doxxing, and outing. Meanwhile, respondents expressed fear about algorithms amplifying false narratives about LGBTQ+ identities, health, and rights.
Social media algorithms can reinforce gender bias and discrimination. Algorithms may inadvertently perpetuate gender bias and discrimination against LGBTQ+ individuals by reinforcing gender stereotypes, limiting exposure to diverse perspectives, or privileging content from dominant groups. Transgender people were said to be significantly affected by algorithmic bias, as they are unable to identify themselves by their gender identity online.
Platforms have limited accountability. Participants noted that it is difficult to hold big tech companies accountable for discriminatory algorithmic practices — platforms like TikTok, X, and Grindr have little or no dialogue with affected users. Participants also noted a wide technical divide between the creators of the platforms and their users in Uganda: Many users have limited digital literacy and little familiarity with topics like algorithms and AI systems.
Platforms, funders, and LGBTQ+ individuals can all improve this ecosystem. Platforms need to provide access to data that can enable further research. They can also improve their content moderation practices to address the concerns of LGBTQ+ communities in African countries. The LGBTQ+ community needs to continue building the evidence required to push for greater platform accountability. And funders need to examine how platforms are reshaping the lives and practices of LGBTQ+ users in restrictive countries like Uganda.