An ethnographic study of the datasets powering the maternal healthcare app “DawaMom,” used across Zambia and other Southern African countries
(LUSAKA, ZAMBIA | MONDAY, NOVEMBER 18, 2024)— Emerging technologies like artificial intelligence (AI) are expected to strengthen access to quality healthcare, offering new opportunities to improve maternal health. In Zambia, where a newborn dies every 30 minutes and a stillbirth occurs every hour in certain regions, AI-driven solutions are perceived as a crucial lifeline in reducing maternal and child mortality rates.
Research published today by Min’enhle Ncube from the University of Cape Town notes that as maternal health AI solutions emerge, they risk replicating biases due to datasets that do not reflect culturally relevant practices and knowledge. Her study, “Incomplete Chronicles: Unveiling Data Bias in Maternal Health,” focuses on one such AI solution, the DawaMom app developed by Lusaka-based startup Dawa Health.
The DawaMom app provides personalized healthcare guidance to expectant mothers. Dawa Health clinicians conduct home visits to build this dataset, gathering information on demographics, medical history, and key health indicators. AI then processes this data, identifying patient risks and enabling early interventions for high-risk conditions.
Ncube’s research is supported by Mozilla as part of the Africa Mradi grants supporting interdisciplinary researchers investigating the impact of AI on communities in Eastern and Southern African regions.
In her study, Ncube reviewed samples from national institutional archives and assessed historical and current maternal data collection practices in three countries: Zambia, South Africa, and Zimbabwe. She also analyzed the data collection processes used by the DawaMom app, conducting interviews with healthcare providers and mothers (including some DawaMom users) to understand how these processes shape the creation and sharing of maternal healthcare knowledge.
The report underscores that, while promising, AI can inadvertently replicate biases present in the data it relies on. In Zambia, the lack of resources to collect, digitize, and clean data leads to gaps in representation. To supplement the low volume of available datasets, startups like Dawa Health turn to open platforms like Kaggle, an online community for finding and publishing datasets. While Kaggle offers freely accessible datasets, their representation is limited, lacking context relevant to the Zambian population. Min’enhle Ncube, researcher and author of the report, says: “For AI to truly improve healthcare in underserved communities, it must reflect local realities. That means incorporating diverse datasets and traditional practices.”
The study also highlights how the absence of indigenous and traditional maternal health practices in these datasets risks sidelining the cultural knowledge crucial to many rural and underserved communities.
Despite its strong biomedical underpinning, the DawaMom app may unintentionally homogenize maternal healthcare by focusing too narrowly on standardized biomedical practices and on data collected primarily in the Global North. “This reflects broader challenges in data collection, where archivists and institutions often decide what data to preserve based on historical, cultural, and evidentiary value — decisions that may reflect certain biases or perspectives,” Ncube states.
Key Findings and Recommendations
The gender-digital divide limits digital healthcare access. Many women in rural areas face barriers to accessing and using the DawaMom app due to low literacy levels, limited access to mobile phones, and poor internet connectivity. Features such as voice-based navigation could make the app more accessible.
AI bias risks widening healthcare gaps. Prevailing approaches to defining health data are predominantly Western-centric, heavily biased toward biomedical metrics, and often neglect the expansive insights offered by marginalized knowledge systems. While AI tools like DawaMom have the potential to revolutionize maternal healthcare, they also risk reinforcing existing biases. The report found that DawaMom’s datasets have little consideration for local, traditional maternal care practices that are common in Zambia.
Develop context-specific AI definitions and algorithms. Defining “artificial intelligence” in a way that recognizes local differences will ensure the technology is adapted to Zambia’s unique maternal healthcare needs, rather than applying a one-size-fits-all approach. AI systems like DawaMom should be tailored to the local context by using algorithms that reflect the specific datasets and cultural nuances of the regions where data is collected.
Continuously collect more representative, inclusive, and ethical AI datasets. AI systems like DawaMom must adopt ongoing, collaborative data collection strategies that incorporate more diverse and representative datasets, including traditional remedies, so the app serves the full spectrum of maternal healthcare needs in Zambia. This means working with diverse and underrepresented communities, healthcare providers, and traditional healers to align AI systems with local practices and values, creating more holistic maternal care solutions for women in Zambia. It will also reduce the risk of AI systems relying on standardized, external datasets that do not reflect Zambia’s cultural and healthcare realities.
Strengthen data legislation and regulation. Stronger legislation is needed to ensure AI systems like DawaMom are built and deployed in ways that safeguard the rights and health of women. Regulatory bodies such as the Zambia Information and Communications Technology Authority (ZICTA) should model frameworks on data practices from a grassroots approach, incorporating voices from marginalized communities.
Press Contact: Tracy Kariuki: [email protected]