
“I’d like to know more about the data collection and processing policies that we consent to as we share information online, and at the same time identify how gender issues can be factored into these. I would really like these apps to word their data terms and conditions in a user-friendly way. App owners and developers make people consent to the use of their technology without making them read the fine print. They need to better their standards to safeguard their users” – Participant in research project


By Chenai Chair and Tinatswe Mhaka


Existing in a datafied society means that, whether one is connected to the internet or not, the different social actions people are involved in within their respective communities become data points that can be easily monitored, tracked and optimised. The complex nature of datafication, and the few choices people have over how their data is collected and processed, can be deeply frustrating, leaving some feeling defeated and dejected. These concerns with data are more pronounced at intersectional points of inequality – for example, the gender one identifies with, the class one belongs to, educational background, access to resources and sexuality.


We carried out research to respond to the current discourse on data by collaborating with African feminists to imagine resistance to datafication. We chose to focus on a community of people that engages with forms of gendered inequality each day – an African feminist community formed of women, LGBTQIA people and non-binary people.


How does one build resistance to datafication with those already at the front lines of fighting gender inequality?


The research took an inquisitive and flexible approach in the hope of understanding the extent to which African feminists in the region engage with issues of gender, privacy and data. We spoke to several feminists from Malawi, South Africa, Zambia and Zimbabwe to document their experiences with data and explore ways of resisting violations of their digital rights.


Through this research, we sought answers to the following questions:

1) What are the multiple and intersecting understandings of the relationship between gender, privacy and data by African feminists?

2) How do African feminists navigate a datafied society?

3) What are feminist-centred ways of resistance in datafied societies?


Ultimately, imagining resistance begins with collective action: using the power of the network to build social movements, and participating in shaping the public space online and the policies that govern the internet – (Feminist Principles of the Internet on Movements)


Download and read the research paper here:

Chair, C and Mhaka, T. (2022) “Where my data at?”: Imagining resistance to data processing with African feminists. My Data Rights.


Listen to the podcast, made in collaboration with the Feminist Bar:





The project is supported by the ADAPT project from Internews.

The existing injustices that marginalised groups and sexual minorities face shape how they experience technology, policy and regulation – which can either advance social justice or deepen current inequalities and harms. In the era of AI-based innovations such as automated decision-making and algorithms, this context is important for understanding how we can safeguard against harms and injustices while using these technologies to dismantle barriers of oppression. The development, implementation and governance of AI-based innovations, especially around the data that is fed into them, raise important questions about their impact on society and digital rights.

The key policy questions asked were:
  1. What would a gender-responsive data protection and privacy law entail to ensure gender safeguards against AI gendered harms?
  2. How can civil society play a role in ensuring a gender transformative law and practice with a focus on the right to privacy and data protection?

The major highlights from the research are:

AI discourse in South Africa
Policy conversations on governing AI have begun in the country, with a focus on ensuring economic gains through the creation of employment opportunities by upskilling citizens, being innovative enough to attract global business, and achieving legal and regulatory compliance for global trade. However, concerns have been raised that this discourse around AI risks perpetuating digital disparities and inequality.
Uneasy Access – Gendered concerns of AI
Issues emerging from a gender perspective include the lack of agency and control over data, the problematic nature of consent in contexts of unequal power dynamics, loss of privacy, and discrimination and bias at the intersection of race, class and gender.
Perceived concerns of AI harms
The research finds a general lack of documentation of harms in the South African context. However, interviews and open-ended survey questions surfaced recurring contextual issues: race, economic status, homophobia and gender-based violence.
Responsiveness of current Protection of Personal Information Act to AI and gender issues
A gendered reading of the Protection of Personal Information Act (2013) found that the current law needs to be contextually responsive, address gender-exclusionary language, and account for the gender and sexuality harms it currently overlooks. There were examples of civil society making use of the law to challenge surveillance; however, a gendered nuance is missing. To be gender responsive means to design and implement policies that consider the gendered realities of the society we live in and ensure that injustices are not replicated as we race towards digital development.

The recommendations stemming from this paper cover four important areas:

1) Policy and regulation:
Policy and regulation would require context-based implementation and assessment of current and future privacy and data protection laws with regard to AI-based innovations. This would be complemented by collaboration between civil society, the legal community and other relevant stakeholders.
2) Research and documentation:
Research and documentation are necessary to fill the knowledge gap in understanding the context-based impact of AI-based innovations. Case studies documenting the impact of AI on marginalised groups would highlight the responsive means necessary to safeguard against harms and injustices. Research may also be used to support the development of governance models and of AI registers that document the proliferation of AI, where it is used and the impact it has. Resourcing these initiatives to ensure they remain public and open is important.
3) Public awareness:
Public awareness requires collaborative, relatable and innovative campaigns to raise understanding of the opportunities and challenges of AI with regard to privacy and data protection. Public awareness would focus on campaigns for diverse marginalised groups, the creation of collaborative spaces that are safe for gender and sexual minorities to learn and raise their concerns, and resourcing from different stakeholders to ensure the support needed for public participation.
4) Responsibility of technical community:
The technical community carries the responsibility of engaging the public and sharing information on how their systems work, to ensure accountability and trust in AI-based innovations. Civil society can be drawn into the development of ethical guidelines that are cognisant of marginalised groups' experiences of injustice. In the design and implementation of AI-based solutions, digital literacy, privacy by design and context responsiveness should form the underlying guiding principles, so that solutions do not lead to further social injustices.