By Valarie Waswa
On September 4, 2024, the United Nations Population Fund (UNFPA) hosted a virtual talk on “Tech for Good: Feminist AI.”
This event was aligned with the International Conference on Population and Development (ICPD30) Global Dialogue on Technology and the UNFPA Think Piece titled “A Safe Digital Future.”
The talk focused on the intersection of technology and women’s sexual and reproductive health and rights (SRHR). Specifically, it explored how femtech solutions can enhance personalized SRHR information and services.
Additionally, the event aimed to discuss the importance of creating a safe and inclusive digital environment for women of all diversities.
Cherie Oyier, KICTANet’s Programs Officer for Gender Digital Rights, spoke on the importance of data protection and privacy in femtech solutions.
She highlighted how femtech solutions, including chatbots, wearables, hotlines, and apps, have significantly increased women’s access to health services. Solutions such as menstrual tracking apps empower women to take control of their bodies and make informed decisions about their reproductive health.
Challenges and Opportunities
However, we cannot turn a blind eye to the underlying data privacy challenges that come with using these femtech solutions.
Considering that most of the data collected by these femtech SRHR innovations is deeply personal, sensitive, and intimate, these solutions must meet an even higher threshold of adherence to data protection guidelines.
This is sadly not the case with some, if not most, of these femtech solutions. For instance, some of these apps have no privacy policy at all, or, where one exists, it is deliberately drafted in technical jargon so that users cannot understand what they are consenting to do with their data.
Furthermore, consent to the collection and processing of users’ data by these platforms and apps is often conditional: if users decline to give consent, they may be unable to access some (or all) of the app’s functionality.
Cherie also highlighted the issue of deceptive designs meant to collect more data. These may take the form of excessively long policies, the use of low-contrast colours to reduce visibility, and hidden or difficult-to-find opt-out options, to name a few.
These privacy concerns pose several risks, for example the sharing of women’s data with law enforcement agencies. This may happen where, for instance, women who use menstrual cycle apps, whose data may at times include information about pregnancy and abortion, are targeted in countries where abortion is illegal.
This heightened surveillance and the risk of prosecution may push women toward unsafe abortions, which pose even more dire risks to their health, including death.
The presentation suggested building the capacity of both tech developers and users on data protection principles, the rights of data subjects, deceptive designs, and the risks associated with processing personal data.
Other recommendations included advocating for pro-feminist and pro-choice policies and laws, imposing stringent sanctions on platforms and apps that fail to comply with data protection laws, and placing a duty on individual users to research and choose the most privacy-preserving technologies.
Emily Springer, a leading expert in feminist AI, presented the key principles that guide the development of AI systems that are inclusive and beneficial for women.
These principles include intersectionality, holism, historical context, participatory and culturally relevant design, and a focus on liberation.
Springer emphasized the importance of individual action in shaping the future of AI. She urged individuals to actively participate in imagining, funding, building, designing, training, testing, deploying, and maintaining AI systems that effectively serve the needs of women in all their diversity and realities.
Conclusion
Looking ahead, femtech innovations have immense potential to increase and improve women’s and girls’ access to SRHR, but only if properly regulated to avoid perpetuating existing historical injustices, biases, and systemic oppression.
Ms Valarie Waswa, Legal Fellow – KICTANet