By Cherie Oyier
The 2024 Digital Rights and Inclusion Forum (DRIF) addressed a critical issue: gendered disinformation. This harmful tactic silences women online and restricts their digital rights.
KICTANet’s Cherie Oyier joined a panel discussion to share insights into the different forms of information disorder (disinformation, misinformation, and malinformation) and how they are used to silence women.
The panel was also composed of Angel Minayo (Article 19), Muthuri Kathure (Mozilla), and Miriam Beatrice (Paradigm Initiative).
Further, they explored the concept of “gendered disinformation” and its effects, including self-censorship and online gender-based violence (OGBV).
Information Disorder Decoded
Information disorder is the umbrella term for disinformation, misinformation, and malinformation. Although “fake news” is often used loosely in everyday conversation to cover all of them, the three forms are nuanced and distinct.
Disinformation refers to false content shared with malicious intent to cause harm.
Misinformation refers to false content shared without malicious intent to cause harm.
Lastly, malinformation refers to genuine content shared with malicious intent to cause harm. For instance, the non-consensual sharing of intimate images falls under this category.
It is important to note that during coordinated online attacks, all three forms of information disorder may be deployed at the same time, making the distinction even harder for observers to draw.
Weaponising Disinformation on the Basis of Gender
During the panel discussion, Cherie discussed how disinformation is weaponised to silence women online, hence limiting their inclusion and digital rights.
Disinformation is described as gendered when it targets a person on the basis of their gender or gender expression. Gendered disinformation advances and reinforces gender stereotypes and misogynistic, transphobic, and homophobic narratives.
Online conversations that perpetuate gendered disinformation often deviate from discussing the substance, content, or ideas held by the targeted women.
Instead, perpetrators attack their targets’ morality, intellect, competence, or integrity, drawing on tropes that women are too emotional, untrustworthy, or hypersexual.
The result is that women are often left too afraid to continue using the platforms where such attacks were meted out to them, and so they self-censor. Women in public life, such as politicians, human rights defenders, and journalists, have been observed to be disproportionately affected by gendered disinformation.
However, a new trend has also been observed: women simply voicing their opinions online are increasingly targeted.
The Intersection of Gendered Disinformation and OGBV
Gendered disinformation can ignite and fuel different forms of OGBV offences, including body-shaming, cyberbullying and trolling, among others.
Consider a fake nude photo of a female politician shared to tarnish her image, and the conversations it would elicit. Most likely, recipients of the image would not verify its authenticity and would immediately resort to trolling and body-shaming the woman. This is the reality for many women online.
Aside from self-censorship, such women may suffer from psychological trauma, as indicated in KICTANet’s latest research on OGBV.
RELATED
- Anthology of stories – “Narratives of Strength and Survival” – documenting the lived experiences of Kenyan women and girls who have personally faced online gender-based violence.
Hurdles on the Road to Curbing Gendered Disinformation
As technology continues to evolve and bring new benefits, it also presents new challenges, and gendered disinformation is no exception to this double-edged phenomenon.
Disinformation often carries a shock value that evokes curiosity among the masses. With platforms such as X rewarding users based on the number of engagements, ever more outrageous content is surfacing on them. Because of that shock value and the curiosity it evokes, algorithms further amplify content that perpetuates gendered disinformation.
Malicious actors no longer have to spread gendered disinformation on their own. Instead, they can deploy bots to mount coordinated attacks and push gendered disinformation to a wider audience in the shortest possible time.
Panellists decried the fact that countries in the Global South still face additional challenges, such as slow action by platform owners when gendered disinformation is reported.
Further, the use of diverse local languages to perpetuate gendered disinformation on these platforms remains a challenge, because platform owners have yet to prioritise putting adequate linguistic experts and content moderators in place to flag these offences.
Way Forward
Gendered disinformation is a complex problem that requires a multi-sectoral approach to curb. During the panel discussion, the panellists agreed that online platforms must prioritise content moderation in the diverse languages used in the Global South.
Prioritisation entails setting aside funds to hire and train more content moderators and linguistic experts in these regions, and to ensure that reporting mechanisms are clear and transparent.
Organisations working on women’s digital rights should continue to advocate for the eradication of gendered disinformation to ensure women’s full participation and inclusion in online spaces. Training women in digital resilience is therefore imperative.
Further, given the psychological toll that accompanies gendered disinformation, such training should include psychosocial support sessions.
In closing, Cherie also highlighted the work KICTANet has been doing to advocate for legislative reforms to include the different forms of OGBV offences in the existing ICT laws.
KICTANet is also launching an OGBV tracker to map OGBV incidents across Africa and use the findings for research and policy advocacy.
Cherie Oyier is KICTANet’s Programs Officer for the Women’s Digital Rights Program.