A summary of discussions from the KICTANet platform
A month ago, a Facebook post advertising the auction of a 17-year-old South Sudanese girl circulated on the platform. She was later married off to a man three times her age, becoming his ninth wife. However we frame it, this is a clear violation of a child's rights. It violates Facebook's own policies as well as South Sudan's laws. Most important, the rights of a vulnerable member of society were violated on a medium owned by a private company, witnessed by thousands of users. Individuals representing organizations allegedly tried to report the post to Facebook, but the company did nothing until it was too late. The incident stirred a serious debate on the KICTANet mailing list about the regulation of content on Facebook.
Community reporting
Facebook relies heavily on proactive community reporting to regulate content. The platform has over 2.4 billion users sharing 2.4 billion posts per hour, so moderating content manually is obviously a huge challenge. Nonetheless, this kind of end-user reporting has helped the company judge content violations in their different cultural contexts. But it is still not enough: Facebook has been criticized for favoring certain sections of the community when arbitrating cases of content violation. Its Head of Public Policy for Africa also notes that Africa is yet to take advantage of the community reporting tool:
I continue to be struck by the incredibly low levels of reporting across our Continent. We continue to develop educational materials, but I am always surprised at how few people, even in circles like this, know to report bad content into the platform.
Ebele Okobi.
This comment sparked a heated debate about the community's responsibility to report incidents.
Does the community owe the platform a duty to report? How much blame does the community share? If indeed the community has a duty to help FB police its platform, will FB also share its revenues with the community seeing as they are its informal “employees” as well? Or are they only buddies in bad times but strangers in good times?
Peter Maina.
The discussion raised three observations in response. First, the nature of content distribution on Facebook encloses users in information bubbles, so a user rarely encounters violating content within their own networks: they share the same opinions, interests, and biases.
Second, societies with social order are those whose norms are instilled in their members as values. This means Facebook has to continuously reach out to its users to influence norms and attitudes, especially because its users are heterogeneous: what is accepted and what is not varies widely between communities.
Third, the government bears the responsibility for maintaining law and order and protecting citizens' rights, with policing as one of its tools. The police can only do so much, because we cannot have them everywhere, intruding into people's activities; so the state relies on society's reaction to deviant behavior to prevent and report crime. The responsibility for preventing crime, however, still lies with the government. In the same way, Facebook will be expected to police criminal behavior on its platform.
Recently, Facebook reported that it has deployed artificial intelligence (AI) to help identify harmful content. Its models have been trained at scale to recognize a wide array of content violations.
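How such automated flagging works is not public, but the underlying technique is standard supervised text classification: a model is trained on posts that human reviewers have already labeled, then scores new posts for review. Below is a minimal, illustrative sketch in Python using scikit-learn; the example posts, labels, and the 0.5 review threshold are hypothetical, not Facebook's actual system.

```python
# Illustrative sketch of supervised text classification for content
# flagging. This is NOT Facebook's system; all data here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training data: 1 = violating, 0 = acceptable.
posts = [
    "Bidding starts now for this young bride",   # violating
    "Selling a used laptop, good condition",     # acceptable
    "Child bride auction, serious offers only",  # violating
    "Community meeting this Saturday at noon",   # acceptable
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a standard baseline classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score new posts; anything above the threshold is queued for human review
# rather than removed automatically.
REVIEW_THRESHOLD = 0.5  # hypothetical cutoff
new_posts = ["Auction for a bride, place your bids"]
scores = model.predict_proba(new_posts)[:, 1]
for post, score in zip(new_posts, scores):
    print(f"score={score:.2f} flag={score > REVIEW_THRESHOLD}: {post}")
```

Even at scale, a classifier like this only triages content for human reviewers; as the incident above shows, the speed and quality of that human follow-up still determine whether harm is prevented.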
Government regulation
Another point of focus in the debate was the government's ability to regulate social networking platforms. Facebook is like a public park or a public square where people convene. Users are motivated to share information by the need to build relationships, while others, such as marketers and advocacy groups, come to share information that influences action, perceptions, and behaviors. In the same digital spaces, users have coalesced to break laws: those who drink and drive form Facebook pages and WhatsApp groups that share the locations of traffic police. The police, in turn, have taken advantage of the community policing these spaces enable to hunt down criminals. The language and contexts used in these spaces make it difficult for Facebook to regulate content to this extent, and here a government role in regulating content would be publicly appreciated. However, Hannington Oduor wonders whether the market and the government are well equipped for the task:
Is the government tech savvy enough to regulate the big ISPs, Facebook, Google etc? Is the market equipped with the right tools to deal with the concern?
Hannington Oduor.
But even if all stakeholders had the tools, what would more proactive government regulation of Facebook look like? Would Facebook allow governments to regulate these spaces, and how would multiple governments regulate content on a single, ubiquitous social utility?
KICTANet is an open platform for organizations and individuals interested in discussing ICT policy matters. The debate is still open, and we would like to hear your perspective on this issue: https://www.kictanet.or.ke/?p=37931