Then it looks like you have everything under control. 🙂
Hope the problem gets solved soonest, for the sake of us all.
Best wishes!
On Tuesday, November 20, 2018, 8:30:08 PM GMT+3, Ebele Okobi <[email protected]> wrote:
The problem set is a platform of 2.4 billion people posting billions of pieces of content. We have some proactive review of things like child-exploitation content, but it is really difficult to pre-review every single piece of content at this scale. It’s also not even what a majority of users want. I personally use FB a great deal, and I would not want that.
Pre-reviewing everything is, in fact, what would be necessary to ensure that *no* bad content is posted. This is an issue we have spent a great deal of time thinking about with multiple experts, and while we keep adding ways of identifying content, a state of no bad content ever posted, or a system that would make FB itself aware of every single piece of content, raises a genuine question: how do you think that would work?
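To make the scale argument concrete, here is a rough back-of-envelope sketch of what universal pre-clearance would demand. All inputs are assumptions for illustration (the volume is the low end of the "billions of pieces of content per hour" figure cited later in this thread; per-item review time and shift length are guesses, not Facebook figures):

```python
# Back-of-envelope: staffing needed to pre-clear every post before it goes live.
# All inputs below are illustrative assumptions, not Facebook's actual numbers.

POSTS_PER_HOUR = 1_000_000_000   # low end of "billions of pieces of content per hour"
SECONDS_PER_REVIEW = 10          # assumed time for a human to judge one item
SHIFT_HOURS = 8                  # assumed reviewer shift length

review_seconds_per_hour = POSTS_PER_HOUR * SECONDS_PER_REVIEW
concurrent_reviewers = review_seconds_per_hour / 3600      # reviewers needed at any instant
total_staff = concurrent_reviewers * (24 / SHIFT_HOURS)    # round-the-clock coverage

print(f"Concurrent reviewers needed: {concurrent_reviewers:,.0f}")  # ~2,800,000
print(f"Total staff for 24h coverage: {total_staff:,.0f}")          # ~8,300,000
```

Even under these fairly conservative guesses, pre-clearing everything implies a review workforce in the millions, which is the infeasibility being pointed at here.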
Saying “private tech platform” doesn’t answer the question: detecting crime is still detecting crime and bad actors, and it’s still not clear from that phrase how you think this would work in real life.
We pay multiple consultants and confer with thousands of rights activists, safety advocates, CSOs, law enforcement, etc., and it’s not apparent from this exchange that you are an expert in any of the topics discussed, so the notion that my genuine interest in your opinion amounts to a request for free consulting is odd.
On Nov 20, 2018, at 5:16 PM, Patrick A. M. Maina <[email protected]> wrote:
Okay, in-line answers below…
On Tuesday, November 20, 2018, 7:52:11 PM GMT+3, Ebele Okobi <[email protected]> wrote:
You haven’t answered any of my questions.
I have repasted them, for reference:
Would it be preferable to have a platform where every single post, picture, comment is subject to pre-clearance by Facebook?
A: Are those the only options? Is that the only alternative to the status quo? What happened to open innovation? Ideas for better *technical* solutions exist, but the good ones are not free.

How do you think policing works in society?
A: Policing on FB is not analogous to policing in society. FB is a private tech platform.

Are there police assigned to each individual, actively monitoring each?
A: Pls. see the answer above.

How do you think actual communities work?
A: Pls. see the answer above.

If not for community, exactly how should a platform of 2.4 billion people posting billions of pieces of content per hour, the vast majority of which is completely innocuous, work, in your view?
A: I could tell you, but that would be free consulting. :-)

What is an “educating” model?
A: Again, I can’t do free consulting for billion-dollar companies.
FB could try an “innovation competition” to get a cheap/free brainstorm on the issue, but I think people globally are wising up to the odds around such events, so the quality of ideas is going down. Probably another area that needs new thinking.
Enjoy your evening! :-)
Patrick.
On Nov 20, 2018, at 4:45 PM, Patrick A. M. Maina <[email protected]> wrote:
On the “educating model”, I can do some ad-hoc paid consulting for you guys if you haven’t thought of it. Let’s discuss offline if interested.
I don’t understand the society argument… Facebook is not “society”. It is a *for-profit business entity* founded on what looks like a predatory business model, one which exploits human and societal weaknesses (e.g. narcissism, personal insecurities, reward mechanisms in the brain, etc.) for monetized data and engagement.
There’s an interesting pattern that I hadn’t originally picked up on… It looks like mega-corporations embrace pseudo-communist ideals to avoid owning problems that *they* created or exacerbated: “Beloved users, the problem belongs to all of us, because we need each other as a community. So each one of you should give mega-corp a free lunch because it’s good for you.” But when it comes to profits, they revert to pure capitalism: “Our profits belong to shareholders only. We are capitalists. No free lunches!”
Take ownership.
Opinion today: Facebook’s excessive focus on profits
Regulation looms for social media, much as for big banks after the financial crisis
On Tuesday, November 20, 2018, 6:53:35 PM GMT+3, Ebele Okobi <[email protected]> wrote:
What is an “educating” model?
How do you think policing works in society?
Are there police assigned to each individual, actively monitoring each?
How do you think actual communities work?
If not for community, exactly how should a platform of 2.4 billion people posting billions of pieces of content per hour, the vast majority of which is completely innocuous, work, in your view?
On Nov 20, 2018, at 3:43 PM, Patrick A. M. Maina <[email protected]> wrote:
It kinda looks a bit like a *nudged* duty… 🙂 Otherwise, why would low levels of reporting be an issue if the system does not heavily *rely* on community reporting (yay! free labor!)? Doesn’t the community have its own engagements to focus on, and should Facebook not respect that?
Are you sure that FB truly supports free expression, or is it just that it’s cheaper (more profitable) to offload policing to the community?
If the FB platform truly supported “free expression”, things would be more complicated, because instead of takedowns you would use an *educating model*, which is much harder to pull off.
Sidenotes:
a. I’m curious how FB defines “free expression”.
b. On the child marriage, FB acted after it was too late (the girl had already been sold off). This suggests heavy reliance on community policing. Is this a form of wilful negligence on the platform’s part, given that by now FB is aware that it is being misused for anti-social purposes?
Good evening.
Patrick.
“We continue to evolve our ability to detect violations on our platforms, but YES, it is YOU, the community, who helps to police the content on Facebook.
I continue to be struck by the incredibly low levels of reporting across our Continent. We continue to develop educational materials, but I am always surprised at how few people, even in circles like this, know to report bad content into the platform.”
On Tuesday, November 20, 2018, 6:14:11 PM GMT+3, Ebele Okobi <[email protected]> wrote:
At no point did I say that the community has a duty to report or that the community is to blame. Facebook responds to the community when they report. This gives the community the power to let us know when something is wrong. Why would anyone *not* want the ability to report?
Would it be preferable to have a platform where every single post, picture, comment is subject to pre-clearance by Facebook? I find it odd that anyone interested in free expression would want such a model.
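Since the thread keeps contrasting the two moderation models without spelling them out, here is a minimal sketch of the distinction in Python (purely illustrative; the class, the method names, and the `review` callback are invented for this sketch, not Facebook's actual system):

```python
from collections import deque

def pre_clearance(post, review):
    """Pre-clearance model: nothing goes live until a reviewer approves it."""
    return post if review(post) == "ok" else None

class ReportDriven:
    """Report-driven model: content goes live immediately; community
    reports pull it into a queue for after-the-fact review."""
    def __init__(self, review):
        self.review = review
        self.queue = deque()   # posts awaiting human review
        self.live = set()      # posts currently visible

    def publish(self, post):
        self.live.add(post)    # no gate before publishing

    def report(self, post):
        if post in self.live:
            self.queue.append(post)   # a community report triggers review

    def process_reports(self):
        while self.queue:
            post = self.queue.popleft()
            if self.review(post) != "ok":
                self.live.discard(post)   # takedown after the fact
```

The trade-off the thread is circling: the first model gates every post on reviewer throughput (see the arithmetic sketched earlier), while the second scales with the volume of *reports*, which is why low reporting rates matter so much to the platform.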
From: kictanet <[email protected]> on behalf of “Patrick A. M. Maina via kictanet” <[email protected]>
Reply-To: “Patrick A. M. Maina” <[email protected]>, KICTAnet ICT Policy Discussions <[email protected]>
Date: Tuesday, November 20, 2018 at 2:44 PM
To: Ebele Okobi <[email protected]>
Cc: “Patrick A. M. Maina” <[email protected]>
Subject: Re: [kictanet] [should the victims be blamed? aren’t platforms responsible as enablers and amplifiers?] Child marriage on facebook
Some responses on this topic raise interesting and important issues:
1. Do social media/messaging platforms play a role in crime, as amplifiers and enablers?
2. Would crimes be harder to pull off if such platforms, through enhanced technical functionality (which might not necessarily be profitable), could not be easily used for organized criminal purposes?
3. Does the community owe the platform a duty to report (as alluded to here, such that the community can be blamed for platform misuse)? How much blame does the community share?
4. If indeed the community has a duty to help FB police its platform, will FB also share its revenues with the community, seeing as they are its informal “employees”? Or are they only buddies in bad times but strangers in good times?
5. Do (or should) victims of social-media-enabled harm (including, say, businesses that lose sales due to chaos, or governments whose economies are effectively sabotaged) have recourse against the platform owner? To what extent? Who else should own the problem, and why?
I think the “deflect blame to the victims” script is unwise and could backfire. It would probably cause an uproar if used in more assertive parts of the world (i.e., in developed countries and regions).
Good day, listers,
Patrick.
On Tuesday, November 20, 2018, 3:52:31 PM GMT+3, Wainaina Mungai via kictanet <[email protected]> wrote:
Hi,
Facebook has increased its staff significantly to help police what is posted. We may want to not blame the medium used and instead focus more on addressing the culture of marrying off children of any gender in any country. That way, we remain focused on ‘children’s rights’.
The main offenders in this case are the “sellers” and “buyers” who took part in the auction.
In the end, the extent of regulation will depend on multistakeholder negotiations on the balance between an open Internet for all and the need to protect privacy, security and human rights online.
Wainaina
On 20 Nov 2018 15:18, evelyne wanjiku via kictanet <[email protected]> wrote:
Hi listers,
I’m following a debate on CNN about this South Sudanese ‘baby bride’ who was auctioned on FB.
It brings me back to this question: who should regulate Facebook? Some argue FB is too big to regulate all the things that happen on its platform.
Who should police FB? Is it us? We have the power to shut down our pages if we don’t agree with what goes on in there… but we don’t. Why?
Is it Facebook? Do they care about being responsible, especially in Africa?
Is it government? And just how far can the government reach?
Or should we just relax and face the beginning of the end, with an attitude of “anything goes as long as we have internet”?
Nice day, everyone.