Friday, July 20, 2018

Facebook's plan to kill dangerous fake news is ambitious – and perhaps impossible.

New policy to tackle content that could fuel violence may be well-meaning, but the complexity of the task is mind-boggling.

 
Facebook faces a particular challenge in WhatsApp, on which dangerous rumours are spread through encrypted messages. Photograph: Christophe Morin/IP3/Getty Images

Facebook has been grappling with its role in spreading false news and disinformation for a few years, but a spate of mob violence in India, Sri Lanka and Myanmar has spurred the social network into a knee-jerk policy change.

Until now, Facebook has dealt with disinformation by making it less prominent in people’s news feeds. This week, the company announced it would start to delete inaccurate or misleading information created or shared “with the purpose of contributing to or exacerbating violence or physical harm”.

On the face of it, it seems like a reasonable and well-intentioned policy. However, the lightest interrogation reveals a mind-bogglingly complex and thankless task.


In addition, any successes will be undermined by the fact that much of the inflammatory misinformation in South Asia is being spread through Facebook’s sister platform WhatsApp, where encryption makes content moderation impossible.

The policy change will first be implemented in Sri Lanka, where pernicious falsehoods on the platform, such as the allegation that Muslims were putting sterilisation pills into food intended for the country’s Sinhalese majority, have stoked riots, beatings and the destruction of mosques and Muslim-owned businesses. The Sri Lankan government temporarily blocked Facebook services in March in an effort to defuse the situation.

Facebook said it was working with local civil society groups to identify which content might contribute to physical harm. Once the company has verified that information is false and could be a contributing factor to “imminent” violence or harm to physical safety, Facebook will take it down.

Last month, the company said, it removed content that falsely claimed Muslims were poisoning food. The company would not reveal the exact content it had removed, nor the names of the civil society groups it was working with. A representative from the Centre for Policy Alternatives, one of the more vocal civil society groups in Sri Lanka, said: “This is not something we were told about.”

Even if Facebook cracks misinformation on its main platform, it has a trickier problem on its hands with WhatsApp

The policy announcement appears to have been rushed out to provide some “news” for dozens of non-US journalists whom Facebook had flown in from Europe, Asia and Latin America for a day-long media event at the company’s Menlo Park headquarters on Wednesday.


Source: https://www.theguardian.com
