
650,000 ISIS accounts deleted: How social media tackle fake and hate

by Jane Whyatt and Ana Ribeiro

Leaked documents published by The Guardian (UK) have revealed the rules that Facebook uses to decide whether to block or take down fake news and hate speech. At the same time, pressure is mounting for stronger regulation of social media in general.

Media freedom campaigners welcome the leak as a move towards greater transparency.

Aidan White, Director of the Ethical Journalism Network (EJN), comments that the internet giant has grown "eye-wateringly rich" by selling Facebook users' personal details. White says the business model relies on generating the maximum number of likes and clicks, and the success of this and other social networks has damaged traditional journalism.

The Guardian itself announced in March 2016 that 250 journalists' jobs were being cut at the newspaper and its sister publication, The Observer.

Facebook's response to the leak: spokesperson Monika Bickert has published a column in The Guardian in which she says:

"We believe the benefits of sharing far outweigh the risks."

She promises to recruit more staff to keep up with the thousands of posts which may need to be moderated following complaints from other users. The social network also aims to provide a technological solution through artificial intelligence.

However, the limitations of its current system are clear to anyone who reads the leaked guidelines. This was illustrated in 2016, when Facebook took down an iconic photograph from the Vietnam War: a naked little girl, Phan Thi Kim Phuc, her back on fire after a napalm strike on her village by the South Vietnamese Air Force.

Screenshot from a New York Times article on Facebook's removal of the iconic Vietnam War photo, which failed to pass the platform's nudity filters, from Norwegian newspaper Aftenposten's timeline. Facebook often makes the news these days. Pictured: Espen Egil Hansen, editor-in-chief of Aftenposten.

This Pulitzer Prize-winning picture shocked the American public and is believed to have been a turning point in public support for the war. As such, it has great historical significance.

But as a horrific image of a naked child, it failed Facebook's tests for suitability - which did not take its place in history into account - when the Norwegian daily Aftenposten re-published it on its Facebook page. This made news internationally; Facebook later restored the photo.

Cultural nuances such as these make it impossible for automated solutions to make appropriate decisions every time. And Facebook apparently does not hire enough humans to cope with the task.

EJN's Tom Law told the ECPMF:

"They [Facebook] just don't have enough moderators. It is a very difficult job and we have heard that there is a high turnover."

The EJN points out that even if Facebook keeps its promise to hire an additional 3,000 moderators on top of the existing team of 4,000, that will still amount to only about one moderator for every 250,000 users. Moderators have to cope with horrific images, clever fakes, posts filled with abuse, pornography and images of children being abused.

This makes it a very stressful and time-consuming job, if moderators are to make the correct judgment in each case. The pressure on them is increasing, notably in Germany, where hate speech is in most cases illegal in light of the country's Nazi past.

Proposed law in Germany

The cabinet of German Chancellor Angela Merkel recently approved a bill named the Netzwerkdurchsetzungsgesetz (Network Enforcement Act, NetzDG for short). It is aimed primarily at Facebook, which German Justice Minister Heiko Maas has targeted over the past months for its perceived failure to delete hate speech and fake news efficiently enough.

Germany's crackdown on cyberhate largely targets far-reaching Facebook, which has landed in hot water for allegedly allowing illegal hate speech on its platform. (Photo: public domain)

The law would subject Facebook and other social networks to fines of up to €50 million for failing to curb hate speech and fake news within a time frame of between 24 hours and seven days. It is set to go before the German Parliament this summer and has drawn criticism from civil rights and journalists' organisations.

Some have argued that the time frame is too short, and that decisions on what is allowed to be posted on social media should not be left up to corporations. There are also concerns that this would amount to unbridled censorship.

Pro-ISIS tweets

Twitter, too, is now coming under the spotlight for potentially dangerous posts.

Twitter's Nick Pickles told the ECPMF: "We have taken down 650,000 pro-ISIS accounts" in an attempt to rid the network of dangerous users. But when questioned about the fake tweets claiming that people had lost relatives in the Manchester suicide bombing, he admitted that it is harder to spot suspect tweets in such circumstances.

One tweet from a controversial columnist, Katie Hopkins, was reported to the police. In it, she called for "the final solution" as an answer to terrorism. This was an obvious reference to the Nazi Holocaust, and drew widespread condemnation in English-language media.

Tom Law, EJN. (Photo: ECPMF)

Nick Pickles encourages all users to pass on to the police any tweets that may be illegal and to check, report to Twitter, block or mute suspect or offending posts. He pointed out that Twitter is working with the European Commission's monitoring project, and publishes transparency reports twice a year. These show that many governments make requests for Twitter accounts to be blocked or posts removed: Turkey heads the list in 2017.

The Guardian leak of Facebook's internal guidelines has opened a dialogue between established and social media. And this can only be a good thing, according to Tom Law from the EJN: "For years we journalists have been trying to learn and use social media. And now it seems the social media must also learn to behave like journalists."


More

The briefings to the ECPMF from the EJN and Twitter were delivered as part of the 2017 #mediaagainsthate workshop organised by the Media Diversity Initiative, with support from the European Federation of Journalists (EFJ) and the European Commission. More details soon - watch this space.




