In a conference room with blinding white fluorescent lighting and bare concrete walls at Facebook's Stockholm office sits Siobhan Cummiskey. She is a content editor at the company, which means she determines which posts may appear on the site and to what extent, but also which should be removed because they are considered to contain hate speech, harassment or offensive images.
It is the closest thing to an editor at a platform that has consistently disavowed all publisher responsibility for what ends up on the site.
People at Facebook work on reviewing the millions of posts published each week. Between July and September last year alone, the company removed three million posts containing hate speech, 750 million fake accounts, and nine million posts with sexual content or nudity involving children. Pro-terrorist propaganda is also common: between April and June it removed 9.4 million posts containing it.
Using a combination of automated technology and human reviewers, the company tries to find such content before it appears on the site.
– 99 per cent of terrorist content is found by the AI and removed before anyone has had time to report it to us. The automation is less effective in cases that require more knowledge of context, such as hate speech and harassment, where the figure is around 52 per cent before they have been reported, says Cummiskey.
Critics argue that they do not act quickly enough. In Sweden, a man's live broadcast stayed up on Facebook for ten hours, in which he recounted how he had murdered his two children and intended to take his own life. And in Myanmar, the armed forces were able to turn the social network into a tool for ethnic cleansing, posting propaganda that incited murder, rape and the largest forced displacement of the modern era, as the New York Times reported.
– We learned a lot from that experience. We were not quick enough to act on the content on our platform. But while a few months ago only 13 per cent of such content in Myanmar was removed proactively, the figure for hate speech there is now 67 per cent, so we have made progress.
– People did not report much of the content in Myanmar; it is one of those countries where, for various reasons, there is a culture of not reporting content. We have also hired more Burmese-speaking people in the review department and improved how reports are handled, because we saw that some reports were not reviewed properly. So we have made several improvements, but we realize we should have acted faster.
Cultural content has also led to remarkable judgments on Facebook. Among other things, works by the artist Anders Zorn have been removed twice: a picture of ”Girl in the loft” was taken down in 2011, and in 2015 images from an upcoming exhibition at Fotografiska, ”Zorn and the camera”, were removed. In both cases, the nudity policy was cited.
– I do not know about that case. But we employ people from the countries in question so that they can recognize cultural expressions, slang and the political context of what has been published, in order to interpret it and make the right decision.
”Transparency is always the first step towards improvement,” said Siobhan Cummiskey. Photo: Hampus Andersson
The policy Facebook establishes applies to all countries where the platform operates, she says. But how do you set a global standard, given the cultural differences between countries?
– We understand that a policy cannot make everyone happy, so we try to be fair and clear. In some countries nudity is acceptable, while in some parts of the world even a photo of a woman in a bikini is not viable.
– There are also parts of the world where, if you are a woman and linked to a platform that contains nudity, you can end up in a dangerous situation. That can mean women are afraid to appear on the platform, and we want our platform to be open to all. We collaborate with international experts in areas such as terrorism and child safety, so that we do not set limits according to our own taste, but with their help.
One notable case is the world-famous picture from the Vietnam war in which the nine-year-old girl Phan Thi Kim Phuc runs naked and napalm-burned through a village after the American bombing raids. That picture, too, was taken down with reference to the nudity policy, which came across as clumsy and blind to context.
– The war photo was an interesting moment for our policy, which says that you may not post nude pictures of children. Most would agree that is a good policy for an internet platform. But this was an iconic image that changed the world in its time. So we restored the image and allowed it. I can't remember whether it was taken down by a human or automatically, but my guess is that it was a human.
The platform's feed rewards content that generates many interactions, which means that posts that provoke or spread discord are also seen more often and more prominently. This is said to have had a major impact in, for example, election campaigns. I ask what she thinks about it.
– Mark Zuckerberg recently talked about content that is inflammatory and borderline but does not violate our policy. Such posts are now downgraded on a ranking scale so that they do not get as many views and are not highlighted as before. We are developing systems for this now. Just as in the real world, the sensational gets more engagement than other things, and we are working on a ranking system for that content.
In April last year, Facebook published its own guidelines for what content is considered inappropriate. Critics argue that this handed an answer key to those who want to be abusive while staying just within the company's limits of what is acceptable.
– We have always striven for more transparency, so it is important that people can see what our rules are, in order to be able to challenge them. If you tell people who have decided to be abusive on our platform what the rules are, there is always a risk that they will know how to behave and get away with it. That was our only hesitation. But we decided to publish them, and people were happy to see them. Transparency is always the first step towards improvement.
The language is constantly evolving, not least hate speech, where expressions and symbols constantly acquire new meanings, for example in how emojis are used to mock political opponents. I ask how they keep abreast of the linguistics of hate speech.
– We collaborate with many experts, for example a professor who works with emojis. He supplies us with information that we can add to the policies for our reviewers. We also invite lecturers on hate speech and counter-terrorism who talk about trends, so that we keep ourselves updated on those topics.