
Facebook's strategy to curb misinformation in the European elections revolves around the detection of fake accounts, agreements with external fact-checkers, and the management of political advertising, according to Mark Zuckerberg's company.

At an event held at its Operations and Data Center in Dublin, the company gathered the international press to showcase the space from which it directs the monitoring and removal of content harmful to the electoral process.

In line with the tools presented ahead of the April 28 general elections in Spain, the social network's ad library will play a key role in controlling the propaganda that political parties distribute on the platform and, therefore, the information that users receive directly.

"The ad transparency tools may pose a greater challenge for politicians because they create intermediate steps they are not accustomed to," explained Facebook's director of Public Policy, Richard Allan.

Those who do not register with these tools and provide Facebook with the information strictly necessary to advertise will see their ads silenced (at least temporarily), as already happened to Ciudadanos and the PSOE before the general elections in Spain.

However, the lack of regulation defining exactly what constitutes a political advertisement creates barriers to removing false or manipulative content, as acknowledged by another of the executives present at the meeting, Tom Reynolds, of the elections Communications team.

Thus, a user will not be able to share a post giving a wrong date for the elections, because it will be removed, but a party may, for example, discredit its opponents over the policies they pursue.

"Information that is simply not true will not be removed from the platform," Allan reiterated.

For the moment, the aim is to "put material in the public domain so that everyone can see what is happening," Reynolds specified, since not all false content will be removed.

Challenges to solve

Reynolds assured that "disruptive lines" in the network's content are constantly monitored by algorithms, but even so, hoaxes and false information will not always be removed from the platform; they may simply be ranked lower and prevented from being shared in bulk, for example.

To identify that kind of information, the company also relies on collaboration agreements established with some twenty European fact-checking organizations working in 14 different languages, run primarily by media outlets.

At the Operations Center in Ireland, spikes in conversation around a particular term (immigration, for example) are studied, along with the accounts that generate content on it.

"Normally we analyze each country individually, but the European Parliament entails a complex exercise, and there is no regulation in this area clearly defining which political issues are susceptible to being falsified, so we do our utmost on our part," Allan added.

Therefore, to analyze political trends, the company is guided, for example, by the issues identified by the Eurobarometer: "We recognize that this is not ideal," Allan insisted.

Mark Zuckerberg's social network also wants to focus on analyzing the origin of the users and the networks of pages that participate most actively in the political conversation, and on determining the authenticity of those accounts, allowing it to eliminate those categorized as "disruptive actors."

This category also includes WhatsApp, managed by Facebook along with Instagram, where a limit has already been implemented on the number of contacts with whom a message can be shared at once (five).