Facebook's artificial-intelligence software, which is designed to detect violence and other prohibited content in live streams on the platform, did not react to the video of the Christchurch massacre.
"To achieve this, we need to feed our systems large amounts of data of exactly this kind of content, which is difficult, as such events are thankfully rare," the online network said on Thursday.
Another challenge for the software is distinguishing real violence from broadcasts of video game scenes. "If our systems sounded the alarm for thousands of hours of live-streamed video games, for example, our reviewers could miss the important videos from the real world," the ones where Facebook's staff could actually raise the alarm.
The attacker, who killed 50 people last Friday in attacks on two mosques in Christchurch, New Zealand, broadcast the assault in real time on the service Facebook Live.
Hundreds of thousands of uploads got through
The company repeated earlier statements that the 17-minute live stream was seen by fewer than 200 users and that the first user report reached the online network 12 minutes after the broadcast had ended. After a live stream ends, a recording remains available.
It remains unclear how long the attacker's original video stayed online before Facebook removed it. The online network has said the report would have been processed faster if someone had flagged the video while the stream was still live.
The original video was viewed around 4,000 times. Its later spread was aided by the fact that several users had uploaded copies to other services.
In the first 24 hours, Facebook's software blocked 1.2 million attempts to re-upload the video, but it also let around 300,000 uploads through. This was due, among other things, to the fact that the company was dealing with more than 800 modified variants of the video. (Dec/sda)
Created: 21.03.2019, 13:06