Nearly two months after the terrorist attacks in New Zealand, Facebook has announced measures that it hopes will stop the live broadcast of similar content. The announcement came in a blog post by Guy Rosen, a vice president at the company, who said the platform would now bar users from certain features for a set period of time whenever a violation of the company’s most serious policies was observed.
“From now on, anyone who violates our most serious policies will be barred from using the Live service for set periods of time – for example 30 days – starting from their first offense. For instance, if someone shares a link (with no other content) to a statement from a terrorist organization, they will automatically be blocked from using Live,”
says the Facebook executive. He added that similar restrictions will soon extend to other areas, such as the ability of those users to create ads.
However, it is unclear how quickly Facebook can detect that a live video falls into this category. Nor has it been clarified how offenders will be treated after the exclusion period ends, or what happens if the offenses continue once it expires. It is worth recalling that the debate over Facebook’s live broadcasts began after the attacks in Christchurch, New Zealand.
According to Facebook’s own figures, the video of the attack was viewed 4,000 times before it was finally reported, 29 minutes after the broadcast began. In total, about 1.2 million videos were removed within hours of the attacks, although copies continued to circulate for up to 12 hours after the incident. Tellingly, the live broadcast was not reported to Facebook until 12 minutes after it ended, and the company’s systems failed to block up to 20% of the copies uploaded by users after the event.
In today’s announcement, Facebook concedes that it failed to detect some of these videos because they had been edited. That excuse rings hollow, however, given how heavily the company touts its use of Artificial Intelligence and Machine Learning to keep malicious content off Facebook.