LONDON > Facebook says none of the 200 or so people who watched live video of the New Zealand mosque shooting flagged it to moderators, underlining the challenge tech companies face in policing violent or disturbing content in real time.

The social media giant released new details about its response to the video in a blog post. It said the gunman’s live 17-minute broadcast was viewed fewer than 200 times and the first user report didn’t come in until 12 minutes after it ended. Fifty people were killed at two mosques in Christchurch.

Facebook removed the video “within minutes” of being notified by police, said Chris Sonderby, Facebook’s deputy general counsel. “No users reported the video during the live broadcast,” and it was watched about 4,000 times in total before being taken down, Sonderby said.

Facebook has previously said that in the first 24 hours after the massacre, it removed 1.5 million videos of the attacks, “of which over 1.2 million were blocked at upload,” implying 300,000 copies successfully made it on to the site before being taken down. “We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people.”

The video’s rapid spread online puts renewed pressure on Facebook and other social media sites such as YouTube and Twitter over their content moderation efforts. Many question why Facebook in particular wasn’t able to more quickly detect the video and take it down.