NYTimes | On Wednesday, one week after the school shooting in Parkland, Fla., Facebook and YouTube vowed to crack down on the trolls.
Thousands of posts and videos had popped up on the sites, falsely claiming that survivors of the shooting were paid actors or part of various conspiracy theories. Facebook called the posts “abhorrent.” YouTube, which is owned by Google, said it needed to do better. Both promised to remove the content.
The companies have since aggressively pulled down many posts and videos and reduced the visibility of others. Yet on Friday, spot searches of the sites revealed that the noxious content was far from eradicated.
On Facebook and Instagram, which is owned by Facebook, searches for the hashtag #crisisactor, which accused the Parkland survivors of being actors, turned up hundreds of posts perpetuating the falsehood (though some also criticized the conspiracy theory). Many of the posts had been tweaked ever so slightly — for example, videos had been renamed #propaganda rather than #hoax — to evade automated detection. And on YouTube, while many of the conspiracy videos claiming that the students were actors had been taken down, other videos that claimed the shooting had been a hoax remained rife.
Facebook faced renewed criticism on Friday after it was revealed that the company had showcased a virtual reality shooting game at the Conservative Political Action Conference this week. Facebook said it was removing the game from its demonstration of new virtual reality products.
The resilience of misinformation, despite efforts by the tech behemoths to eliminate it, has become a real-time case study of how the companies are constantly a step behind in stamping out the content. At every turn, trolls, conspiracy theorists and others have proved to be more adept at taking advantage of exactly what the sites were created to do — encourage people to post almost anything they want — than the companies are at catching them.