Still, Stalinsky said he comes down on the “take it down” side when it comes to violent extremism. His perspective was forged by years of monitoring activity by Islamic terrorists. The Islamic State, in particular, became notorious for its strategic use of social media in the 2010s. “This might sound crazy, but if it were not for Twitter, ISIS would not have been ISIS,” he said. “They used it so effectively for recruitment, for spreading their ideology, for growing.” After the 2014 murder of the American journalist James Foley, Stalinsky said, Twitter took the problem seriously and largely purged ISIS from its platform. Banished from major social media networks, the group migrated to Telegram and other chat apps.
Platforms have been criticized for years for treating white nationalism more leniently than Islamic extremism. To the extent that right-wing domestic terrorists use social media for recruitment, however, the last-minute moves announced in the past week are probably too late to have any impact on violence surrounding the inauguration. Recruitment, such as it is, has been going on for years. YouTube has been shown to make it easier for communities to form around radical right-wing viewpoints; Facebook’s recommendation algorithms have notoriously steered people into more extreme groups. It’s also tricky to analogize the Capitol rioters directly to ISIS. The mob was an ad hoc alliance aimed at a particular, immediate goal—keeping Trump in office—rather than an ideological organization with fixed long-term ambitions. While some rioters appear to belong to organized militias and white supremacist groups, many tributaries feed the “Stop the steal” river, including QAnon adherents, who are not inherently organized around violence, and people who simply believe Trump’s claims that the country is being stolen from them and feel motivated to act.
Indeed, providing a forum for lies about the election is probably the most important way in which social media platforms have contributed to the current atmosphere of political violence, and it’s also the one that is most obviously too late for any quick fix. Facebook and YouTube are shutting down accounts that repeat lies about a stolen election, but at this point tens of millions of Americans already believe those false claims. For the companies to have made a difference here, they would have had to start a lot earlier.
To be fair, in some ways they did start earlier. (Much less so YouTube, which tends to get away with being less aggressive about disinformation.) In the months leading up to and following the election, the companies made unprecedented efforts to steer users to accurate information and apply fact-checking labels to claims of electoral fraud. Those moves don’t seem to have been effective, but one can understand why the companies were hesitant to start taking down every post disputing the election results. It’s untenable for a platform of any real scale to police all false content—especially when it comes to politics, which is all about trying to convince voters to accept a certain version of reality. In an era of intense polarization, it isn’t always clear which lies will be the ones to spark violence until it happens.
It’s a mistake, however, to analyze social media’s culpability solely in terms of a binary decision to take something down or leave it up. The effect these companies have on discourse is much more deeply woven into their basic design, which prioritizes engagement above all else. To understand one way in which this plays out, I highly recommend a recent New York Times article by Stuart A. Thompson and Charlie Warzel. They analyzed public Facebook posts from three far-right users, including one who was part of the crowd outside the Capitol on January 6. All three, the authors found, started out posting normal stuff, to limited reaction. Once they shifted to extreme posts—whether it was encouraging “Stop the steal” protests, Covid denialism, or spreading false claims about rigged ballot counts—their engagement skyrocketed: more likes, more comments, more shares. More attention.