Removing large-scale networks of accounts has a different, but no less significant, effect. Changes to the information ecosystem reduce the amplification power of these groups; removing the networked faction of QAnon accounts ahead of the election is notable because it is a significant node in the new MAGA coalition. Without this network of superspreaders on Twitter, coordinating the manipulation of search engines and trending algorithms will be more difficult.
But even if these actions succeed in reducing the spread of conspiracy theories, they reveal the twin problems facing online platforms: some speech is damaging to society, and the design of social-media systems can compound the harms.
All these interventions come as Amazon, Apple, Google, and Facebook have been asked to testify in front of the House Judiciary Antitrust Subcommittee. The hearing–now delayed until Wednesday–is part of a series exploring “Online Platforms and Market Power” and will call Jeff Bezos of Amazon, Tim Cook of Apple, Sundar Pichai of Google, and Mark Zuckerberg of Facebook.
Republicans have sought to invite others, including Twitter CEO Jack Dorsey, as well as an outlier: John Matze, the founder of the right-wing app Parler. Parler has built its brand on the claim that Twitter censors conservatives, and it recently went on a sprint to recruit Republican politicians. In July, Matze was a guest on a podcast that routinely features white nationalist and misogynist content and that had been banned from YouTube in 2018 for hate speech. During the interview, Matze expressed pride that he provides a platform for those who have been removed from other services, such as Laura Loomer, Milo Yiannopoulos, and Jacob Wohl. On Parler, these figures have their content served alongside contributions from Republican politicians such as Rand Paul, Ted Cruz, and Matt Gaetz, among others.
Research by my colleagues and me on the development of another app, Gab, illustrates the serious limitations of minor apps that provide alternative infrastructure for communities trafficking in hate speech. Gab gained limited popularity by promoting itself as a safe haven for “free speech” following the white supremacist violence in Charlottesville, Virginia.