By: Willem Nesbitt
This week’s readings and discussion on the rise of QAnon focused on how and why such a conspiracy theory was able to emerge and propagate across the globe. The ongoing pandemic is an obvious culprit; as Oxford researcher Jonathan Bright points out, “People are spending even more time online, so have more time to come across anti-vaccine and other conspiracy content.”
In past weeks, our class discussed how nationalist and right-wing movements have, somewhat ironically, managed to transcend borders, and the internet is certainly the leading vehicle giving those groups, and now conspiratorial movements, their wider reach. The Q posts began quietly on 4chan’s /pol/ board, where, whether ironically or unironically, the central Q poster and their adherents were fostered and promoted before their message spread to the more mainstream realms of YouTube, Reddit, and Facebook.
With 30% of surveyed Republican voters believing in the core tenets of the QAnon conspiracy (and an alarming 43% being “uncertain”), it is clear that the internet has helped spread this conspiracy far beyond the confines of imageboards. This raises a question: do social media sites, whether Twitter, Facebook, YouTube, or others, have a responsibility to curtail and remove posts promoting conspiracies such as QAnon? Following the temporary shutdown of the right-wing social media site Parler earlier this year, a debate erupted over the idea of “free speech” on the internet, and many on the left believe these sites are still not doing enough to prevent the spread of such conspiracies.