Populism and Media 

This week’s sources centered on the role of the media in constructing the conditions for populist movements, particularly within today’s global network society. I think this week’s focus ties in well with some of the themes explored last week regarding online conspiracy theories. The network society and social media platforms have completely changed messaging, from who it comes from to how quickly it spreads.

As Des Freedman argues in “Populism and Media Policy Failure”, media failures have contributed to the rise of populism: far-right populist politicians and movements have secured high levels of visibility thanks to often complicit media outlets and unregulated digital platforms.

The network society differs from the legacy media outlets that came before it in that it provides a level of interconnectedness the world has never seen before. Anyone and everyone can share information online, whether it is true or not. Legacy media were far more focused on reputation and on providing credible information, whereas big tech companies today are more interested in clicks and profit and are not held accountable for the spread of misinformation on their platforms.

As can be seen with populists like Trump or Le Pen, mass media and online platforms give such figures a reach they would not have had previously, allowing them to connect with far more people. A platform where your ideology can reach thousands in a second, paired with the lack of fact-checking online, is a dangerous combination.

Measures such as the EU’s Digital Services Act show that steps are being taken to combat the spread of false information designed to manipulate people. However, I can’t help but think that the online world has become so complex and so fast-paced that no legislation will be able to keep up with it.

Digital Mayhem Sows the Seeds of Far-Right Youth Extremism in the UK

Jake Rooke

The pandemic of lies online is costing lives. The pandemic has also fuelled anti-social behaviour rooted in extreme radicalization, behaviour that will persist long after the lockdowns. The UK’s youth seem to be a prime target audience.

Parents, teachers and groups such as Hope Not Hate, the Expo Foundation, and the Center for Countering Digital Hate are concerned that British youth returning to school next week have been exposed to extremist content online. The fact is, the far right is exploiting the fluidity and complexity of the online sphere, which has fostered an ever-changing lingo and a glossary of new far-right symbols.

A neo-Nazi teenager who set up FKD GB, a splinter group of the banned National Action, was convicted of terrorism offences in January 2021. He had been radicalized and groomed online through platforms such as Telegram, 8chan (now 8kun) and Gab. He also influenced other youths, including Paul Dunleavy, 17, from Rugby, who was jailed last year for preparing acts of terrorism.

The online radicalization of youth has only ballooned during the pandemic.

Civil society and not-for-profit organizations such as Hope Not Hate have taken the lead in battling online radicalization. The group recently created a guidebook for teachers that focuses on new, internet-driven neo-Nazi symbols, logos and memes, with an appendix of common terms used by the far right online.

The Expo Foundation’s 2021 report, State of Hate: Far-Right Extremism in Europe, described a complex, interwoven web of conspiracies that the far right is promoting online in the UK. These include traditional anti-Semitic conspiracies with a contemporary twist, such as claims that Covid-19 and 5G are Jewish plots. Others are the typical anti-immigrant schemes, such as the claim that the government and the elites are promoting the ‘Great Replacement’ of white Britons, a narrative that spread widely during the Black Lives Matter protests of 2020. Simpler conspiracies include ‘immigrants bring diseases’ and the idea that Britain is being ‘invaded’ by illegal migrants crossing the Channel in small dinghies.

Although we can thank organizations such as Hope Not Hate and the Henry Jackson Society for their commitment to tackling radicalization and terrorism online, the UK government and Big Tech need to step up. The government’s ‘Prevent’ strategy is outdated for the digital era: it focuses on groups instead of individuals and ignores the reality that Big Tech has a large role to play in preventing the hate and radicalization of youth online.

A study by the Center for Countering Digital Hate, an organization powered by young people, indicates that major social media companies are not doing enough to tackle misinformation and radicalization online: of 756 reported examples of misinformation on Facebook, Instagram, and Twitter, only 9.4% were removed. It is quite clear that Facebook, whose stock has risen 23% since the end of 2019, is making off like a bandit, while grassroots groups, civil society, teachers, and parents fight the good fight.

Back in 2019, Sacha Baron Cohen described Facebook as the ‘greatest propaganda machine in history’, arguing that the company, which does not vet political ads for truthfulness, would have allowed Hitler to run propaganda on its platform. Google’s YouTube isn’t much better: it is riddled with far-right ‘stars’ such as Paul Joseph Watson, Stephen Yaxley-Lennon (aka Tommy Robinson) and, recently, two silly blokes, Alan Leggett and Nigel Marcham, who keep watch for migrant dinghies in the Channel. YouTube also allows these individuals to promote their far-right pages and platforms such as Telegram, letting right-wing extremism fester in an echo chamber.

As a result of the lockdowns, British youth have spent much of the last year stuck inside, online, doom-scrolling. When a teacher assigns a project on the Holocaust or a study of Islam, for instance, a student researching online, largely unsupervised, can be exposed to extreme right-wing misinformation. This innocent exploratory process can lead a student to an alt-education, with Holocaust deniers and far-right podcasters pushing extreme radical views onto youth. Social media, search engines and alt-platforms use recommendation algorithms that push impressionable youth down a rabbit hole, increasing their susceptibility to grooming by radicals. Moreover, many young boys, feeling isolated, have turned to the online incel movement, which holds dangerous views on women.

Recommendations include a research initiative by the Commission for Countering Extremism to examine the most effective ways to counter the distribution of extremist content on alt-tech platforms. Additionally, regulators such as the Office of Communications (Ofcom) and the Independent Press Standards Organisation should be reformed and given broader responsibilities.

Coming out of lockdown, it cannot be business as usual. No generation has had such an overabundance of information, nor has any generation in the digital age been subjected to a pandemic and numerous lockdowns. That combination, together with the lack of government oversight online, creates a fertile environment for the radicalization of UK youth.

A Dangerous Game of Hide and Seek: Hate Groups Are Using Social Media as Their New Favourite Hiding Spot

By: Andreea Gustin

We often hear that history has a tendency to repeat itself. As memory fades, events from the past can become events of the present. If recent events are any indication, American society is inching dangerously close to mirroring the events of a century ago, only this time with a modern twist. Technology and digital media have revived the rhetoric of authoritarianism, fascism and populism. But how are they being used to extremists’ advantage?

Last week, the Southern Poverty Law Center (SPLC) released its annual report, which showed that the number of active hate groups in the United States fell by 11 percent in the past year: 838 were recorded in 2020, compared to 940 in 2019. Although it may appear that the number of active hate groups in the U.S. is decreasing, the SPLC attributes the drop to the fact that technology and digital media have made these groups harder to track and more diffuse. In addition, the COVID-19 pandemic has limited in-person activities, which has only further driven hate groups onto digital platforms.

The evolution of smartphones, social media, podcasts and livestreams has made being an extremist a mobile, multimedia experience. Technology has made it easier than ever for extremists to recruit new followers and push their fringe beliefs into the mainstream. This was on full display on January 6, when hundreds of members of white nationalist groups, which had primarily used the internet to organize, stormed the Capitol. Many of them had met online before the event, and their attack on the Capitol showed an alarming capacity for offline violence.

Following this event, social media platforms like Facebook, Twitter and YouTube have all been making a public effort to crack down on extremist content. Despite these efforts, hate groups are now migrating to platforms like Telegram and Signal, which provide little or no content moderation. Neo-Nazis and far-right groups have historically leveraged technological trends to spread hate and organize online. White supremacist groups, for example, turned to early platforms like Stormfront, launched in the 1990s, and later the Daily Stormer to spread white nationalist ideas. This ultimately contributed to the emergence of imageboards, memes and “trolling”: all elements we still see online today.

The problem here is not only understanding how these hate groups are using technology and digital media. It is also a matter of understanding what this means for our future as it relates to our past. As we’ve increasingly seen over the past four years, the alt-right’s racist messaging, white nationalist underpinnings, and anti-immigrant and anti-Semitic sentiments are no longer showing up only in the streets as they once did. Social media has created channels for neo-Nazis and extremist hate groups to organize and manipulate information to their advantage.

Recent demonstrations of extreme nationalism and the threats posed to American democracy are drawing comparisons to a dark past. Although certain historical themes of nationalism and authoritarianism are coming up in today’s conversations, many do not understand the alarming power of technology in the current circumstances. History may very well repeat itself, but are we prepared to deal with elements of the past in today’s faceless digital world? 

It’s easy to make comparisons to the past, but it’s harder to accept that we are no longer living in that same world. Technological advancements and social media have created new challenges and obstacles to tracking hate groups and holding those involved accountable. The methods once used to combat dangerous nationalist efforts are no longer applicable to domestic online extremism.

It is only natural for us as humans to attempt to understand modern issues by applying the lens of the past. However, there needs to be a greater understanding of how fascist and nationalist ideologies have developed over time and of the role technology plays in these developments. Ultimately, it’s important for us to understand how our modern issues can differ from those of the past and how this can lead to new consequences not outlined by history.

References

The Associated Press. (2021, February 1). Report: Hate groups in decline, migrate to online networks. NBC News. Retrieved February 10, 2021, from https://www.nbcnews.com/feature/nbc-out/report-hate-groups-decline-migrate-online-networks-n1256356

Bensinger, G. (2021, January 13). Now social media grows a conscience? The New York Times. Retrieved February 10, 2021, from https://www.nytimes.com/2021/01/13/opinion/capitol-attack-twitter-facebook.html

Hatewatch Staff. (2019, September 18). Daily Stormer website goes dark amid chaos. Southern Poverty Law Center. Retrieved February 10, 2021, from https://www.splcenter.org/fighting-hate/extremist-files/group/stormfront

Janik, R., & Hankes, K. (2021, February 1). The year in hate and extremism 2020. Southern Poverty Law Center. Retrieved February 10, 2021, from https://www.splcenter.org/news/2021/02/01/year-hate-2020

Jimenez, C. (2021, January 20). Far-right extremists on social media aren’t going away — they’re hunkering down. Colorado Public Radio. Retrieved February 10, 2021, from https://www.cpr.org/2021/01/20/far-right-extremists-on-social-media-arent-going-away-theyre-hunkering-down/

McEvoy, J. (2021, January 7). Capitol attack was planned openly online for weeks—police still weren’t ready. Forbes. Retrieved February 10, 2021, from https://www.forbes.com/sites/jemimamcevoy/2021/01/07/capitol-attack-was-planned-openly-online-for-weeks-police-still-werent-ready/?sh=622babfb76e2

Molla, R. (2021, January 15). What is Signal, and why is everybody downloading it right now? Vox. Retrieved February 10, 2021, from https://www.vox.com/recode/22226618/what-is-signal-whatsapp-telegram-download-encrypted-messaging

Molla, R. (2021, January 20). Why right-wing extremists’ favorite new platform is so dangerous. Vox. Retrieved February 10, 2021, from https://www.vox.com/recode/22238755/telegram-messaging-social-media-extremists

Stormfront extremist group info. (n.d.). Southern Poverty Law Center. Retrieved February 10, 2021, from https://www.splcenter.org/fighting-hate/extremist-files/group/stormfront