Disinformation is (Fragile) Power: Hate Online and How To Fight It

03 05 19

Politics today can feel as though it has reached saturation point when it comes to the spread of falsehoods online. From fake news spread on social media to influence electorates, to the surge in conspiratorial narratives, the temptation can be to tune out of the web entirely.

Though understandable (and, no doubt, people must take breaks!), this response is ultimately neither feasible nor wise if we are to confront the spread of hate. Indeed, such despondency can lead us to overlook the fact that we are better equipped to fight back than ever. Disinformation (false information spread deliberately) plays a role at every step in the spread of hate today. This is not new, of course: propaganda has long been part of political tactics. Yet its current centrality is hard to ignore.

Moreover, it plays a crucial role from beginning to end. From radicalisation through to action, falsehoods lurk, offering people explanations which they may come to believe and, at the extremes, commit violence to spread further. Three recent stories demonstrate this. Yet, on closer inspection, while they illustrate how hate propagates, they also point to routes for anti-fascist intervention which, taken together, can have a significant effect on stemming the spread of hateful false information.

Alt-Right Pipelines

Online disinformation, and the rabbit holes it can open up, is central to how many are now drawn into the far right, as one man’s recent online confession highlighted. His story also, however, tells us something about how encountering the truth from a voice you feel speaks to you can help block these radicalising pipelines.

Caleb, AKA ‘Faraday Speaks’, is a twentysomething man from West Virginia who gained attention online recently after uploading a video detailing how he was slowly drawn towards alt-right ideas from 2014 and began to reject them in 2018. Careful consideration of people’s sincerity is always necessary in these circumstances, but Caleb has stated that he understands if people are wary and accepts it will be some time before he has rooted out all of the propaganda he has absorbed. What is important here is that he has begun to document in detail what caused him to come to believe far-right ideas, and has shared this to help others understand what is needed to stem the problem.

Whilst he points to multiple factors in his personal life that are no less vital to understanding how he, or anyone else, could become more prone to radicalisation, he draws particular attention to how seeking out advice on social media led him, via algorithmic suggestions, to slowly accept progressively more extreme ideas.

Caleb describes how in 2014 he went back to browsing YouTube for hours, something which had been comforting for him in high school, but this time using the site to try and address mental health issues he had developed after dropping out of college. Amongst regular self-help videos he came across the far-right vlogger Stefan Molyneux, in a video that Caleb recalls as addressing depression and self-help. Molyneux, whose community of supporters has been described as a “cult” and who has encouraged a practice of family estrangement, has made multiple videos on mental health. These include videos perfectly tailored to someone in Caleb’s situation at the time, such as ‘Depressed in College? Here’s Why!’ and ‘Are you lonely?’ Caleb explains how, through Molyneux, he entered a pipeline: engaging initially with the vlogger’s political and social commentary and his interviews with extreme figures, before moving on to ever more extreme vloggers who began to appear as guests on Molyneux’s show or as suggestions in the sidebar.

A network map of guest appearances from Data & Society’s Rebecca Lewis, whose Alternative Influence report investigated the networks which draw people into extreme, far-right content

Caleb’s story is not unique but, in its account of being led down online rabbit holes, it encapsulates what now appears to be the trajectory of contemporary far-right radicalisation for many. He draws attention to how people are slowly inculcated into this worldview through the drip-feeding of hate, conspiracy theory and disinformation online from a variety of actors who may have different interests (anti-Muslim politics, anti-feminism, anti-immigrant politics, etc.) but who overlap enough with one another to draw people towards each other’s views. As Caleb describes it, “[The alt-right] is a decentralised cult, that sucks you in […] There’s no formal leadership […] [but there are] so many people down the layers of this.”

This illuminates an important feature of this process, which may be less prominent in certain traditional radicalisation experiences: its distinctly fragmented nature. Caleb notes how you “take one piece” of this decentralised hate, and it will lead you into other parts: “This is how you become radicalised. You don’t become radicalised because some nazi walks up to you in the street”, rather they (that is, this broad online network) convince you through a “pseudo-rationality” and prey upon your “biases” and “circumstances” and “identity”.

However, in an interview with the site It’s Going Down, Caleb explained how he believes the tide can be turned. Vloggers on YouTube, such as Destiny and ContraPoints, who make videos debunking alt-right ideas, are what led him to realise he had been fed masses of disinformation. As he crucially notes, however, such figures have been playing catch-up. The ‘edgy’ trolling culture associated with the likes of 4chan, but also deeply embedded in parts of gaming culture, slowly normalised exposure to extreme imagery, language and ideas for a generation.

This was perfect for the far right to capitalise on in their efforts to promote reactionary politics to young people online looking for transgression in any form. Alongside its fragmented nature, this exploitation of certain youth cultures is a distinctive feature of much contemporary far-right radicalisation. Nonetheless, there is nothing predetermined about this. Progressive voices online (who perhaps, in an indictment of where things have got to, are transgressive in their rejection of hate) can gain ground. Importantly, as Caleb notes, the particular progressive figures that got through to him were those who understood how to communicate their positive messages to people who have grown up through internet culture, and this is why he listened to them; as he put it, they “spoke [his] language”.
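As an aside on the network map of guest appearances referenced above: the short sketch below is a hypothetical illustration only (the channel names and links are invented, and it is not Lewis’s data or method) of how such a guest-appearance network can be represented and queried to see which channels sit at its centre.

```python
# Hypothetical sketch of a "guest appearance" network of the kind mapped in
# the Alternative Influence report. Channel names and edges are placeholders.
import networkx as nx

# Each edge means "these two channels shared a guest appearance".
appearances = [
    ("channel_a", "channel_b"),
    ("channel_a", "channel_c"),
    ("channel_b", "channel_d"),
    ("channel_c", "channel_d"),
    ("channel_d", "channel_e"),
]

graph = nx.Graph()
graph.add_edges_from(appearances)

# Degree centrality gives a rough sense of which channels act as hubs,
# bridging audiences towards ever more extreme content.
centrality = nx.degree_centrality(graph)
for channel, score in sorted(centrality.items(), key=lambda item: -item[1]):
    print(f"{channel}: {score:.2f}")
```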

Wiki Wars

Those who get drawn into this online world of hateful propaganda may take the further step of becoming involved in organised far-right politics, a realm where disinformation remains central. In particular, as far-right groups grow and reach larger, newer audiences, they must commit more time and energy to ensuring that the information available about them presents them in a less extreme light if they are to continue to attract new supporters. The recent, concerted efforts of a US far-right group to control their presentation on Wikipedia exemplify this.

Discord chat leaks published by the non-profit media organisation Unicorn Riot have revealed that the white nationalist group American Identity Movement (AIM), previously ‘Identity Evropa’ (IE), have, within the last year at least, been engaged in a coordinated ‘Wikipedia Project’ to regularly edit their page on the site so that it presents them more favourably when people first look them up. The effort has been led by a member with the Discord moniker ‘@Steve – NJ’. As he told others in July 2018:

Hi all – as Matthias mentioned in the recent announcement, we could always use an extra hand from people willing to put some time into building a Wikipedia account to help with maintaining the IE page. Our Wikipedia page is one of the first things that come up on a Google search, and first impressions are always very important.

Currently I am fighting to get the “Neo-Nazi” descriptors off our page, so it won’t appear as it currently does in the picture below:

I am making a lot of progress – but the more established accounts I have supporting my case, the easier it is to make things work out in our favor. Send me a DM if you are interested.

The importance of this for AIM/IE was highlighted by a post to all members from the group’s ‘Chief of Staff’, Matthew Robert Warner (AKA ‘Matthias’) in the same month:

Attention @everyone

I’d like to remind everyone, and notify new members, that there are many ways to get involved in IE behind the scenes that may not be apparent right away. For one, many of you have requested that we fight to correct our Wikipedia page. We have made progress here, but require more editors who are willing to bolster their profiles across Wikipedia.

Furthermore, in August the user ‘Prestor John’ reiterated that the group’s leader, Patrick Casey (AKA ‘Reinhard Wolff’), had made this a “top priority”.

Nonetheless, AIM/IE recognise that this is a long-term project that requires investment of time. As Warner noted in October 2018, “Mass editing [their] page, or any controversial page, will just result in it getting locked” and instead “If you want to see the first and most common source on IE improved, you will have to […] develop a little trust and credibility in the editorial community of Wikipedia and then win disputes”. Indeed, this effort is enough of a priority for AIM/IE that in March of this year Casey stated that, in addition to a fundraising campaign for doxxed members, the organisation’s other fundraising campaign was the “Wikipedia Project”. As Unicorn Riot note, the group even discussed the possibility of investing funds into paying people to edit their Wiki page.

Whilst the extent and nature of AIM/IE’s editing of their page is unclear at present, what is clear is that getting the name of their Wikipedia page changed from ‘Identity Evropa’ to ‘American Identity Movement’ is crucial for them. The recent discussion between users editing the AIM/IE Wikipedia page shows considerable dispute over which is the correct name to use for the page. Whilst we cannot be sure whether those editing the page to remove the association between the two names are from the far-right group, it cannot be ruled out.

Members of Identity Evropa at the Unite the Right rally, 2017

Moreover, the group are, no doubt, keen to distance themselves from their original name. As the Southern Poverty Law Center note in relation to the rebranding, Identity Evropa’s involvement in the 2017 Unite the Right rally in Charlottesville, which saw the death of anti-fascist activist Heather Heyer, has left them with violent associations and legal issues. The Unicorn Riot leaks have confirmed that, in practice, it is the same organisation, however. They reveal how members’ dues have carried over to the “new” organisation, and that when members asked on their Discord, “do we need to apply for AIM if we were already in IE?”, they were told: “Nope, if you want to leave, you can just let us know, otherwise we’re gonna assume you’re on board.” Given this, any concerted effort by AIM/IE’s members to remove the association between the two names online would be a clear act of disinformation.

As with alt-right pipelines online, what we can take from the torrent of media manipulation engaged in by the far right in recent years is a deeper understanding not only of their tactics and how to counteract them, but also of how important it is for the far right today to control the information people are exposed to when they look them up. Drawing links between these actors and editors on Wikipedia, and campaigning against their coordinated disinformation, is a vital way to block the spread of their ideas.
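For those wanting to examine this kind of activity themselves, the sketch below is one possible starting point rather than any tool used in this research: it pulls a page’s recent edit history from Wikipedia’s public MediaWiki API, so that patterns such as clusters of new accounts or repeated renaming attempts can be looked at. The page title is only an example.

```python
# Rough sketch: fetch a Wikipedia page's recent revision history via the
# public MediaWiki API, as a starting point for spotting possible
# coordinated editing. The page title below is only an example.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def recent_revisions(title, limit=50):
    """Return (timestamp, user, edit summary) tuples for a page's latest edits."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    revisions = next(iter(pages.values())).get("revisions", [])
    return [(r["timestamp"], r["user"], r.get("comment", "")) for r in revisions]

if __name__ == "__main__":
    for timestamp, user, comment in recent_revisions("American Identity Movement"):
        print(timestamp, user, comment)
```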

Dangerous Data Voids

In addition to radicalisation into the far right and far-right PR efforts, disinformation also plays a role at the extremes of far-right political action, as the recent Christchurch mosque attack showed. As has been widely reported, it is evident that the attacker planned and engaged in a concerted effort to maximise media coverage of the ideology motivating his attacks. It was a “terrorist attack conceived for the internet era”, as Foreign Policy’s Elias Groll described it, precisely because the intention was not only to cause harm and raise fears, but to encourage the spread of a conspiratorial, false theory online in the aftermath.

As my colleague, Patrik Hermansson, wrote following the attack, “It’s important to understand that far-right terror doesn’t end when the last bullet has been fired. The ensuing media coverage and virally replicated memes are also often part of the perpetrator’s plan to sow division and hatred.” Patrik also drew attention to the role played in this by online ‘data voids’, a term coined by danah boyd and Michael Golebiewski in 2018. As they describe in their primer on the subject, a data void is a search term for which “the available relevant data is limited, non-existent, or deeply problematic.” In this case, most prominently, the attacker attempted to get people searching for ‘The Great Replacement’, a conspiracy theory that white people are being ‘replaced’ by non-white people, especially Muslims.

A look at the phrase’s worldwide Google search interest from 2012 to now shows how searches immediately following the attack, and the subsequent spread of the shooter’s manifesto online, resulted in its peak search popularity to date.
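For anyone wanting to check this pattern for themselves, the following is a rough sketch using pytrends, an unofficial Python client for Google Trends (one option among several; the timeframe is simply an example covering the period discussed here).

```python
# Rough sketch: check worldwide search interest for the phrase over time
# using pytrends, an unofficial Google Trends client. Timeframe is an example.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["The Great Replacement"], timeframe="2012-01-01 2019-05-01")

interest = pytrends.interest_over_time()  # pandas DataFrame indexed by date
print(interest["The Great Replacement"].idxmax())  # date of peak interest
print(interest["The Great Replacement"].max())     # peak value (scaled 0-100)
```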

Though the phrase originated in a 2012 book by the French writer Renaud Camus, who was already well known in France, and had gained traction slowly in parts of the European far right, it had not broken through to considerable attention in the English-speaking world. The potential advantage for the far right of capturing and manipulating a void in public knowledge could not be clearer, therefore. Yet it also shows just how much they depend on those who can debunk their lies not filling these voids with the truth. As Patrik noted in the aforementioned article, whilst not reporting on these ideas might at first seem the best response, in today’s online world, where information (truthful or otherwise) is easily accessible, ceding “the space where far-right ideas could be contextualised and critiqued to someone less competent or the far-right themselves” will likely result in “a much more favourable, and dangerous, take on violent ideas.”

Disinformation is (Fragile) Power

As I noted at the start, it has unfortunately become all too easy to grow acclimatised to the spread of hateful, false information. Just recently, a British Conservative MP used the term ‘Cultural Marxism’, a popular far-right dog-whistle with deeply antisemitic roots. Whilst there was a swift response condemning this, the fact that it was not a bigger story highlights just how used we have become to the mainstreaming of hateful ideas, terms and disinformation. (The worrying speed of normalisation around the ‘replacement’ narrative in the US was similarly highlighted by Salon’s Amanda Marcotte following the recent Poway synagogue shooting, whose perpetrator subscribed to the conspiracy.)

This despondency, however, can lead us to overlook the fact that we know much more now about how to fight back, and I hope the three cases addressed above highlight this. Through small, personal efforts, and especially through collective ones, we can intervene to stop the spread of hateful disinformation online.

This can involve blocking radicalisation pipelines on YouTube and elsewhere by campaigning for the removal of figures creating and profiting from these lies, and by supporting progressive, accurate alternatives to them who understand online cultures. It can also involve ensuring that Wikipedia, the fifth most popular site on the web and the most popular online encyclopedia, is not being manipulated in a concerted way, by organising and taking part in Wikipedia ‘edit-a-thons’ and by reporting evidence of editors being recruited to bias entries (as has happened with the Identity Evropa/American Identity Movement Wikipedia page).

Where, in the worst-case scenario, a data void filled only with the far right’s disinformation has been allowed to grow online, by virtue of their voices being given a platform for too long, crowding out their lies with accurate explanations which place their propaganda in context can undermine its further spread. As danah boyd highlighted in a recent talk, it is not enough just to put accurate information out there; we have to “understand the networked nature of the information war we’re in” and “actively be there when people are looking”.

Disinformation can be a powerful strategy for the far right, but it is a fragile power to wield. It depends on those of us fighting back lacking an understanding of how they manufacture and spread hateful lies in new ways. Whilst we should never rest on our laurels, there are many lessons we can now learn to fight back more effectively.

Simon Murdoch
https://twitter.com/SimonMurdochHNH
