Conspiracy theories thrive on social media. Inevitably contentious by nature, they attract attention from both believers and those determined to debunk them. Then there are those who simply find them entertaining, consuming and sharing conspiracy theory content as they would anything else they find engaging. One of the world’s most famous conspiracy theorists, Alex Jones, the editor of InfoWars, is an eccentric character whose rants and unbelievable theories provide a regular stream of shareable content. A quick search on his name turns up multiple articles from reputable sources with headlines such as “Alex Jones’ 5 most disturbing and ridiculous conspiracy theories”.
Conspiracy theories and those who spread them are, in other words, often well-suited to the age of social media. The side effect is that characters like Alex Jones are given much bigger platforms than they should have and, with them, the ability to reach ardent believers. All of the large social media platforms aim to maximise the time users spend on them, leading techno-sociologist Zeynep Tufekci to conclude that attention is “the crucial resource of the digital economy”. Algorithms mediate our communication on social media platforms and prioritise attention-grabbing content, even though this type of content is often in direct conflict with accuracy.
Conspiracy theory-oriented groups and channels are easily found on most of the major social media platforms, and many host incredibly active communities with hundreds of posts per day from all over the world, discussing and sharing new ideas in the world of conspiracy theory. In recent weeks, conspiracy theory groups related to the ongoing pandemic have attracted a significant number of new members on Facebook. The largest 5G conspiracy group in the UK, STOP 5G UK, added almost three thousand members in just 24 hours between the 6th and 7th of April and had almost 60,000 members prior to its deletion by Facebook.
A related conspiracy theory, that the coronavirus is actually a hoax, became the centre of a #FilmYourHospital campaign that urged people to film outside their local hospitals. The intent was to use the apparent lack of activity as proof that the coronavirus was not in fact real, neglecting that much other routine activity at hospitals has been suspended and that care for patients does not generally take place in hospital car parks. The resulting videos of nothing happening have been widely shared among groups sceptical of the current pandemic.
HALF-WAY IS NOT ENOUGH
It is true that mainstream news outlets have on occasion provided a platform for conspiracy theorists and allowed them to amplify their conspiratorial worldview. David Icke is an example of a conspiracy theorist who has been featured on outlets such as Talkradio multiple times in recent years. As late as last week, he appeared in a one-hour-and-45-minute interview on the Freeview channel London Live, where he suggested the coronavirus pandemic is a hoax. Nevertheless, by and large the gatekeepers and editorial standards of traditional outlets have blocked these routes for conspiracy sowers, and so it is social media outlets that have given them a platform to an even greater extent.
Social media outlets provide a way for even the most obscure and harmful conspiracy theories to find a platform. British far-right conspiracist Paul Joseph Watson, for example, has spread harmful ideas about Muslims and the Sandy Hook mass shooting, and was one of the individuals who popularised the idea that the 2019 bushfires in Australia were predominantly caused by people intentionally lighting fires. He has repeatedly denied climate change, suggesting that global warming is not caused by humans. Watson has 1.8 million subscribers on YouTube and 1.1 million followers on Twitter.
The large following of a character like Watson is an indication that the steps taken by social media platforms to tackle misinformation are far from enough. Their moderation practices let a significant amount of content through and fail to recognise the harm caused by conspiracy theories. Although YouTube, for example, has made attempts to limit the spread of conspiracy theories, such as modifying its algorithm to avoid recommending the most extreme videos and supplying links to accurate information from Wikipedia, the videos often remain on the platform. The attention-grabbing nature of conspiracy ideas means they are still shared extensively in the comments section and on other sites.
The pandemic has, however, provided an indication that the platforms themselves are aware that their current moderation practices are not enough, and that a hands-off approach to the spread of information has its limitations. Searches on Google for “coronavirus” now present a curated results page with information from health authorities rather than an algorithmically generated list of search results and ads. Twitter, Facebook, Instagram and YouTube are also showing modified search results, banners directing users to public health advice next to content related to the virus, and prompts for users to socially distance and seek advice from relevant authorities.
DIRECT MESSAGING APPS
Although debate often focuses on platforms such as Twitter, Facebook and YouTube as the main purveyors of misinformation, a sometimes-overlooked factor in the UK is the role of messaging apps. The reason is simple: platforms such as YouTube and Twitter are effective broadcasting platforms where a message can reach large audiences at no cost to the broadcaster. This is how British conspiracy theorists like Paul Joseph Watson have garnered a large following.
WhatsApp is one of the most used social media apps in the UK but works quite differently from the previously mentioned platforms. The largest group size supported on WhatsApp is 256 people, compared to the unlimited follower counts a user can have on YouTube and Twitter or the group sizes possible on Facebook. It is not a broadcasting platform but an intimate communications platform for people you likely already know or have some connection to in real life: family members, colleagues, a sports club, neighbours or people who share your hobby.
Despite this, WhatsApp has in recent years been used to spread conspiracy theories and misinformation. In India, the platform has been the subject of an investigation after viral messages spread across private groups, accusing individuals and groups of abducting and harming children. In July 2018, five people were killed in Maharashtra after a message identified them as alleged child kidnappers in the village of Rainpada, motivating a mob to attack and murder them in broad daylight. The messages had been spread by people manually forwarding the original message through many, relatively small, WhatsApp groups. Following similar attacks, WhatsApp limited the ability to forward messages in the app to no more than 5 groups in India and 20 in the rest of the world.
On April 7, WhatsApp announced that it would impose even stricter limits to stem the spread of conspiracy theories related to the coronavirus. Users worldwide can now forward a message to no more than one group at a time.
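The mechanics of such a limit can be sketched in code. The following is a minimal illustrative sketch, not WhatsApp’s actual implementation: the class, method names and the threshold at which a message counts as “frequently forwarded” are all invented assumptions for the example.

```python
# Hypothetical sketch of a per-message forwarding limit. All names and
# thresholds here are illustrative assumptions, not WhatsApp's real code.

class ForwardLimiter:
    def __init__(self, max_targets_per_forward=1, viral_threshold=5):
        # A message that has already been forwarded viral_threshold or more
        # times is treated as "frequently forwarded" and capped strictly.
        self.max_targets = max_targets_per_forward
        self.viral_threshold = viral_threshold
        self.forward_counts = {}  # message_id -> number of recorded forwards

    def can_forward(self, message_id, targets):
        # Allow the forward unless the message is frequently forwarded
        # and the user is trying to send it to too many chats at once.
        hops = self.forward_counts.get(message_id, 0)
        if hops >= self.viral_threshold:
            return len(targets) <= self.max_targets
        return True

    def record_forward(self, message_id):
        self.forward_counts[message_id] = self.forward_counts.get(message_id, 0) + 1


limiter = ForwardLimiter()
for _ in range(5):
    limiter.record_forward("msg-1")  # message becomes frequently forwarded

print(limiter.can_forward("msg-1", ["group-a"]))             # one target: allowed
print(limiter.can_forward("msg-1", ["group-a", "group-b"]))  # two targets: blocked
```

The design choice worth noting is that the limit is attached to the message’s forwarding history rather than to the user, which is why a heavily circulated message is throttled even for someone forwarding it for the first time.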
On WhatsApp, users have a different form of relationship from that of subscriber and producer on a platform like YouTube, even though every platform allows two-way communication through direct messages and comments. The subscriber-producer relationship is fundamentally unequal in that a creator with a large number of followers might receive hundreds or thousands of messages from subscribers. On WhatsApp, most users already know each other and generally trust each other to a higher degree, which makes their influence on one another more potent.
Unfortunately, it is incredibly hard to counter conspiracy theories online. Issues relating to the way platforms function and what content they recommend intersect with issues of tech literacy and issues in the offline world. While many major platforms have to some degree attempted to respond to false information and conspiracy theories, the structure of the platforms continues to reward the spread of information rather than its quality, contributing to an environment where the difference between true and false is increasingly hard to determine.
Tech literacy is therefore also a relevant issue. We are often naive on platforms that we do not completely understand, expecting them to work much like the offline world when they often do not. It is harder to know the intention of someone online. The effect is that it is easy to share something false, and it is difficult to determine the right approach to arguing against ideas online, as people might sincerely believe what they share, or might share it simply to upset and cause outrage. Without knowledge of intent, a response could mistakenly serve to entrench the idea further, or give it unnecessary attention.