We the people — we the conspiracy disinformation campaign.
Note: See a mistake or inaccuracy? Let me know!
Definitions
What are conspiracy theories?
According to Britannica, a Conspiracy Theory is an attempt to explain harmful or tragic events as a result of the actions of a small, powerful group. Such explanations reject the accepted narrative surrounding those events; indeed, the official version may be seen as further proof of the conspiracy.
Cambridge Dictionary defines a conspiracy theorist as someone who believes in a conspiracy theory, the idea that an event or situation is the result of a secret plan made by powerful people.
The more digestible Urban Dictionary definition describes a conspiracy theorist as “a person who believes that he holds the ‘absolute truth’ and all those around who disagree with him are either naive, oblivious to the ‘truth’ or just bought off by the clandestine organizations who pull the ‘strings of humanity’ and manipulate events in order to achieve world domination…supreme majority of the conspiracists believe many things that are pure fiction and are more than willing to eat up everything that fits their story, even when that’s debunked.”
These theories have been around for centuries, and when we hear of them, we most often think of flat-earthers or moon landing deniers. For those detached from this sort of thinking, the theories seem funny, absurd and even entertaining. Those types of conspiracies never really have any direct impact on our society or our overall well-being. But over the past year, we’ve noticed a drastic shift. Today’s conspiracies are causing more and worse disturbances, and even violence, offline.
Algorithms of conspiracy-disinformation — how they spread and why the big guys of tech aren’t solely to blame
Since the beginning of 2020, conspiracies have grown into what I call micro-conspiracies. If only the micro reflected their size. More accurately, the micro reflects the targeted, local approach of these theories, which, once spread across communities, are no longer mere theories. They are full-fledged news headlines, spokesperson quotes, analytical articles, broadcast conversations… they are the disinformation pieces we’ve been talking about for years. Only now they are detailed and catered to each individual through our personalized algorithms: a machine that, although it doesn’t have a mind of its own, is built to “solve problems”. A problem is solved when we engage with the proposed solution, regardless of whether that solution is accurate.
Example: the problem begins when you engage with Google’s algorithm by looking up “fall boots”, and suddenly your Instagram feed starts showing you paid ads for all sorts of boots. You then clarify that it’s boots from Steve Madden you’re after by liking one of their posts, and those show up too. You make a purchase (whether through an ad or directly from a site) and voila: a very simplified version of how the problem gets solved. You’ve engaged with the solution, and the algorithms move on to the next problem. Now, apply this problem-solving method to information management.
You get frustrated with the current state of your city and Google “is COVID-19 real”. The problem is introduced and you are placed into Google’s category of COVID-information seekers. Most likely, you will subconsciously start looking for information that reinforces your suspicion that the virus is, or is not, real. Once you find that piece (whether it’s a Reddit thread or an article), the algorithms make note of the source you’ve engaged with (putting you into categories like conservative, liberal, leftist, right-leaning, etc.) and begin to feed you similar pieces to continue “solving your problem”. Suddenly, the next time you search “COVID-19” on YouTube, the top National Geographic videos on how viruses come to be are replaced on your page with “what are they hiding” vloggers.
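To make that feedback loop concrete, here is a minimal sketch in Python, assuming an invented NaiveFeed class with made-up category names, item tags and weights (it does not represent any real platform’s code), of how engagement-only ranking narrows a feed:

```python
from collections import defaultdict

# Hypothetical engagement-driven recommender: every interaction nudges the
# user's interest profile toward the categories of the content they engaged with.
class NaiveFeed:
    def __init__(self):
        self.interest = defaultdict(float)  # category -> affinity score

    def record_engagement(self, categories, weight=1.0):
        """A click, like or long watch boosts every category the item is tagged with."""
        for category in categories:
            self.interest[category] += weight

    def rank(self, candidates):
        """Order candidate items purely by affinity; accuracy is never part of the score."""
        return sorted(
            candidates,
            key=lambda item: sum(self.interest[c] for c in item["categories"]),
            reverse=True,
        )


feed = NaiveFeed()
# The user searches "is COVID-19 real" and clicks a sceptical thread.
feed.record_engagement(["covid", "conspiracy"], weight=3.0)

candidates = [
    {"title": "How viruses emerge (documentary)", "categories": ["covid", "science"]},
    {"title": "What are they hiding? (vlog)", "categories": ["covid", "conspiracy"]},
]
print([item["title"] for item in feed.rank(candidates)])
# The sceptical vlog now outranks the documentary: engagement, not accuracy, decided.
```

The point of the sketch is the ranking step: engagement is the only signal in the score, so whatever you clicked last is what gets amplified next, accurate or not.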
To their credit, the big guys running digital platforms have implemented various efforts, strategies and tools within their organizations and products to slow down the spread of disinformation and misinformation. However, that does not negate our responsibility as users for the type of content that is alive and thriving online.
The cookie-cutter development of a conspiracy — I don’t follow conspiracies, why should I care?
An algorithm can’t fact-check a source. It doesn’t feed us fake news on purpose. The algorithm is like a loyal dog that follows our every move and adjusts to what it knows: which keywords we use and whose digital profiles look similar to those of the people already engaging with the content. So if we continue to engage with conspiracies, even just for fun, our online bubble quickly becomes highly catered to these interests.
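As a rough illustration of the “similar digital profile” part, here is a hypothetical sketch of user-to-user similarity (cosine similarity over invented engagement counts; real recommender systems are far more elaborate, but the principle is the same): you get shown what the people who look most like you have already engaged with.

```python
import math

# Hypothetical engagement profiles: topic -> how often each user engaged with it.
profiles = {
    "you":       {"boots": 1, "covid_sceptic": 4, "local_news": 2},
    "neighbour": {"covid_sceptic": 5, "local_news": 3},
    "scientist": {"virology": 6, "local_news": 1},
}

def cosine_similarity(a, b):
    """Cosine similarity between two sparse topic-count vectors."""
    dot = sum(a.get(topic, 0) * b.get(topic, 0) for topic in set(a) | set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# "People like you" = whoever's profile points in the most similar direction.
me = profiles["you"]
closest = max(
    (name for name in profiles if name != "you"),
    key=lambda name: cosine_similarity(me, profiles[name]),
)
print(closest)  # "neighbour" wins, so their sceptical content is what gets pushed to you next
```

Notice that the recommender never needs to understand the content itself; being statistically close to an audience is enough to inherit its feed.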
From the Freemason- or Illuminati-like QAnon to Pizzagate, to the now dominant (not new, but dominant) agenda of anti-vaxxers, the path by which even the “most intelligent” find themselves catching their breath when realizing they’ve fallen for a piece of disinformation is cookie-cutter clear (as outlined in AJ Willingham’s article “How the pandemic and politics gave us a golden age of conspiracy theories”):
1. It starts with a crisis
Pick one; 2020 alone has been an active year.
2. Doubts grow
Disinformation campaigns run rampant in times of crisis. As an example, communities across Ontario are growing more confused and frustrated as the provincial government begins to double down on social-interaction restrictions while keeping public businesses open. Telling people to stay apart and not visit their own families while they can still visit restaurants and casinos seeds further doubt about the seriousness of COVID-19.
3. An easier explanation arises
In an increasingly complex world, we’re all tired and want quick, easy, bite-sized explanations. Combined with shrinking attention spans, headlines have become our primary news source, and details instil doubt. Our response to anything that seems too confusing or takes too much time is automatic: no thank you. The public resorts to the likes of Twitter feeds for its news updates.
4. The believers are connected
Hello social media, my old friend. It’s easier than ever to surround ourselves with those who think like us, and if they don’t agree, the “unfollow/unfriend” button is very easy to find. If only this stayed within the parameters of (very useful) mommy groups on Facebook, but we are now seeing these online interactions spill over into offline life more often, with real-life consequences.
5. Their beliefs are reinforced
Speaking of filter bubbles: today they are stronger than before, and they are only reinforced by those trying to burst them. In a climate where truth seems elusive, the rule of checking whether the information we found holds up against other sources has backfired. The “other sources” remain within our bubble and are likely to reinforce the original story, while making any external opinion or fact seem off-putting.
6. The theories adapt
As long as the core points hold up, the theories stay nimble and work to always prove their believers right. In today’s world, admitting that we’re wrong about something has become the ultimate badge of ignorance (spoiler: it’s not; it’s okay to admit you’ve learned more and changed your mind).
7. Damage is done
We are currently in the depths of a global debate — is he or is he not actually sick? My take — it doesn’t matter. What matters is the end message and the headline that will spread across blogs and major online accounts. Will it be that COVID-19 is real but not that big of an issue (see Brazilian President Jair Bolsonaro)? Or, will it be that COVID-19 is real and a serious (inter)national crisis (see British Prime Minister Boris Johnson)?
To wrap it up… is there a solution?
The world of conspiracy-led disinformation is no longer a campaign driven solely from overseas. My once-hopeful belief that we would find a strategy to combat the sources of disinformation and misinformation is beginning to vanish. Although the continuous funding of propaganda traceable to a single source hasn’t gone away, the roots of the messaging have seeped so deep into our systems and digital platforms that we ourselves have now become the sources of this chaos.
It’s now all about how we react —
- do we continue to dismiss conspiracy-driven disinformation and misinformation as silly and ridiculous?
- do we get defensive when confronted with information that might seem too complex and ever-changing?
OR,
- do we take the extra steps to get educated and help prevent others from going down the same route?
My goal isn’t to keep anyone up at night over this, but I do hope that by becoming more aware of the reality we’re living in, we can all avoid becoming super-spreaders of disinformation and misinformation, and slow the spreading mistrust of credible sources.