Since the internet’s advent, conspiracy theories have acquired followings online. Now, in the era of social media, people use platforms like Facebook, Twitter and YouTube to spread disinformation and misinformation. Instagram, the Facebook-owned image platform where influencers tout luxury, beauty and consumer culture, has also become an online home for conspiracies. And lately, one has been particularly prolific: QAnon.
Researchers have seen a significant uptick in online discussions related to QAnon — a far-right conspiracy theory — across multiple digital platforms this year. In March, membership within QAnon Facebook groups increased by 120 percent; one journalist tracking the impact of QAnon found that between January and August, the biggest Instagram “Q” accounts “generated 63 million interactions and 133 million video views.”
The platforms are now taking action. Facebook, which acquired Instagram for $1 billion in 2012, cracked down on October 6, saying, “we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.” On October 15, YouTube announced in a blog post that it would “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” TikTok — which has 100 million monthly active American users — as well as several smaller platforms, including Pinterest, Etsy, Peloton and Triller, have also taken a stance against QAnon.
Still, conspiracies and disinformation have a way of finding new audiences. In recent months, QAnon content has become more prevalent on Instagram, a platform used predominantly by women, who appear to be the new target of this misinformation, according to experts.
What is disinformation and how is it different from misinformation?
Misinformation is the inadvertent sharing of false information. Disinformation is the deliberate creation and sharing of information known to be false.
Claire Wardle — the founder of First Draft, a nonprofit focused on research and practice to address mis- and disinformation — identified seven distinct types of problematic content in the information ecosystem. Each type sits on a scale that loosely measures the intent to deceive, from satire or parody to intentionally fabricated content.
The deep anxiety and economic insecurity catalyzed by the coronavirus pandemic, coupled with the unrest surrounding the summer’s protests, left people especially vulnerable to disinformation. Conspiracy theorists try to exploit those vulnerabilities, according to Rich Hanley, an associate journalism professor at Quinnipiac University.
“In a world that doesn’t make sense, if you can create a narrative that seems to be credible, goes to people’s emotional core and has a kernel of truth within it, then that narrative is going to take off,” said Hanley, who studies fake news dissemination and social media bots.
What is QAnon?
QAnon is a “wide-ranging conspiracy theory that claims that an elite group of child-trafficking pedophiles have been ruling the world for a number of decades and that President Donald Trump has a secret plan in place to bring this group to justice,” according to a 2020 report published by the Institute for Strategic Dialogue, a London-based think tank that tracks online extremism.
An infamous conspiracy theory considered a predecessor to QAnon was Pizzagate, which falsely claimed that a pizza restaurant in Washington, D.C., secretly harbored a pedophile ring. In 2016, a man who believed the conspiracy theory was real went to the restaurant with an AR-15.
QAnon started in October 2017 when someone identified as “Q” began posting a series of cryptic messages on 4chan, an anonymous online forum used mostly by young men. Q claimed to have high-level security clearance and information about Trump’s ongoing battle against “the deep state,” the Institute for Strategic Dialogue report said. One or more users claiming to be Q continued to post thousands of times over the following years on the site, which has long been a hotbed of hoaxes.
Researchers observed a significant increase in online QAnon discussions in 2020 amid the coronavirus pandemic, the nationwide Black Lives Matter protests and the run-up to the presidential election. The theory now includes anti-vaccine, anti-5G, antisemitic and anti-migrant rhetoric, all in addition to the core belief that a consortium of elites is supposedly trafficking and sacrificing children.
“It’s kind of a choose your own adventure,” said Melissa Ryan, who helps policymakers and institutions combat extremism. “Whatever beliefs you’re inclined to believe, someone at QAnon will tell you that’s right.”
From October 2017 to June 2020, the Institute for Strategic Dialogue recorded more than 69 million tweets, 487,000 Facebook posts and 281,000 Instagram posts mentioning QAnon-related hashtags and phrases.
“I would suspect that a lot of right-wing content is over-amplified on how many people believe it,” Ryan said. “The radicalization potential is still quite high. I would say you probably have a neighbor that is QAnon, but not all your neighbors are QAnon.”
Are conspiracy theories new?
Conspiracy theorists have always existed, but it used to take disinformation much longer to spread, Hanley said.
“If you were a Soviet intelligence officer during the Cold War and you wanted to spread misinformation about AIDS and blame the U.S. for it, you basically had to start a fake newspaper in some out-of-the-way country, publish a story about American scientists working for the Department of Defense, and wait two to three years for that fake story to wind its way to Europe and then the United States,” Hanley said. “Now, you can do it virtually in seconds.”
The theory that an elite group of people is plotting to control the world by drinking the blood of children has roots in antisemitic conspiracy theories that date back hundreds of years, Hanley said. An iteration of that narrative also circulated during the 19th century, when Native Americans were said to be kidnapping White children.
“Today’s version spreads so efficiently and effortlessly,” Hanley said. “And it’s believed by more people than was physically possible before.”
Why is disinformation spreading on Instagram?
A 2019 report published by the New York University Stern Center for Business and Human Rights predicted that Instagram could be a magnet for disinformation and misinformation in 2020 — more so than Facebook, Twitter or YouTube.
Instagram, which is most popular with 18- to 29-year-olds, played “a much bigger role in Russia’s 2016 election manipulation than most people realize,” the report said. The image-based platform currently has more than one billion active users.
Hanley said more and more Instagram users, more than half of whom are women, are spreading QAnon conspiracies — many unintentionally.
It appears that several conspiracies migrated from 4chan to Instagram via YouTube beauty and wellness influencers who used to share content about magic formulas, oils and crystals.
Then, earlier this summer, QAnon gained additional followers after it hijacked #SaveTheChildren, a hashtag tied to real anti-human-trafficking causes. Co-opting the hashtag amplified Q messaging and exposed it to American moms.
A lot of the content isn’t explicitly QAnon, but is connected to the conspiracy when traced back to its source, Hanley added.
“The QAnon stuff infiltrated Instagram and seeped into the suburban consciousness of American women to a certain extent, and they bought into it,” Hanley said. “The key is the children.”
“Human trafficking is an issue, and one that needs to be taken seriously,” Hanley said. “What QAnon is doing is riding on that reality to create an alternate reality where there’s an elite group of wealthy globalists who are trafficking in the blood of children. There is no evidence, but there’s enough truth in the stories: the vulnerability of the young.”
Haiyan Jia, an assistant journalism professor at Lehigh University, said visual content — Instagram’s primary method of communication — tends to have a stronger, more emotional appeal. People who make judgments based on their emotions tend to do so to the detriment of logic and rationality.
It’s not just Instagram, though. QAnon theories can be found across many platforms, including more niche ones like Nextdoor and Peloton. If there’s a social media application, there will be disinformation associated with it, Hanley said.
Information on social media travels through social circles, and it is not always clear where posts originated, which makes their credibility harder to assess, Jia said.
“We rely on our friends and family to make the judgment instead of analyzing the actual information,” said Jia, who researches the psychological and social effects of communication technology. “If you trust the person who shared, you are more likely to trust the information.”
What does this mean for the 2020 election?
Less than 50 days before the election, the Federal Bureau of Investigation and the Cybersecurity and Infrastructure Security Agency issued an announcement warning that “foreign actors and cybercriminals” might attempt to spread disinformation on social media to “discredit the electoral process and undermine confidence in U.S. democratic institutions.” In March, the FBI had characterized conspiracy theory-driven extremists as domestic terrorists, specifically citing QAnon followers.
Helen Lee Bouygues, a fake news expert and founder of the Reboot Foundation, said the general population gets about 90 percent of its news from social media. The more that people use social media, the worse their news judgment becomes, according to her study published last month. This was true across age, education and political ideology. The majority of those surveyed expressed confidence in their ability to identify fake news, and yet only 1 percent of the 1,000 participants used actual fact-checking techniques.
Hanley said the opportunity for election manipulation has only increased since the last presidential race.
“The 2016 election was like the junior varsity game, but 2020 is the varsity game,” Hanley said. “The players have taken the field and the number of people who either create or disseminate disinformation is expanding wildly.”
Nearly 90 percent of all Q-related hashtags between October 2017 and October 2019 originated from the United States, according to the ISD report. Dozens of political candidates linked to QAnon were running for Congress, state legislatures and local offices in 2020.
When asked to disavow QAnon in its entirety at a town hall on October 15, Trump refused. The president has never explicitly said he believes in the conspiracy theory, but he has praised QAnon supporters in the past, saying he heard they “love our country” and “are very strongly against pedophilia.” He has also amplified QAnon-promoting accounts on Twitter, and QAnon merchandise, symbols and slogans can often be found at his rallies.
“There is a convergence between QAnon, the #SaveTheChildren hashtag and the president’s supporters,” Hanley said. “It is clearly something that the Democrats need to be aware of and figure out how to neutralize, which is very hard to do.”
When it comes to disinformation on Instagram, the targets increasingly appear to be women, according to Alexandra Middlewood, an assistant professor at Wichita State University.
Suburban women are a growing part of the electorate and, in recent presidential elections, their support has been critical for electoral success. The Trump and Biden campaigns know that White suburban women are more likely than their male counterparts to change their minds about the president. A shift of several percentage points among them from 2016 to 2020 could decide whether Trump is re-elected.
“What we have seen most often, particularly since 2016, is the use of dis- and misinformation that follows a narrative certain groups already believe,” Middlewood said.
A voter who is presented with information that already matches her political attitudes, for example, will be more convinced that this information is true — whether it is or not. Middlewood said research has shown that many people share content on their social media accounts “to strengthen their social identity,” not to communicate whether something is true or false.
The spread of disinformation isn’t likely to sway undecided voters; instead, it is used to reinforce beliefs held by members of the same political party. (Partisanship is the strongest predictor of vote choice, Middlewood said.)
“The disinformation about voter fraud in particular is seemingly setting the stage for the president to challenge electoral results in November should he lose reelection,” Middlewood said. “And he will have a ready and willing base of support behind him should he do so.”
What can be done?
In early October, Facebook announced that it would remove any Facebook pages, groups and Instagram accounts representing QAnon — whether or not they contain violent content.
“We began directing people to credible child safety resources when they search for certain child safety hashtags last week — and we continue to work with external experts to address QAnon supporters using the issue of child safety to recruit and organize,” the company said.
Despite the platform’s efforts, 64 percent of Americans believe Facebook is at least somewhat responsible for spreading disinformation and 46 percent think Instagram is at least somewhat responsible for spreading disinformation, according to the NYU study.
Bouygues said the responsibility also falls on the government and on individuals, who need to build media literacy, think critically and fact-check information more thoroughly. The government should sponsor this type of education in schools and finance more institutional media sources, she said.
First Draft provides resources, including checklists and recommended tips, to help journalists and anyone who is interested in verifying information. According to the nonprofit, there are five questions that should be answered when considering online content: Are you looking at the original piece of content, account or article? Who captured it? When was it captured? Where was it captured? Why was it captured?
“We need to be self-aware that we succumb to fake news,” Bouygues said.
Bouygues, whose study found that people became better at identifying fake news after reading an article on how to distinguish legitimate news from fake news, was optimistic that people can learn to combat disinformation.
“We just need constant reminders,” Bouygues said. “It’s like learning a second language.”