Dark side of the moon: Why lunar landing conspiracies flourish online


This article is part of Apollo: A Lunar Legacy, a multi-part series that explores the technological advances behind Apollo 11, their influence on modern day, and what’s next for the moon.

For anyone who has taken an American history class, the moon landing was one of those pivotal chapters that defined the story of the nation, and even the world. It was, as Neil Armstrong described it in the moment, “one giant leap for mankind” — and according to some people, it never happened.

The idea that the moon landing was a hoax got its first (slight) brush with the mainstream in 1976, when Bill Kaysing, a former technical writer at Rocketdyne, self-published a book titled We Never Went to the Moon. Perhaps because the Watergate scandal had left America distrustful of the government, and perhaps because Kaysing’s time at Rocketdyne (which supplied NASA with engines for the Saturn V rocket) lent him an air of insider credibility, the book struck a chord with a small but fervent slice of the population (a 2001 Gallup poll estimated that 6 percent of Americans doubted the authenticity of the lunar landing).

Kaysing’s evidence is spurious, at best. He includes a picture from the landing with the pitch-black void in the background and a caption asking, “Stars? Where are the STARS?” As astronomer Phil Plait explained in a thorough takedown of moon landing conspiracy theories, the cameras on the lunar surface were set to a fast exposure because of the bright lighting of the landing site, and so couldn’t capture the faint light of distant stars. In a 1994 Wired feature on the then-current state of moon landing conspiracy theories, Oscar-winning special effects guru Dennis Muren dissected the claim that the landing was filmed in a studio; Kaysing responded, “Perhaps this guy [Muren] was part of the cover-up. Anything is possible.” Anything, it seems, except men walking on the moon.

Fifty years after the event, conspiracy theories about the moon landing persist, and conspiracy theories in general are thriving. Pro athletes like Kyrie Irving and Steph Curry have entertained the ideas that the Earth might be flat or that the moon landing was staged. A theory that a pizza parlor was a front for a child sex-slave operation led an armed man to enter the restaurant and demand the release of the children. Conspiracy theories that vaccines cause health problems have spurred a decline in vaccinations and a resurgence of measles.

I’ve been getting a lot of bs for this. But I don’t believe we (USA) have landed on the moon. There is a lot of evidence that points towards a fake moon landing

— youngjerry (@_young_jerry) July 16, 2019

How is it possible, in an age when so many people have unfettered access to so much information, that these ideas continue to spread? While the internet offers unprecedented access to information, the particular nature of social media allows misguided conspiracy theories to spread just as easily.

Social media has lowered the barrier to entry for content

Message boards and forums have been safe havens for esoteric communities, including conspiracy theorists, for as long as the internet has been around. In the ‘90s, one imagines, PizzaGate might have been limited to a dingy Geocities page buried in the dark corners of the web. Recently, however, these theories have spread like a plague across the massive social media platforms — Facebook, Twitter, YouTube, etc. — that now dominate online life.

If you need proof that these platforms have been vectors for the contagion, look no further than their own confessions. Facebook, having taken criticism for its role in the spread of “fake news” during the 2016 presidential election, has made public efforts to combat misinformation on the site. YouTube has long been a platform for conspiracy theorists to post videos pushing their views, and the site’s algorithms helped circulate those videos to general users in the form of recommendations; the company recently vowed to stem the tide of blatantly false information.

What is it about social media that makes it such fertile ground for conspiracies to grow? First, these platforms provide a low barrier to entry for anyone who wants to disseminate content. When Bill Kaysing wanted to put his ideas before the public, he had to self-publish, without the aid of a publishing house to handle production, distribution, and marketing. Nowadays, anyone with an internet connection can upload a video to YouTube. In 2019, if you want to make the case that the moon landing was a hoax or that the course of history is directed by a giant, sentient crystal, it’s simple to upload a video and put it in front of thousands or even millions of people.

That said, while content creators might produce conspiracy theories, the consumers of that content are just as important in spreading it. If I had bought a copy of Kaysing’s book in the ‘70s, I would have had just that: one copy. I could loan it to friends, or maybe buy ten more copies to gift at a Christmas party — probably to pained grins — but ultimately my reach would be limited.

Today, if I read an article about how vaccines cause autism and I’m inclined to believe it, I could share the article online, spreading it out amongst my social media following. Users on Facebook and Twitter who find themselves enchanted by a conspiratorial work can distribute it to everyone they know, and all of those people can distribute it to everyone they know, a viral transmission with no input or oversight. Social media cut gatekeepers out of the content business, without considering whether barbarians might show up at the gates.

Social media encourages group identity … and group conflict

The beautiful promise of social media platforms is that they bring people together. Former classmates or family members can share the daily details of their lives, whether miles or continents apart. People who feel intellectually or culturally isolated where they live can seek out like-minded individuals all over the world. Emotional bonding and intellectual exchange were the vision of the Facebook era, and for a while in the early 2010s, that vision seemed to be panning out. When the Arab Spring was in full bloom, for example, commentators lauded the power of social media in helping protestors mobilize.

But just as social media helped the youth of authoritarian countries share ideas and organize protests, it has also been a boon for the conspiracy-inclined looking to find fellow believers. The mechanisms of social media don’t simply help people find the like-minded and share ideas with them. They encourage people to support ideas they already agree with and condemn those they don’t.

Most major social media platforms provide users with means to commend or condemn content. The purest expression of this is Reddit’s upvotes and downvotes; with just a click of an arrow pointing up or down, a user can express their agreement or disagreement with a post, and comments that receive a significant number of downvotes might even be “hidden” from view. Facebook allows users to react to posts, and while there is no “thumbs down” available, the “angry face” can fill that niche.

Other platforms like Twitter and Instagram might limit the available reactions to “likes,” but the ultimate consequence of these systems is a binary response to content, either approving or disapproving. Even though Twitter does not include a button for disliking tweets, the Twitter community has developed its own ways to signal disapproval: “Yikes,” “Oof,” and “Sir, this is a Wendy’s” are among the stock phrases used to quickly signal disapproval of a post.

As a result, nuanced discussion on social media is a rarity; rigid ideological rallying is the coin of the realm, and that’s the type of environment in which conspiracy theorists thrive, because they require enemies. Conspiracies aren’t merely secret activities, but ones carried out by nefarious powers, against whom the conspiracy theorists must be vigilant.

In one of the seminal texts on the history of conspiracy theories, The Paranoid Style in American Politics, historian Richard Hofstadter notes that:

The paranoid spokesman sees the fate of conspiracy in apocalyptic terms—he traffics in the birth and death of whole worlds, whole political orders, whole systems of human values. He is always manning the barricades of civilization. He constantly lives at a turning point. Like religious millennialists he expresses the anxiety of those who are living through the last days and he is sometimes disposed to set a date for the apocalypse.

Social media is a battlefield, and that setting suits conspiracy theories just fine. To see this in action, one need only browse some conspiracy communities online. In one anti-vaccination Facebook group (with 36,000 members), the posts don’t merely claim that vaccines are ineffective or even unsafe, but accuse vaccines of being a eugenics program run by transhumanists pursuing immortality. The “vaccine industry” works not only through “propaganda and censorship” but also through “coercion and the threat of violence.” Posts expressed fear that police or military officials might come to people’s houses to force vaccinations on them. The news that Facebook would crack down on miracle cures and vaccine misinformation was met with cries of censorship.

Reddit has been a popular site for conspiracy communities. The general /r/conspiracy subreddit currently has 895,000 subscribers, while more specific communities like /r/MandelaEffect and /r/Retconned (both focused on the Mandela Effect) manage 137,000 and 32,000 members, respectively. The site is also home to some of the more fervent conspiracy rhetoric: There are entire subreddits, such as /r/WatchRedditDie and /r/SubredditCancer, dedicated to the idea that Reddit’s admins or other insidious powers are manipulating the site to advance agendas. When a subreddit gets quarantined or banned for violating Reddit’s site-wide policies, it’s seen as evidence of a larger plan to stifle certain views.

The genie is out of the bottle

Social media platforms are now making an effort to regulate the spread of misinformation on their sites, but even if they tweak algorithms to keep conspiracy theories from reaching mainstream users, the enclaves of conspiracy thinking are already established. Society is in a strange age where reality is difficult to agree on, and it’s only going to get stranger. Machine learning is making it easier to create convincing, fake videos (deepfakes) that could erode people’s trust in reality even more. In this future of unreality, people will seek something to latch on to, and conspiracy theories will offer community, and a comforting sense of order in a time of chaos.