In 2016, during the run-up to the US Presidential election, a story went viral on Facebook. It spread like wildfire, eventually accumulating over 900,000 shares and becoming one of the most shared political news stories of the year: the claim that Pope Francis had endorsed Donald Trump for president.
The problem with this story was that it was false. Fake. Incorrect. A lie. But this disconnection from the truth did not stop it from getting more social media uptake, by an order of magnitude, than any story posted by the New York Times in the same year.
Trump went on to win the 2016 election by a margin of about 70,000 votes across three swing states. Now ask, how many people were both exposed to this story and lived in those three states?
A 2017 analysis by BuzzFeed News found that, during the final months of the election, the total Facebook engagement for the top 20 fake news stories was higher than the total engagement for verified news, weighing in at 8.7 million shares, reactions, and comments.
How did we get here? Let’s take a trip down memory lane.
Stalkerbook
I remember where I was when the world changed. I was a senior at Boston University. As a Boston-area school, we were one of the first communities able to make an account on this up-and-coming social network. I walked down into the basement of the IT building to sit at my desk in the Personal Computing Support Center, and the place was abuzz about “new Facebook.” Up until that day, getting information from Facebook was a “pull” process; to see what your friend was up to, you visited their profile and looked. But that day, we logged in to find this strange new landing page called “the Feed.” Now, browsing Facebook was a “push” process; its algorithm would see your friends’ activity and actively tell you what was new. Steve liked a post. Tom uploaded a picture. Jenny is no longer in a relationship.
“Stalkerbook,” they called it derisively. It seemed too foreign and off-putting to be fed this distillation of our social network’s activity via a software construct. This simple distinction is what allowed the current misinformation miasma to spread; an algorithm is putting data in front of its users, curating the experience for them. This same architectural model would be employed by Twitter, YouTube, Instagram, and pretty much every new social media platform to show up after 2010; the user experience is directed by the algorithm, not by the user.
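The pull-versus-push distinction can be made concrete with a few lines of code. This is a minimal sketch, not anything resembling a real platform’s implementation; every name here is hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    posts: list = field(default_factory=list)
    friends: list = field(default_factory=list)

def pull_profile(user: User) -> list:
    """Old model: the reader chooses one profile to visit and sees
    only that user's posts. The reader directs the experience."""
    return list(user.posts)

def push_feed(user: User) -> list:
    """Feed model: software gathers every friend's activity and decides
    what to surface, newest first. The algorithm directs the experience."""
    activity = [post for friend in user.friends for post in friend.posts]
    return sorted(activity, key=lambda post: post["time"], reverse=True)
```

The crucial shift is in `push_feed`: the selection and ordering of what the user sees has moved from the user’s own navigation into a function someone else wrote, and controls.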
Let us dispel the framing that social media platforms are passive observers here. It is vital for their business model that we accept this framing, but we can just as easily not. These platforms are for-profit companies, and they are making a choice.
The profitability of any surveillance-based advertising platform is proportional to its users’ “engagement.” The longer users stare at the screen, the more attractive that platform becomes for somebody looking to sell an ad. Outrage and shock drive engagement; reason and veracity do not. If the only obligation is to maximize shareholder value, this is the result you get.
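The incentive structure above can be sketched as a toy ranking function. To be clear, the scoring weights here are invented for illustration and are not taken from any real platform; the point is only what the objective function contains, and what it omits.

```python
def predicted_engagement(post: dict) -> float:
    """Hypothetical score: raw interaction counts, amplified by how
    emotionally charged the post is ("outrage" in the range 0 to 1)."""
    base = post["shares"] + post["comments"]
    outrage_multiplier = 1.0 + post["outrage"]
    return base * outrage_multiplier

def rank_feed(posts: list) -> list:
    # Note what is absent: nothing in this objective asks whether
    # a post is true. Veracity simply is not a term in the function.
    return sorted(posts, key=predicted_engagement, reverse=True)
```

Under any objective shaped like this, an outrageous fabrication can outrank a sober, accurate story that earned more organic interactions, because the multiplier rewards shock rather than truth.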
We deserve, and require, better.
Causality
The manifesto of the digital age was founded on the power of connectivity. In my youth, I believed this with every fibre of my being; if humans are readily connected to one another, information can flow freely.
- Information becomes knowledge
- Knowledge becomes understanding
- Understanding becomes compassion
- Compassion begets harmony.
Apparently a large number of Silicon Valley billionaires used to feel the same way. It looks nice on paper, I’ll give it that. But here’s the rub: what happens when you start not with information, but with misinformation?
- Misinformation becomes lies
- Lies become conspiracies
- Conspiracies become hate
- Hate begets violence.
If one chain is true, so is the other. If humans are readily connected to one another by bad-faith networks, misinformation propagates faster than truth. We’re verifying this hypothesis in real time, with quantifiable data.
GIGO
Information science has a rather tautological, yet still useful adage: Garbage In, Garbage Out. The implication is simply that no system, no matter how robust, no matter how well-designed, can take corrupt input and generate valid output. You can prepare for it, create redundancy to survive it and discard it, but you cannot create gold from lead.
Charles Babbage summed it up better: “On two occasions I have been asked, ‘Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?’ … I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.”
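Babbage’s point can be demonstrated in miniature. The averaging routine below is correct by any standard, yet a single corrupted input still yields a wildly wrong answer; the numbers are arbitrary, chosen only to make the effect obvious.

```python
def average(values: list) -> float:
    """A perfectly correct implementation of the arithmetic mean."""
    return sum(values) / len(values)

clean = [70, 72, 74]       # valid readings
corrupt = [70, 72, 9999]   # one garbage reading slips in

# Same well-designed system, radically different quality of output:
# average(clean) is a sensible 72; average(corrupt) is useless.
```

No amount of rigor inside `average` can rescue the second result. The only defenses are validating the input before it enters the system, or rejecting it outright.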
Pew Research found that in 2024, 54% of American adults get at least some of their news from social media. These platforms are, without exception, being actively exploited to propagate misinformation at an industrial scale. Hell, the richest man in the world personally spent more money than a thousand Americans will see in their entire lifetimes combined to buy one of the world’s most prolific misinformation engines. How many people are going to the polls and filling out ballots having been affected by it? At what point is democracy just drinking from a poisoned well, and wondering why it is dying? We have, with this advanced technology, created a public that can be more quickly and fully misinformed than any society in human history.
Democracy is, first and foremost, a system. An increasingly significant input to this system is our media environment, comprised of all of our aggregate news sources and communication platforms. Misinformation is garbage going in. What we are seeing right now is the garbage coming out.
Far too much ink has been spilled over the significance of Trump winning the popular vote for the first time. I reject this premise. This moment is not a mandate. When a polity has been sufficiently divorced from reality, democracy is being driven by invalid input, and will generate invalid output. I am becoming increasingly convinced that we can’t get better results until we find a way to collectively rectify this situation. We can blame people for hating people who are different from them, or we can ask ourselves how they were led to believe that voting based on hate would improve their lives. We can scratch our heads when we see approval numbers for corrupt politicians stay above water, or we can ask why so few people know about the corruption to begin with. We can watch in horror as we see our friends and loved ones fall down conspiracy theory rabbit holes, or we can ask who stands to gain from them doing so.
Henry David Thoreau once said, “there are a thousand hacking at the branches of evil to one who is striking at the root.” I am starting to believe this is the root of our present crisis: the procedural propagation of malicious lies across the internet. How we strike at it, I’m still working out.