Growth.Design Case Study #024
Story Duration: 4 min
Facebook is a polarized platform.
This became even more obvious with the big events of the last few years (elections, COVID-19, etc).
In fact—
Facebook's misinformation has TRIPLED since 2016.
There are five main reasons for this.
We call them—
—The Five A's of Misinformation:
1. 🤖 Algorithm
2. 👁 Availability
3. ✋ Ability
4. 📣 Amplification
5. 🔥 Ambiguity
Let's start with the first "A"…
Facebook is optimized for high engagement. But—
—since we're more likely to interact with content that triggers strong emotions like fear, disgust and anger…
…feed algorithms rank those posts higher (even if some of those stories are false).
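To make the idea concrete, here is a minimal sketch of engagement-optimized ranking. Everything in it is hypothetical (the post fields, the weights) — it is not Facebook's actual algorithm — but it shows the key property: truthfulness is simply not an input to the score.

```python
# Hypothetical sketch of engagement-optimized ranking (NOT Facebook's
# real algorithm). Posts that provoke strong reactions score higher,
# regardless of whether the story is true.
posts = [
    {"title": "Calm policy recap",       "reactions": 40,  "shares": 5,   "true": True},
    {"title": "SHOCKING claim about X!", "reactions": 900, "shares": 400, "true": False},
]

def engagement_score(post):
    # Made-up weights; the point is that post["true"] is never consulted.
    return post["reactions"] + 3 * post["shares"]

feed = sorted(posts, key=engagement_score, reverse=True)
# The false but shocking post ranks first.
print([p["title"] for p in feed])
```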
Now, when we're emotional, "critical thinking" isn't our brain's top priority.
So while we explore our feeds…
A recent study from MIT showed that false news travels 600% faster than true stories on social media.1
1MIT, Study on the spread of false news (2018)
…Facebook shows us shocking news that supports our political views.
That's because Facebook knows our brain loves information that confirms what we already believe.
People tend to search for, interpret, prefer, and recall information in a way that reinforces their personal beliefs.1
Since Facebook optimizes for what users want to see, it creates what experts call "filter bubbles".2
1Growth.Design, List of Cognitive Biases (2020)
Plus, the feed is optimized for interaction.
That makes it easy to amplify any message with a simple tap…
Every action has a consequence. And each consequence has another consequence (sometimes undesirable).
These are called Second-Order Effects.1
For example:
…so much so that 59% of links shared on social media are shared without ever being read.
It's called
"🙈 Blind Sharing".
The growth rate of trends increases in proportion to the number of other people who have already adopted them.1
1Psychology Today, Why We Are So Easily Manipulated
And the more people share that content…
—the more likely other people are to share it as well.
Oh—and you might think that this "blood-drinking" ad example is ridiculous, right?
Well, independent fact-checkers have reviewed 150 million Facebook posts so far, but…
…Facebook decided that they should not fact-check politicians' ads.
In other words, that weird ad would be tolerated.
In addition, opinion and speech from politicians are not eligible to be fact-checked.
And when it comes to content moderation, we've seen several recent examples where that backfired massively.
Trying to censor information can have the unintended consequence of further publicizing that information1… often making things worse for the censor.
For example, in Oct 2020, Twitter's censorship of the NYPost article about Biden ended up doubling its viral reach.2
1Wikipedia, Streisand Effect (2020)
Censorship is a really slippery slope because you don't want Facebook to decide what's true or false.
(…and neither does Facebook, because they don't want to be liable)
So think about it…
…how would you minimize misinformation?
Well…
Remember the 59% of "🙈 Blind Shares"?
What if—
Facebook already tracks if you've clicked an article and how long you've read it. Why not use that data for good? (vs mainly for advertising)
A Forbes editor suggested a similar concept a few years ago.1
It has since been supported and iterated on by many, including an approach with an even higher barrier to sharing by the media literacy project ThinkFirst News.2
1Forbes, What If Facebook And Twitter Made You Read An Article Before You Could Share It? (2017)
REDESIGN
—when you're about to share a post without reading it first…
…Facebook would simply encourage you to read the article?
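The nudge above can be sketched in a few lines. This is a hypothetical illustration, not a real Facebook API: the inputs (`has_opened_link`, `seconds_on_article`) stand in for the click-and-read data the case study says the platform already collects, and the threshold is made up.

```python
# Hypothetical sketch of a "read before you share" nudge.
# Assumes the platform already tracks whether the user opened the
# link and how long they spent on the article (made-up inputs).
MIN_READ_SECONDS = 10  # arbitrary threshold for this illustration

def share_flow(has_opened_link, seconds_on_article):
    """Return which UI step to show when the user taps 'Share'."""
    if not has_opened_link or seconds_on_article < MIN_READ_SECONDS:
        # Friction, not a block: the user can still share after the prompt.
        return "prompt_read_first"
    return "share_directly"

print(share_flow(False, 0))   # blind share -> nudge to read first
print(share_flow(True, 45))   # article was read -> share directly
```

Note the design choice: the prompt adds friction but never prevents sharing, which keeps the user in control while reducing blind shares.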
In a world of growing misinformation, supporting the truth should be everyone's responsibility.
REDESIGN
Oh, and I first thought that Facebook would never add this kind of friction because they wouldn't want to hurt their revenues. But…
…the same could've been said of Instagram's "You're All Caught Up":
—and Facebook implemented it anyway.
(so there's hope!)
By the way, between the time we first drafted this case study and the time we released it…
…Twitter released a new sharing experience designed to minimize misinformation.
It's similar to the one I showed you earlier. And in their tests…
…33% more people opened an article before retweeting it.
That means less blind sharing, which reduces the reach of misinformation by millions.
Small change. Big impact.
As a product community, we have the responsibility to encourage behaviors that are good for our users…
…and for society. 🙏
Q1: What did you think of the "read before you share" solution?
Q2: How would you improve it?
We're just one reply away on Twitter.
PS: Yes, we reply to everyone.
You completed Growth.Design's Case Study #024:
"The Psychology of Misinformation on Facebook"