On the surface, it seems like social media has the boundless potential to expand our world, connecting us to ideas and people we otherwise would never have found. However, a new study claims just the opposite: Social media actually isolates us, reinforcing confirmation biases and creating echo chambers where old, and sometimes erroneous, information is simply regurgitated again and again.
If it sounds bleak, it’s because it kind of is.
The findings were published in the Proceedings of the National Academy of Sciences. Using data modeling, a team of researchers from Italy mapped the spread of two types of content: conspiracy theories and scientific information.
“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest. In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters,” the paper concludes.
In other words, you and your friends are all sharing the same stuff, even if it’s bunk, because you think alike and your tightly defined exchange of ideas doesn’t let anything new or challenging flow in.
What this means for “fake news”
Alessandro Bessi, a postdoctoral researcher with the Information Science Institute at the University of Southern California, co-authored the paper. He says the point of the study was really to investigate how and why misinformation spreads online.
He says the team got interested in the phenomenon after the World Economic Forum listed massive digital misinformation as one of the main threats to modern society.
“Our analysis showed that two well-shaped, highly segregated, and mostly non-interacting communities exist around scientific and conspiracy-like topics,” Bessi told CNN. “Users show a tendency to search for, interpret, and recall information that confirms their pre-existing beliefs.” This is called “confirmation bias,” and Bessi says it’s actually one of the main motivations for sharing content.
So instead of sharing to challenge or inform, social media users are more likely to share ideas their social groups already accept, in order to reinforce them or signal agreement. This means misinformation, a more accurate term than “fake news,” can rattle around unchecked.
“Indeed, we found that conspiracy-like claims spread only inside the echo chambers of users that usually support alternative sources of information and distrust official and mainstream news,” Bessi says.
What can we do about it?
Even if you pride yourself on avoiding misinformation and think you’re having open, accepting conversations online, Bessi cautions that we’re all subject to confirmation bias on some level.
“If we see something that confirms our ideas, we are prone to like and share it. Moreover, we have limited cognitive resources, limited attention, and a limited amount of time.”
This can lead to reckless sharing — we sometimes share something without really examining what it is.
“For example, I may share content just because it has been published by a friend I trust and whose opinions are close to mine,” Bessi says.
In the future, Bessi says, there may be programs or algorithms that can help clean up misinformation. For now, he recommends a more analog approach: Do your own fact-checking — and soul-searching — before you share.