Australasian Science: Australia's authority on science since 1938

Bursting the News Filter Bubble

Credit: niroworld/Adobe

By Simon Knight

Online technologies can create echo chambers that reinforce our world views, but does this necessarily mean we need to open ourselves up to alternative facts?

The full text of this article can be purchased from Informit.

After the US presidential election, Google searches for Breitbart News peaked as people, many of whom weren’t Donald Trump supporters, took to the right-wing website to try to understand the views it was espousing.

Since then there have been frequent calls for more of us to step out of our social media echo chambers and to “burst the filter bubble”. That bubble forms when social media feeds and search engine personalisation emphasise content similar to what you have viewed or liked before, creating echo chambers that reinforce rather than challenge particular views. Your Facebook feed then only exposes you to views you already agree with, and to information that supports those views, leading to a general deterioration in public and political debate as we become unable or unwilling to engage with different perspectives.

If we accept this argument, then Facebook creates an information-access problem, insulating users from the diverse perspectives that would improve political discourse. But it’s hard to test this claim empirically: companies control their data, users typically don’t state their politics explicitly, and the impact of proprietary algorithms can only be guessed at.

The research that has been conducted – mostly in the US – paints a complex picture of the role of technology in reinforcing cognitive bias....
