Echo? Step out of the chamber

The echo chamber is harmful to the public’s awareness of important issues

Surit Doss Published 23.07.19, 05:19 PM

When the results of the recent general elections came in, some people, including those in the media, were shocked. Some of them expected the Narendra Modi government to come back to power but with fewer seats; some even predicted a landslide in favour of the Congress. But how could their understanding of the situation be so wrong? How could reports from the field be so misleading? How could even the urban class not see beyond the propaganda that they were being fed?

The answer lies in a phenomenon called the echo chamber effect. It creates an information bubble around users so that they are exposed only to articles and information that support their previously held views. This information bubble, or filter bubble, is curated for you by the algorithms of social media companies such as Facebook, Google, WhatsApp, LinkedIn and Twitter, from data acquired from your search history, your past click behaviour, your location and even the type of computer you use. The term "filter bubble" was coined by Eli Pariser, who has written a book on the subject.

All you have to do is click "like" on a news item on Facebook, search for something specific on Google, or follow a particular kind of people on Twitter, and these social media giants will see to it that you are constantly shown content that reinforces your current political or social views, without challenging you to think differently by exposing you to varied facts and information.

The cause of the echo chamber effect is the machine learning algorithms these social media companies use to serve their users content tailored to their interests. Personalised algorithms sound good on the surface: users can get straight to the news that interests them without first wading through what is happening in Syria, China, North Korea or South Africa, or the latest on Kim Kardashian, before arriving at what is happening in their own country. As Mark Zuckerberg put it, "A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa." These algorithms also allow companies to send users advertisements that suit their preferences, thereby securing their advertising revenue.
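
To make the mechanism concrete, here is a minimal sketch in Python of the kind of content-based scoring such a feed might rely on. The article catalogue, topic tags and function names are all invented for illustration; no platform publishes its actual ranking code.

    from collections import Counter

    # Toy catalogue: every article carries topic tags (all invented for illustration).
    ARTICLES = {
        "pro_incumbent_1":  {"politics", "pro_incumbent"},
        "pro_incumbent_2":  {"politics", "pro_incumbent"},
        "pro_opposition_1": {"politics", "pro_opposition"},
        "pro_opposition_2": {"politics", "pro_opposition"},
        "cricket_scores":   {"sport"},
    }

    def personalised_feed(click_history):
        # Count how often each topic appears in the user's past clicks, then rank
        # every article by how strongly its tags match those counts.
        interest = Counter(tag for article in click_history for tag in ARTICLES[article])
        return sorted(ARTICLES,
                      key=lambda article: sum(interest[t] for t in ARTICLES[article]),
                      reverse=True)

    # One click on a partisan story pushes similar stories to the top of the feed...
    print(personalised_feed(["pro_incumbent_1"]))
    # ...and every further click reinforces the ordering, while sport and opposing
    # views sink towards the bottom.
    print(personalised_feed(["pro_incumbent_1", "pro_incumbent_2"]))

The feedback loop is the important part: whatever rises to the top gets clicked more, which makes it rise further the next time the feed is built.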

The downside is that these algorithms effectively filter out results that do not support a user's current viewpoint. Eli Pariser demonstrated this in his TED talk, showing how people with different political views, searching for the same news at the same time, were shown different results by the same search engine.
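
That demonstration can be mimicked in a few lines. In the sketch below, the headlines and click profiles are invented and the re-ranking is a toy, not any real search engine's method, but it shows how the identical query can produce two different front pages once each user's recorded clicks are allowed to reorder the results.

    # Candidate results for one and the same query, each tagged with a slant (invented data).
    RESULTS = {
        "Government hails exit poll numbers":      "pro_incumbent",
        "Opposition disputes exit poll numbers":   "pro_opposition",
        "Explainer: how exit polls are conducted": "neutral",
    }

    # Click profiles the engine has built up for two users (assumed data).
    PROFILES = {
        "user_a": {"pro_incumbent": 5, "pro_opposition": 0, "neutral": 1},
        "user_b": {"pro_incumbent": 0, "pro_opposition": 4, "neutral": 2},
    }

    def personalised_search(user):
        # Re-rank the same candidates by how often this user has clicked each slant before.
        profile = PROFILES[user]
        return sorted(RESULTS, key=lambda headline: profile[RESULTS[headline]], reverse=True)

    # The same search, two very different result pages.
    print("user_a sees:", personalised_search("user_a"))
    print("user_b sees:", personalised_search("user_b"))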

I have noticed that people who are generally anti-incumbent have suddenly become more aggressive and dogmatic in their views. The echo chamber harms the public's awareness of important issues because people are not exposed to diverse expert opinions and only see and hear information that substantiates their existing beliefs. Much of the new media industry is built on the social "share", and the easiest way to win shares is to confirm and vindicate readers' views. Opinion is being conflated with fact, and the result is extreme polarisation in society. There is a boom in the number of sites that push biased and sensational news; they know you will like it and share it.

So where do people turn for their news, given that even news sites are all trying to personalise it in different ways? Political debates on TV have lost credibility, with anchors and hosts yelling over and cutting off opinions that disagree with theirs while offering no insight of their own. More than 500 million Indians, including the rural and poorer population, now have daily access to mobile phones and the Internet. Over 61 per cent rely on WhatsApp, YouTube, Facebook, Twitter, Quora or Google for their news on the go. It is obvious where the polarisation is coming from.

It is possible to come out of the echo chamber. Adjust your newsfeed preferences regularly and select a wider variety of content. Follow a different set of voices. And be warned that a plethora of fake news is flooding Facebook; check the veracity of stories on Factly (https://factly.in), Alt News (www.altnews.in), Boom (www.boomlive.in), Snopes.com, Politifact.com or FactChecker.in.
