
Misinformation on Social Media: How Can We Combat It? A New Report Explains

A report by The Future of India Foundation found that the current content moderation techniques are not enough.

Edited By: Kritika

Social media has democratised access to information and increased citizen participation, but mis/disinformation on these platforms has polarised users like never before, with consequences that are well known and documented.

While platforms have argued that they are working to make online spaces safer, cleaner, and free of hate and misinformation, a report by the Future of India Foundation (FIF) says that "the current system of content moderation is more a public relations exercise for platforms instead of being geared to stop the spread of misinformation."

The report, titled 'Politics of Disinformation', was launched in New Delhi on Thursday, 5 May.


Speaking at the launch, Future of India Foundation Co-Founder Saurabh said, "While conducting the research, I found that the people in rural parts of the country, who get paid to spread fake news, are not huge in number. If we run awareness campaigns for the youth, then the situation can change."

The report – based on discussions with youth, an analysis of how content moderation has evolved, and ways to mitigate online misinformation – suggests new approaches to improve the current system employed by social media platforms.

'Social Media Platforms Weaponised by Vested Interests'

Social media platforms have become the primary source of information for smartphone users; the only source that remains more dominant is TV news in regional languages.

According to the report, the key takeaway from the focus group discussions (FGDs) with the youth was that social media platforms have disrupted the country's information ecosystem and allowed themselves to be weaponised by vested interests.

Ruchi Gupta, Founder of the Future of India Foundation, said that platforms like Facebook cannot evade responsibility by calling themselves mere tech platforms and need to be held accountable.

Participants in the FGDs said they felt overwhelmed by the competing narratives on social media.

Most users are passive and show little inclination to fact-check the information they receive, irrespective of their digital literacy.

FIF found that the content moderation techniques used by social media platforms rest on two approaches: removing harmful content and fact-checking it.

However, these methods have substantial limitations, given the sheer volume of harmful content and misinformation published on these platforms.

In addition, human intervention and machine learning tools have not been effective in a country like India, owing to the number of languages, diverse cultural references, and the lack of adequate technological tools.

The report also highlights the "lack of urgency" platforms have shown in responding to the growing harm caused by misinformation and harmful narratives in India.

Referring to the storming of the US Capitol on 6 January 2021 and the subsequent deplatforming of former US President Donald Trump, the report says similar measures have not been seen in other countries, including India.

However, the report states that while removing content and deplatforming 'super-spreaders of misinformation' helped reduce misinformation to a certain extent, it did not address the larger narrative.


'Politics of Disinformation'

The report argues that social media platforms treat "free speech" as a business model rather than a moral imperative.

Social media users are constantly shown new content from accounts they do not follow, selected on the basis of engagement rather than quality. This amplifies misinformation, since such content attracts more hits.

The report states that platforms' amplification of such content has mainstreamed individuals and narratives that would otherwise have remained on the unknown fringe.

While the platforms initially aligned with pro-people movements, as during the Arab Spring, the report notes a shift: they now align with power, and organised IT cells of political parties have taken over the online conversation.

The report also found that content moderation depends on the dynamics of a country's political process. Social media platforms intervene, through fact-checks or content removal, typically when there is relative political consensus on the way forward.

The spread of disinformation can be checked by identifying and labelling bad actors rather than by moderating individual pieces of content, the report says.


Transparency Laws and Increasing Digital Literacy Are the Way Forward

The report lists possible steps that social media platforms and governments can take to tackle political misinformation. To begin with, it suggests that India enact a comprehensive transparency law to ensure parity.

Platforms need to rethink their approach to the distribution and amplification of content, since amplifying certain types of content helps set the narrative and affects the country's political process.

The report also suggests that platforms penalise the sources of misinformation rather than individual posts.

Scaling up digital literacy initiatives by social media platforms will also help in reducing misinformation. Some of the other suggestions mentioned in the report are as follows:

Snapshot
  • Bring a comprehensive transparency law for platforms

  • Constitute a regulator under parliamentary oversight

  • Platforms must choose an approach to distribution and amplification

  • Make amplification contingent on the credibility of creators and sources instead of token linkage with a negative list

  • Label content producers instead of individual posts

  • Review super users

  • Set minimum standards for integrity investments, infrastructure, and transparency at country level

  • Reconfigure default user feed to chronological feed

  • Remove design choices which incentivise extreme content

  • Develop an ecosystem approach to fact-checking (Ease, Capacity, and Distribution)

  • Scale digital literacy programs


(Not convinced of a post or information you came across online and want it verified? Send us the details on WhatsApp at 9643651818, or e-mail it to us at webqoof@thequint.com and we'll fact-check it for you. You can also read all our fact-checked stories here.)

(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)
