Pressure Grows on Tech Giants to Solve the Disinformation Riddle

Disinformation in Asia (part 3)

Technology companies are also facing increased pressure to act on the issue of disinformation, sometimes more so than governments.

Facebook, which has 2.3 billion active monthly users worldwide, perhaps faces the most international pressure to mitigate disinformation on its platform. Mark Zuckerberg’s company has hired local workers in several Asian countries to review and flag misleading or dangerous content.

But Facebook has faced criticism from moderators whose mental health has deteriorated as a result of the often violent and sexual content they are required to review. It has also been criticised for occasionally over- or under-policing specific content in Asia.

Notably, in November 2018 the platform officially admitted that it had not done enough to counteract the spread of disinformation in Myanmar, where a prominent extremist group used Facebook to incite racial violence against the Rohingya, likely contributing to the deaths of at least 10,000 people.

This admission was a stark reminder of the power of social media in spreading dangerous messages.

Meanwhile, James Gomez, from the Asia Centre, says technology companies such as Facebook are facing significant pushback from some Asian governments over disinformation, sometimes to the point of over-censoring content. “[Their] ultimate goal is to legislate and intimidate technology companies to censor content at the source,” he says. “This is the challenge companies like Google, Facebook and WhatsApp presently face.”

In a key development affecting countries around the world, in January WhatsApp began restricting the forwarding of messages to five people at a time, over fears the platform was being used, either deliberately or inadvertently, to share misinformation.

Previously, individual users could forward messages to up to 20 users or groups at a time. The encrypted messaging service, owned by Facebook, has faced particular criticism for emboldening groups spreading disinformation, because its closed nature means content cannot be independently moderated or fact-checked.

The changes were introduced after a trial in India last year, following the spread of messages which led to killings and attempted lynchings, Reuters reported. But the restrictions will likely only serve to slow down, rather than stop, the dissemination of disinformation and misinformation on the platform.

Article by Rachel Blundy.
Editing by Mike Tatarski and Anrike Visser.
Illustrations by Imad Gebrayel.

Read part 1 and part 2 of this series on disinformation.
