Who knew? What started in 2004 as a prank website to compare and rate girls as “hot or not” would culminate in a global platform used to facilitate genocide in Myanmar, subvert elections in the United States, and fuel riots in India.
Facebook has rewritten the rules of communication and civic engagement within a mere 17 years. However, recent whistleblower testimony before the US Congress and internal company documents have exposed alarming details of how these new rules let the platform spiral into a cesspool of hatred, misinformation and anger globally.
While skeletons have steadily tumbled out of Facebook’s closet for a few years now, the ‘Facebook Papers’ and whistleblower Frances Haugen’s testimony shed light on the sordid features of those skeletons.
What Are the Facebook Papers?
The ‘Facebook Papers’ are a cache of disclosures made to the US Securities and Exchange Commission (SEC) by Haugen, a former Facebook product manager, and provided to US Congress in redacted form by her legal counsel. A consortium of 17 US news organisations has obtained the redacted versions received by Congress and published a variety of alarming findings from the papers.
The documents provide rare, vivid insight into the business practices of Facebook’s “unaccountable” Chief Executive Officer, Mark Zuckerberg, who, reports reveal, repeatedly prioritised profit and growth over user safety.
In India, this meant that political hate speech went unchecked, minorities were put in danger and rampant misinformation was allowed to flourish.
This piece breaks down the core revelations about Facebook’s mess and categorises the recent findings into four interconnected issues: algorithms, policies, resources and profit motives.
Algorithms: How They Impact Hate Speech Proliferation
In February 2019, a Facebook researcher created a new user account to test the social media site as a person living in Kerala. For the next three weeks, the account operated on a simple rule: follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site, The New York Times reported.
The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.
“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the Facebook researcher wrote.
This horrific revelation ties into the recent research and testimonies of various Facebook insiders, including Haugen.
They have explained how the social network’s recommendation algorithms end up leading users, including conservative test accounts, into extremist and hateful rabbit holes.
Facebook is among the world’s most valuable companies, having built its billions selling ads for third-party advertisers. Essentially, the advertisers are Facebook’s clients, paying Mark Zuckerberg billions for the data of over a billion people across Facebook and Instagram.
The longer we stay engaged on the platform, the more likely we are to encounter ads and engage with them. In the process, we’re providing the platform and its advertisers with valuable, monetisable data about our behaviour, preferences and habits.
Internal Policies: When Business Means More Than Ethics
According to the news organisations that have ploughed through the troves of internal documents, Zuckerberg intervened to prevent the mitigation of hate speech in developing countries like India. The Washington Post reports that while Zuckerberg, in his testimony to Congress, played down reports that the site amplified hate speech, he was aware that the problem was far broader than publicly declared and that the platform was polarising people. Internal documents seen by the Post claim that the social network removed less than five percent of the hate speech on its platform.
In India, between 14 August and 1 September 2020, a string of major allegations emerged against Facebook’s India operations and its top executives. The Wall Street Journal’s report of 14 August claims that even though Facebook employees responsible for policing the platform pushed to permanently ban the profile of T Raja Singh, a BJP MLA from Hyderabad, for promoting hate speech, Ankhi Das, then a top public policy executive at Facebook India, blocked the application of the hate speech rules to Singh.
According to the report, Das told staff members “that punishing violations by politicians from Modi’s party would damage the company’s business prospects in the country, Facebook’s biggest global market by number of users”.
Resources: Shortage of Staff Compounds the Problem
A large part of hate speech goes unchecked because there are not enough human moderators to review content flagged as objectionable in Indian languages, and Facebook’s AI is unable to catch nuances that vary from one language to another.
Reuters reported that Facebook has serially neglected a number of developing nations, allowing hate speech and extremism to flourish. That includes not hiring enough staffers who can speak the local language, appreciate the cultural context and otherwise effectively moderate.
Haugen told the British Parliament that Facebook devotes 87% of its spending on combating misinformation to English-language content, even though only 9% of its users are English speakers. In an internal document, called Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content” on Facebook.
Of India’s 22 officially recognised languages, Facebook said it has trained its AI systems on five. But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.
Profit: The All-Winning Principle
On 26 October, Facebook reported results for its third quarter: USD29 billion in revenue for the three months ended in September, up 35% from the same period a year earlier, and nearly USD9.2 billion in profit, up 17% from the year prior. The number of people using Facebook’s family of apps grew 12% year-over-year, to nearly 3.6 billion during the quarter.
And yet, Mark appears to be ill at ease. He started his earnings call on Monday by launching into a vehement defence against the leaks. "Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company," Zuckerberg said.
Future of Facebook
So, where is Facebook headed?
Despite the surge in profits, the CEO has cause for worry. While Facebook stands today as the world’s seventh largest company with a market cap of USD890 billion, the road ahead looks far from rosy. Two issues, in particular, can dent the company’s long-term profits – the one thing that Zuckerberg cares about above all else.
First, leaked documents reveal the company has struggled since 2012-13 to attract young users. Engagement is a vital metric for the company and teenagers are a crucial target segment for advertisers. People below 25 are spending less time on the platform, and fewer teenagers in developed countries are signing up. This is a major headache for Zuck & Co.
Haugen alleges that Facebook “has misrepresented core metrics to investors and advertisers,” and that duplicate accounts are leading to “extensive fraud” against advertisers.
Second, the barrage of negative headlines has affected its ability to recruit top talent. “Facebook is extremely thinly staffed…and this is because there are a lot of technologists that look at what Facebook has done, and their unwillingness to accept responsibility, and people just aren’t willing to work here,” Haugen said at a briefing. Facebook’s future growth depends on its ability to attract top talent.
As more findings emerge from the Facebook Papers, expect growing calls for legislation to regulate the company. The clamour for Zuckerberg to step aside will only grow louder. Those investing in this Frankenstein must speak up.
(Sushovan Sircar is an independent journalist who reports on technology and cyber policy developments. His reports explore stories at the intersection of internet and society, covering issues of privacy, surveillance, cybersecurity, India’s data regime, social media and emerging technologies. He tweets @Maha_Shoonya. This is an opinion piece, and the views expressed are the author’s own. The Quint neither endorses nor is responsible for them.)