Facebook Whistleblower Sophie Zhang On Why Democracy Is In Danger And What We Can Do – India Times
In September 2020, former Facebook employee Sophie Zhang laid bare the far-ranging threats posed by the company across its pool of platforms, with Facebook at the centre. While user privacy and the misuse of user data have become the familiar markers of a technology giant gone rogue, the focus has now shifted to Facebook’s larger threat to democracy and free speech.
Zhang witnessed the manipulation of public discourse first-hand, which compelled her to come forward with startling revelations about how “bot accounts” are used on the platform by governments and third-party actors to sway the political mood of the public in different regimes, including India.
Comparing the bot accounts to what Indians commonly understand as often government-backed “IT cells”, Zhang shed light on how these accounts are used by various agencies on Facebook to shape the country’s political mood. In a memo widely circulated among Facebook employees after she was fired, Zhang alleged that Facebook’s self-interest and greed for profit prevented the company from making proactive decisions about protecting democracy, a claim corroborated by another Facebook whistleblower, Frances Haugen.
Zhang recently testified before British lawmakers, explaining how a loophole allowed “fake engagement” to continue unabated on Facebook in the form of likes, comments, and shares. Essentially, while individual users on Facebook may be required to verify their identity, there are no mechanisms in place to verify brand and business accounts.
Such networks of automated accounts, when set up by political actors in a country, are known as “coordinated inauthentic behaviour” or CIB, the same mechanism allegedly used by Russian actors to sway public mood in the 2016 US elections that ended in the victory of Donald Trump.
Facebook’s three biggest markets are India (340 million users), the United States (200 million), and Indonesia (140 million), out of a global base of 2.89 billion monthly users. Whistleblower Frances Haugen’s testimony to US senators in October revealed that 87 per cent of Facebook’s anti-misinformation resources are allocated to the US, leaving just 13 per cent for the rest of the world.
Also read: Hateful Ads On Facebook Are Cheaper Than Other Ads, Says Facebook Whistleblower
According to Zhang, this reflects the power Facebook holds in the world’s biggest democracies and how little it does to preserve them. “In countries like the US, the UK and India, the power that Facebook holds consists of a threat to existing public institutions and democratic safeguards,” Zhang told Indiatimes.
Even so, Facebook’s harms may be confined largely to democracies, Zhang explained. “In countries where democratic safeguards don’t exist and freedom of speech doesn’t exist, Facebook can be better than nothing… In those countries, having Facebook is better than not having Facebook at all.”
For starters, Zhang believes Facebook is just like any other company that wants to maximise its profits at any cost, even at the expense of the well-being of its users and the health of democracies. While Zuckerberg attempted to absolve himself of blame by suggesting that “polarisation” in society predates not only Facebook but Zuckerberg himself, Zhang told Indiatimes that this is how Facebook continues to absolve itself of any responsibility.
So, does Facebook even care? Yes and no, Zhang believes, while adding that “Facebook cares about its users in the sense that its users are a source of profit. And it does not want a source of profit to go away.”
For Zhang, the question of responsibility and culpability is more about how individual Facebook employees feel about the dangerous content shared on its platforms, and how ill-equipped they are to deal with the new and emerging threats that continue to appear on the platform.
Also read: New Whistleblower Alleges Facebook’s Poor Response To Hate Speech & Misinformation
“Facebook did care quite a bit. For instance, when in, I believe 2019, 2020 rumours in India grew viral on WhatsApp [that led to] mass lynchings and murders of innocent people.”
After misleading messages circulated through Facebook-owned WhatsApp triggered a series of mob lynchings in India, the company introduced a series of features, including a cap on the number of chats a message may be forwarded to and a warning label for “suspicious links.”
This response on Facebook’s part, according to Sophie Zhang, came about because the cause of the lynchings “was very direct and emotionally persuasive to people that other concerns to democracy were not.”
While this may be true, the larger, looming threat of political manipulation remains unresolved.
As discussed in Part 1 of our interview with Sophie Zhang, most of her concerns were disregarded by Facebook when she first made disturbing discoveries about political manipulation.
Zhang’s concerns were buried for a variety of reasons. On her end, it was her inability to express herself clearly, not only to Facebook but to the world. On Facebook’s end, it was simply that the sensitive information Zhang had uncovered about mass manipulation in authoritarian regimes like Russia and Belarus was “useless to the company.”
The discoveries were not part of Zhang’s full-time duties at Facebook; instead, she dedicated her spare time to the project, which contributed to her eventual dismissal from the role.
Also read: New Whistleblower Alleges Facebook’s Criminality, Says She Has ‘Blood On Her Hands’
Zhang explained to us how “bot accounts” undertake manipulation campaigns on Facebook through “self-compromised accounts”, the most common method used in India. These “self-compromised accounts” allow bad actors to carry out nefarious activity through accounts that belong to real people, with access achieved through manipulation on Facebook itself. “I can use your account to do something nefarious even while you keep access to it yourself,” Zhang told us.
Why is there still no concrete action on Facebook’s part? Because, Zhang asserted, Facebook usually wakes from its deep slumber of inactivity only after a problem has become uncomfortably deep-seated.
In this case, the problem is a lot bigger than just fake accounts. It is the very threat to democracy and social harmony that culminated in a violent insurrection attempt on the US Capitol in January this year, and the same forces that birthed a series of lynchings in India based on forwarded messages. Zhang believes Facebook will not take responsibility until people and governments unite to get their message across.
A month ago, another Facebook whistleblower, Frances Haugen, spoke in front of US lawmakers, urging them to do something about the Facebook problem before it’s too late. “It’s, frankly, quite a surprise to me that Frances Haugen was able to get away with looking through documents as much as she did, because she had no business reason to be looking through them.”
According to Zhang, Facebook has people who look for potential whistleblowers; they could have easily found Haugen but did not.
Even then, something built into Facebook’s very nature let Haugen get away with a treasure trove of internal information about the company’s misdemeanours. “But thinking about it, it’s not as surprising, because one of my main criticisms of Facebook is that they’re fundamentally reactive.” Zhang believes Facebook is not a company that would “act to get in front of issues.”
“They wait for things to happen and become problems and then fix them. And so it appears to me that they took the exact same approach with Frances Haugen.”
Also read: Whistleblower Alleges Facebook’s Dismal Response To Hate Speech In India
While Haugen’s revelations about Facebook focused on “misinformation, hate speech, polarisation and inflammatory content that are serious issues for democracy”, Zhang worked on “inauthenticity and fake accounts.”
“This may sound very similar to misinformation, but there’s actually no relationship at all.” While misinformation, Zhang said, is a “function of what is being said”, her work focused on “who is saying it”.
“Frances was way more prepared and ready for this than I was,” Zhang said, explaining why her concerns weren’t taken as seriously as Haugen’s when she first came out with her startling claims. “I am not as polished and poised as Frances,” Zhang said.
While Facebook may enable authoritarian tendencies in countries that are democracies, it also helps create a communication bubble for people living in authoritarian regimes that pre-date Facebook’s creation, according to Zhang.
Even then, Facebook’s biggest contemporary victim remains “democracy.” “The fundamental biggest threat posed by Facebook is that it has no interest in protecting democracy.”
Referring to the US Capitol attack that left five people dead, Zhang explained her fears for a future with Facebook as a leading voice and shaper of public discourse. She believes that if a similar event were to happen again, Facebook could apologise while taking no responsibility. “Facebook will be like – I’m sorry, or claim that this isn’t their responsibility,” with the classic promise of “stopping this from happening again.”
Also read: Facebook Whistleblower: FB Knows You’re Addicted But Won’t Do Anything About It
While Facebook’s products inadvertently compromise democracy, its targeted products also harm children. Facebook had planned to unveil an under-13 kids’ version of Instagram, which it delayed indefinitely in September 2021 after an internal report highlighted the issues Instagram caused among kids.
Among the most serious findings were a high rate of self-image issues and the attendant risk of self-harm. Instagram’s effects on young girls, the internal leaks showed, were especially damaging. The biggest takeaway from the leak was that Facebook was well aware of these problems, yet continued to pursue the development of an age-specific app for kids.
“I believe that Facebook is no longer used by younger people as much. And the habits formed by people in childhood tend to, to stay with them for a lifetime.” For this very same reason, Zhang believes Facebook is attempting to widen its user base among kids so that its monopoly is effectively maintained.
For starters, Zhang believes Facebook needs to address the harm embedded in its pool of products. Just as cigarette companies carry mandatory labels warning users they may develop cancer, Facebook needs a similar warning screen, Zhang suggested.
“That is a very basic example of what would probably be a very controversial change to Facebook,” Zhang said, adding that Facebook should have a warning on its home page: “This product will ruin your [mental health], this product is addictive,” she said.
In addition, the company’s role in civil society may need to be restructured. Zhang feels that Facebook is expected to fix problems even as it inserts itself into the general functioning of a country. The people at Facebook in charge of fixing problems are “the same people charged with keeping good relationships with local politicians, governmental members, and helping Facebook grow. This creates a natural conflict of interest.”
Also read: Friend’s Death From Misinformation Forced Whistleblower To Reveal FB’s Dark Side
Comparing Facebook to dictatorships, Zhang further claimed that for democracy to survive, the same set of rules need to be in place for all people. “Democracy cannot survive if there is one set of rules for the rich and powerful and one set for everyone else. But that is essentially the system at play in Facebook right now.”
To end Facebook’s monopoly, Zhang believes Indians need to take charge of the situation. “Ultimately, I think it’s up to individual Indians to demand action by the government.” She also urged Indians to “press Facebook” because “Facebook will not solve its problems without people forcing it to.”
Smartphones are especially cheap in India, with the lowest-end devices costing about ₹3,000 ($40). For Zhang, this widespread access amplifies the dangers embedded in forwarded messages and viral content on platforms like Facebook-owned WhatsApp. Zhang made it extremely clear to us that she was never involved in any projects on WhatsApp, but she still believes the company needs to do more to curb misinformation.
Facebook claims WhatsApp’s end-to-end encryption is a solid tool to prevent misuse of the platform. But like many others, Zhang isn’t convinced. “Perhaps a single user can control hundreds of WhatsApp accounts and use them to [spread misinformation and hate speech].”
Sophie Zhang believes governments threatening people for “forwarding misleading WhatsApp messages” isn’t the way forward. Instead of putting the onus on individual messages, which “are difficult to judge and breaches freedom of speech”, “adding friction to the messaging process” may help. These friction measures include requiring WhatsApp to crack down “on the use of fake WhatsApp accounts” and limiting the number of mass messages that can be sent out, essentially making it more difficult to forward a message.
“It is true that in the past disasters have happened because of WhatsApp messages, like for instance, the lynching of innocent people in India in the past few years.”
Even though Mark Zuckerberg is already attempting to leave behind the controversies associated with Facebook, Zhang believes Zuckerberg needs to realise the responsibility of “having his product used by billions worldwide” especially given its contribution to the “degradation of democracy and the destruction of civic discourse.”
The problem with social media, according to whistleblower Sophie Zhang, is in how the platforms were built. “I would say that this is the road that social media has built. A road in which outrage is rewarded more than kindness and calmness… That’s when sensationalism is rewarded more than nuance,” she told Indiatimes.
A slew of internal documents called “Facebook Papers” revealed by whistleblower Frances Haugen alleged that the company’s top bosses are well-aware of the problem, but still won’t do anything to fix it due to a variety of reasons, including “political considerations.”
Also read: Facebook Targeted 6-Year-Old Kids To Widen User Base, Revealed Internal Blog
“A broken system rots from the top. When people are trapped in a broken system, that affects all of them [as well as] their actions and their motives,” Zhang told us. Due to this inaction on Facebook’s part, “People [on Facebook] have learned to use sensationalism, to use anger, to use hate, and spectacle” with “virality” at the centre of all discourse.
Social media platforms like Facebook, Twitter and Instagram have accentuated the “viral” culture, wherein whatever gets the most hits and “likes” shapes the pop culture discourse at that juncture in time. To Zhang, this is still relatively new, even though we’ve come to believe that it is “typical”.
Even then, censorship may not be the answer to Facebook’s excesses. In fact, Zhang proposed a simpler solution. For starters, Facebook should revert to displaying a chronological newsfeed instead of curated feeds that claim to cater to each user. While Facebook claims user-specific feeds improve the experience, the algorithm sometimes leads users to content Facebook believes they would like, without regard for what that content is. For instance, a Facebook experiment showed how the platform directed users who liked “Fox News” to dangerous “QAnon” conspiracy groups in the United States.
In addition, Zhang wants Facebook to “stop the ability to re-share content.” This doesn’t have to be arbitrary. In fact, Zhang proposed a cap on the number of times a post may be re-shared before a user is required to visit the original post on its author’s profile before sharing it. “For instance, preventing people from sharing content, if it has already been shared more than three times.”
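Zhang’s friction proposal can be illustrated with a small sketch. To be clear, this is not Facebook’s actual code or data model: the `Post` structure, the function names, and the depth cap of three (taken from her quoted example) are all illustrative assumptions.

```python
# Illustrative sketch of Zhang's proposed reshare cap.
# The data model, names, and threshold are assumptions, not Facebook's implementation.

from dataclasses import dataclass
from typing import Optional

MAX_RESHARE_DEPTH = 3  # hypothetical cap, from Zhang's "more than three times" example


@dataclass
class Post:
    author: str
    original: Optional["Post"] = None  # the post this one reshared, if any

    def share_depth(self) -> int:
        """Count the reshare hops between this post and the original."""
        depth, node = 0, self
        while node.original is not None:
            depth += 1
            node = node.original
        return depth


def can_reshare_directly(post: Post) -> bool:
    """One-click resharing is allowed only below the depth cap.

    Beyond the cap, the user would be required to visit the
    original post and share it from there instead.
    """
    return post.share_depth() < MAX_RESHARE_DEPTH


# Example: a chain of reshares
original = Post(author="alice")
hop1 = Post(author="bob", original=original)
hop2 = Post(author="carol", original=hop1)
hop3 = Post(author="dave", original=hop2)

print(can_reshare_directly(original))  # True  (depth 0, well under the cap)
print(can_reshare_directly(hop3))      # False (depth 3, cap reached)
```

The point of such a rule is not to block sharing outright but to add a deliberate extra step once a chain grows long, slowing the kind of frictionless virality Zhang describes.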
A balancing act between the preservation of freedom of speech and Facebook’s excesses is key to preventing a crisis in democracies, Zhang is convinced. Circling back to what both Haugen and Zhang initially started their arguments with, the ex-Facebook employee told us – “Ultimately, Facebook is not willing to take some of these steps because it would hurt its profit.”
“India is the world’s largest democracy; I very much hope it stays the world’s largest democracy. It was founded on the principle of harmony and equality, on the ideas of Mahatma Gandhi, and India should not forget those roots,” Zhang said towards the end of our chat, in a message to Indians who use Facebook’s products.
Do you think Facebook should be given a free pass or regulated more fiercely in this era of misinformation and disinformation? If so, are governments that undertake manipulation campaigns to be trusted with regulatory methods? Only time will tell.
In the meantime, keep reading Indiatimes.com for the latest in technology and science. If you missed Part 1 of the Sophie Chronicles, you may read it here. Don’t forget to share your thoughts with us in the comments below.