Thousands of pages of internal documents provided to Congress by a former employee paint a picture of a company in internal conflict, where the data on the damage it causes is abundant, but the solutions, much less the will to act on them, are in short supply.
The crisis revealed by the documents shows how Facebook, despite its regularly avowed good intentions, seems to have slowed down or sidelined efforts to remedy the real damage that the social network has amplified and sometimes created. The documents reveal many instances where researchers and rank-and-file workers uncovered deep-rooted issues that the company then overlooked or ignored.
Ultimate responsibility for this lies with CEO Mark Zuckerberg, who wields what a former employee described as dictatorial power over a company that collects data from, and provides free services to, some 3 billion people around the world.
“Ultimately, that’s up to Mark and whatever his prerogative is — and it’s always been about growing, increasing his power and his reach,” said Jennifer Grygiel, a communications professor at Syracuse University who has been following Facebook closely for years.
Zuckerberg has an iron grip on Facebook Inc. He owns the majority of the company’s voting shares, controls its board of directors, and increasingly surrounds himself with executives who don’t seem to question his vision.
But he has so far been unable to cope with sluggish user growth and declining engagement for the Facebook product in key regions such as the US and Europe. Worse, the company is losing the attention of its most important demographic, teenagers and young adults, with no clear path to reclaiming it, its own documents reveal.
Young adults interact with Facebook much less than their older cohorts, seeing it as an “outdated network” with “irrelevant content” that offers them limited value, according to a November 2020 internal document. It’s “boring, misleading and negative,” they say.
In other words, young people see Facebook as a place for old people.
Facebook’s user base is aging faster, on average, than the general population, the company’s researchers found. Unless Facebook finds a way to reverse the trend, its population will continue to age and young people will find even fewer reasons to sign up, threatening the monthly user numbers that are essential to selling ads. Facebook says its products are still widely used by teenagers, although it acknowledges there is “fierce competition” from TikTok, Snapchat and others.
In order to continue expanding its reach and power, Facebook has pushed for strong user growth outside of the United States and Western Europe. But as it expanded into less familiar parts of the world, the company consistently failed to address, or even anticipate, the unintended consequences of signing up millions of new users without also providing the personnel and systems to identify and limit the spread of hate speech, misinformation and calls to violence.
In Afghanistan and Myanmar, for example, extremist language has flourished due to a systemic lack of language support for content moderation, whether human or artificial. In Myanmar, it has been linked to atrocities committed against the country’s minority Muslim Rohingya population.
But Facebook seems unable to recognize, let alone prevent, the real collateral damage that accompanies its unchecked growth. That damage includes opaque algorithms that radicalize users, pervasive misinformation and extremism, facilitation of human trafficking, teen suicide and more.
Internal efforts to mitigate these problems have often been sidelined or abandoned when the solutions conflict with growth – and, by extension, profit.
Backed into a corner with hard evidence from leaked documents, the company doubled down on defending its choices rather than trying to solve its problems.
“We don’t and we haven’t prioritized engagement over safety,” Monika Bickert, Facebook’s head of global policy management, told The Associated Press this month, following congressional testimony by whistleblower and former Facebook employee Frances Haugen. In the days after Haugen’s testimony and her appearance on “60 Minutes” – during which Zuckerberg posted a video of himself sailing with his wife, Priscilla Chan – Facebook tried to discredit Haugen by repeatedly pointing out that she had not worked directly on many of the issues she exposed.
“A curated selection from millions of documents on Facebook can in no way be used to draw fair conclusions about us,” Facebook tweeted from its public-relations “newsroom” account earlier this month, after the company learned that a group of news organizations was working on stories about the internal documents.
“At the heart of these stories is a premise that is false. Yes, we are a business and we make a profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own business interests lie,” Facebook said in a prepared statement Friday. “The truth is, we’ve invested $13 billion and have over 40,000 people doing one job: keeping people safe on Facebook.”
Statements like these are the latest sign that Facebook has entered into what Sophie Zhang, a former Facebook data scientist, described as a “siege mentality” within the company. Last year, Zhang accused the social network of ignoring fake accounts used to undermine foreign elections. With more whistleblowers – notably Haugen – coming forward, the situation has only gotten worse.
“Facebook has gone through a bit of an authoritarian narrative spiral, where it becomes less receptive to employee criticism and internal dissent, and in some cases suppresses it,” said Zhang, who was fired from Facebook in the fall of 2020. “And that leads to even more internal dissent.”
“I have seen many co-workers extremely frustrated and angry, while feeling helpless and (disheartened) about the current situation,” one employee, whose name has been redacted, wrote on an internal message board after Facebook’s decision last year to leave up inflammatory posts by former President Donald Trump suggesting that Minneapolis protesters could be shot. “My view is that if you want to fix Facebook, do it from the inside.”
This story is based in part on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.
They detail painstakingly collected data on issues as wide-ranging as the trafficking of domestic workers in the Middle East; an over-correcting crackdown on Arabic content that critics say stifles free speech even as hate speech and abuse increase; and endemic anti-vaccine misinformation, which researchers found could easily have been mitigated with subtle changes to how users see posts in their feeds.
The company insists that it “does not conduct research and then systematically and deliberately ignore it if the findings are inconvenient for the company.” That claim, Facebook said in a statement, can only “be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer.”
Haugen, who testified before the Senate this month that Facebook’s products “harm children, stoke division and weaken our democracy,” said the company would have to declare “moral bankruptcy” if it wants to dig itself out of all this.
At this point, that seems unlikely. There’s a deep-seated conflict between profit and people within Facebook, and the company doesn’t seem ready to abandon its narrative that it’s good for the world, even as it regularly makes decisions intended to maximize growth.
“Facebook has conducted regular employee surveys asking what percentage of employees believe that Facebook is making the world a better place,” Zhang recalled.
“It was about 70% when I came in. It was about 50% when I left,” said Zhang, who had been with the company for more than two years before being fired in the fall of 2020.
Facebook did not say what the number is today.