* Any views expressed in this opinion piece are those of the author and not of Thomson Reuters Foundation.
Governments around the world must adopt data protection laws that actually limit Facebook’s business model
By Carolyn Tackett, deputy advocacy director and global campaigns lead at Access Now
Early reports on the Facebook Papers have laid bare what civil society around the world has warned of for years now — Facebook is much more interested in rapidly expanding its user base and leveraging people’s data to maximize profits than in ensuring everyone can use its platforms safely.
For more than a decade, grassroots activists, researchers, digital security help desks, and others serving at-risk communities have diligently documented the harms of Facebook's failed policies and presented the company with mounting evidence of their impact, both on people's lives and on our societies.
In India, Palestine, Ethiopia, Myanmar, Syria, Australia, Tunisia, and so many other places around the world, Facebook’s “move fast and break things” model has had dire consequences for human rights defenders and journalists, along with women, LGBTQ+ communities, people facing discrimination based on race, religion, or caste, and other marginalized groups.
In many cases, Facebook still relies on resource-strapped civil society actors, who are often struggling themselves to navigate tense and complex landscapes, rather than investing in its own capacity to meet its users' needs.
Worse yet, Facebook has consistently ignored civil society's warnings that its platforms were ripe for abuse and ill-prepared to respond, taking action only after harm has already occurred, if it takes action at all.
In response, Facebook points out that it has spent around $13 billion on "safety and security" since 2016, building AI tools to identify content that violates its community standards. To put that into perspective, the company spent nearly the same amount on marketing in 2020 alone, and nearly three times as much ($37.8 billion) from 2016 to 2020.
And even where the company is putting in limited resources, the AI systems Facebook is betting on to solve its content moderation problems have proven ineffective, in large part because they cannot adequately navigate the complexities of different languages and contexts. That only gets worse when Facebook spends a mere 16% of its content moderation budget outside the United States, a country that accounts for only about 10% of its global user base.
Facebook has pledged over the years to fulfill its responsibility to protect human rights, including in its long-awaited human rights policy released earlier this year. But it has also consistently fallen short of these commitments, and continues to sidestep demands for more transparency.
The company has even withheld essential information from investigations by the Oversight Board, the very body Facebook created to hold itself accountable.
It is clear that Facebook, left to its own devices, is not willing or able to do what is necessary to prevent even the most extreme forms of violence and abuse from spreading across its platform, or to protect the most vulnerable from being silenced.
Even now, as the company is confronting some of the harshest and most undeniable criticism in its troubled history, it is attempting to douse a PR fire with a costly rebrand instead of doubling down on investments in safety, security, and integrity.
Facebook’s leadership needs a complete paradigm shift, where every decision is made from the perspective of those most susceptible to harm. Meeting the company’s responsibility to uphold human rights requires putting the most vulnerable people first — not somewhere on the list after the bottom line, political elites, influential celebrities, and others.
That means serious consultation with affected communities and civil society — not cursory meetings where input is readily ignored. That means enabling responsible access to data for independent stakeholders with relevant expertise.
And that means completely reimagining the company’s exploitative and opaque business model, along with an infusion of human rights expertise among Facebook’s executive leadership.
Some Facebook shareholders have tried, to no avail, to change the calculus on these matters and challenge the outsize influence Mark Zuckerberg has over the company. We need more shareholders to stand up and reinforce that increasing profit margins cannot come at the cost of our democracies and our lives.
Perhaps most urgently, governments around the world need to adopt data protection laws that actually limit Facebook’s business model. In particular, U.S. policymakers are uniquely positioned to influence how Facebook and other Big Tech companies behave around the world.
They have failed for far too long to adopt a strong, forward-thinking data protection framework, and that regulatory vacuum has, at least in part, enabled Facebook's rapid but highly inequitable growth.
Access Now and our civil society partners around the world will continue to keep a close eye on the emerging Facebook Papers stories, and encourage the stewards of those resources to give digital rights experts an opportunity to analyze them as well.
Together we can achieve meaningful change and envision a freer, more empowering digital space for all.