If Facebook is the problem, is a social media regulator the fix?

by Reuters
Wednesday, 6 October 2021 19:17 GMT

Former Facebook employee and whistleblower Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled 'Protecting Kids Online: Testimony from a Facebook Whistleblower' on Capitol Hill, in Washington, U.S., October 5, 2021. Jabin Botsford/Pool via REUTERS


Days after the Facebook outage, the company is under scrutiny over a lack of regulation and the impact of its platforms on public safety and the health of young people

By Elizabeth Culliford

WASHINGTON, Oct 6 (Reuters) - Facebook whistleblower Frances Haugen told Congress on Tuesday that one option for making social media less harmful would be to create a dedicated regulatory agency to oversee companies like Facebook, which could have former tech workers on staff.

"Right now, the only people in the world who are trained to ... understand what's happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company," she said during a hearing before a Senate Commerce Committee panel.

How social media companies should be regulated has been a topic of intense debate among lawmakers, regulators and experts. Facebook, the world's largest social network, has been blasted over a lack of transparency into how its platforms work, its handling of user data and the impact of its sites on users.

Haugen, a former product manager at the company who leaked internal documents to the Wall Street Journal, said the profit motive was strong enough that Facebook, which owns Instagram, would not change without pressure. "Until incentives change at Facebook, we should not expect Facebook to change. We need action from Congress," she said.

Haugen also said that if she were made CEO of Facebook, she would immediately establish a policy that would allow it to share internal research with Congress and other oversight bodies, calling for transparency and public scrutiny of Facebook's systems and decisions.

Nathaniel Persily, a Stanford Law School professor who resigned last year from an effort to get Facebook to share more data with researchers, has argued for legislation to compel social media companies to share their data with external researchers.

"The platforms thrive in secrecy and if you subject them to outside review, it will change their behavior," he said.

Persily said that as action is needed within the next year, he favored the Federal Trade Commission managing the process. "You go to war with the army you have, not the army you may want," he said, though he said a new cabinet department could later be created.

Former Facebook executive Brian Boland, who was in charge of the company's partnership data before resigning this year, said improving transparency was "step one in any kind of regulatory regime."

Tom Wheeler, who was chairman of the Federal Communications Commission, said he envisaged a new, separate agency with the bandwidth and specialism to establish and enforce standards for Big Tech, including on privacy.

On Tuesday, Facebook spokeswoman Lina Pietsch said the company had itself long asked for government oversight. "We have been calling for updated regulations ourselves for two and a half years," she said.

Facebook has previously called for regulation of the internet, including a digital regulator.

Some commentators raised caveats around the idea: Kyle Taylor, program director for a group of critics called the Real Facebook Oversight Board, said a regulator was essential but cautioned against creating a "revolving door" of ex-employees from social media companies joining the body.

Kate Klonick, an assistant professor at St. John's University Law School who studies social media governance, tweeted that such an agency should not be in charge of misinformation as an issue.

REGULATION AND REFORM

During Tuesday's hearing, Haugen also encouraged lawmakers to reform Section 230, the part of the U.S. Communications Decency Act that shields online platforms from liability over content posted by their users. She urged that the law be changed to hold companies accountable for their algorithms, which often decide what social media users see when they sign in.

"They (companies) have 100% control over their algorithms and Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety. They shouldn't get a free pass on that because they're paying for their profits right now with our safety," she said.

Facebook has said it favors reforming Section 230 so that companies receive immunity from liability only if they follow best practices.

At the hearing, lawmakers did not push back on Haugen's suggestions for various reforms but, in many cases, pointed to legislation along similar lines.

A bipartisan group of senators, including Richard Blumenthal and Marsha Blackburn, introduced a bill in June that would require big internet platforms to allow users to view content that has not been selected by an algorithm.

Haugen also encouraged raising the age limit for users of Facebook's platforms from 13 to 16 or 18, citing addiction issues on the sites and children's difficulty with self-regulation.

Under current law, children 12 and under have more protection online than teenagers. There is a bill before Congress to raise the age to 15, among other changes.

Facebook announced in late September, shortly after a report based largely on documents from Haugen said Instagram was harmful to teenagers, that it was pausing its work on a version of Instagram aimed at younger users.

(Reporting by Elizabeth Culliford and Diane Bartz; Editing by Lincoln Feast and Kenneth Li)

