The scene was familiar Wednesday: a congressional hearing in which Facebook parent Meta Platforms Inc. and other tech companies were held responsible for what happens on their digital platforms.
And a familiar face was there to deliver the message. Facebook
whistleblower Frances Haugen testified to the wanton negligence of the social-networking giant and its toxic influence on kids and teens.
There was one difference this time around, however. Rather than theatric finger-wagging, members of the House Committee on Energy & Commerce are fine-tuning legislation that could reshape Section 230 of the Communications Decency Act, which shields social-media companies from third-party content that appears on their digital platforms.
The marathon hearing (three hours and counting), ominously titled “Holding Big Tech Accountable: Targeted Reforms to Tech’s Legal Immunity,” comes at a time of mounting reckoning for tech’s biggest players amid a maelstrom of antitrust legislation, lawsuits and regulatory fines.
“There is a bipartisan desire” to make companies like Facebook, Twitter Inc.
and Google legally liable for algorithms that amplify content leading to offline violence, said Rep. Mike Doyle (D., Pa.), chairman of the Subcommittee on Communications and Technology. “These platforms don’t want to be held accountable.”
Filled with horrifying details of drug addiction, death, mental illness and sexual abuse among social-media victims, the hearing was the latest attempt to push through legislative efforts to “target reforms” for tech’s controversial legal liability shield.
Haugen, who in October testified before the Senate Commerce Committee shortly after she leaked a cache of internal Facebook documents to the Wall Street Journal, covered familiar territory. “Facebook has hidden from you countless ways to make things safer” in pursuit of “profits over people,” she said Wednesday.
Meta said it is open to updated internet rules from Congress, a stance it has espoused for three years. “We are a platform for free expression and every day have to make difficult decisions on the balance between giving people voice and limiting harmful content,” a Meta spokesperson told MarketWatch in an email. “It is no surprise Republicans and Democrats often disagree with our decisions – but they also disagree with each other.”
The social-media company has repeatedly claimed it has taken extreme measures in security spending and hiring (it spent $5 billion this year) to tamp down on misinformation and hate speech coursing through its sprawling platform. Meta Chief Executive Mark Zuckerberg has testified several times on Capitol Hill over the past few years in an attempt to reassure lawmakers, and Instagram head Adam Mosseri is scheduled to testify for the first time before a Senate subcommittee on Dec. 8.
But those efforts have done little to counteract a backlash over internal research leaked by Haugen, which found, for example, that Instagram made body-image issues worse for one in three teen girls.
“We are at a watershed moment. We have literally been over a decade without major reforms for these companies, and we’ve assumed that in some cases they would self-police or self-regulate. Well, that’s not true and the record is clear,” Jim Steyer, CEO of Common Sense Media, told the House panel. “I would argue in the next three to six months the most important legislation, including some of the legislation that this subcommittee is considering today will move forward and will finally put the guardrails on that America’s children and families deserve.”
While the Biden administration reportedly eyes data privacy as a civil-rights issue, European regulators are sharpening their assault. On Tuesday, U.K. regulators ordered Meta to sell animated-GIF company Giphy on the grounds that the acquisition harms users and advertisers.
“We have decided that the only effective way to address the competition issues that we have identified is for Facebook to sell Giphy, in its entirety, to a suitable buyer,” the Competition & Markets Authority said.
“The whistleblower’s reports and testimony did a lot of damage to Facebook,” Ashley Baker, director of public policy for The Committee for Justice, told MarketWatch. “Her testimony identified legitimate concerns that particularly resonate with parents on a personal level. If the impact of the whistleblower is that fewer teenagers are allowed by their parents to use Facebook or Instagram, that certainly impacts the company by taking away part of their user base and is bad for the long-term popularity of the platform.”