Mark Zuckerberg, Linda Yaccarino, Evan Spiegel and other leading social media executives face a grilling on Wednesday from lawmakers concerned about child exploitation and safety on their services.
The three CEOs—who lead Meta, X (formerly known as Twitter) and Snap, respectively—are scheduled to appear in Washington alongside TikTok CEO Shou Zi Chew and Discord CEO Jason Citron, as witnesses at a Senate Judiciary Committee hearing about safeguarding children on their respective platforms.
Lawmakers from both sides of the aisle have blasted the companies for failing to properly address what some have called a “plague of online child sexual exploitation” on social media apps. The goal of the hearing is to inform legislation that members of Congress believe is needed to compel the firms to do more to protect children on their platforms.
In November, committee chairman Sen. Dick Durbin, D-Ill., and ranking member Sen. Lindsey Graham, R-S.C., said they had issued subpoenas to Yaccarino, Spiegel and Citron to testify at the hearing.
Wednesday’s hearing is focused specifically on issues pertaining to child exploitation and the prevalence of child sexual abuse material on social media. But the overarching theme is that these under-regulated tech companies have designed and built platforms that are addictive and that damage the mental well-being of children and young adults.
In recent months, Meta has been hit with a number of lawsuits related to the well-being of children on its apps, including Facebook and Instagram. New Mexico’s attorney general filed a civil suit against Meta in December, alleging that the company’s apps enable sexual predators to exploit children and distribute CSAM, and that the company failed to address the problem because its leadership was more focused on growth.
That lawsuit was filed shortly after a bipartisan group of over 40 attorneys general filed a joint federal lawsuit alleging that Meta knowingly designed addictive apps that are detrimental to children’s mental health and have contributed to problems like teenage eating disorders.
Meanwhile, Meta, Snap, TikTok and Alphabet (via its YouTube unit) face ongoing multidistrict litigation brought by a coalition of school districts and individuals who likewise allege the companies’ products are addictive and harmful to the mental well-being of children and young adults.
Among the bills lawmakers are proposing as possible solutions to child exploitation problems is the Stop CSAM Act, which would let victims of online child sexual exploitation sue “tech platforms and app stores that promoted or facilitated the exploitation, or that host or store CSAM or make it available,” according to the Senate Judiciary Committee.
Lawmakers have also been pushing the Kids Online Safety Act (KOSA), which would establish a so-called “duty of care” for tech firms, requiring them to provide more parental controls and undergo annual audits intended to assess the risks their platforms pose to children and young adults, among other obligations.
Still, privacy advocates such as the Electronic Frontier Foundation and the American Civil Liberties Union have voiced concerns over these proposed bills, warning that they could lead to the censorship of information about reproductive rights, sexual health and sexual orientation, and could compromise the privacy of minors through unnecessary surveillance.
The social media executives are expected to detail their efforts combating child exploitation on their platforms, which include working with law enforcement and tasks like proactively identifying potential predators.
Zuckerberg will describe Meta’s child safety efforts, emphasizing that the company has around “40,000 people overall working on safety and security” and that it has invested over $20 billion in those efforts since 2016.
Zuckerberg will also say that Meta invested $5 billion in 2023 alone on these safety efforts, according to his prepared testimony.
“We’re committed to protecting young people from abuse on our services, but this is an ongoing challenge,” Zuckerberg will say. “As we improve defenses in one area, criminals shift their tactics, and we have to come up with new responses. We’ll continue working with parents, experts, industry peers, and Congress to try to improve child safety, not just on our services, but across the internet as a whole.”
Spiegel will detail some of the initiatives Snap has undertaken to safeguard children on its messaging platform. Unlike the other executives at the hearing, Spiegel also pledged support for KOSA and the Cooper Davis Act, which would require communications firms to report instances of various drug-related offenses to the Drug Enforcement Administration.
“We support this legislation, not only in word, but in deed, and we have worked to ensure our service lives up to the legislative requirements before they are formal, legal obligations,” Spiegel said in his statement. “This includes limiting who can communicate with teens to friends and contacts only, offering in-app parental tools, proactively identifying and removing harmful content, and referring lethal drug content to law enforcement.”
So far, news of the Senate hearing and related child-safety lawsuits hasn’t caused much concern among investors in companies like Meta, likely because these firms are unlikely to be financially affected in the near term. Any proposed regulation of the social networking firms would take a while to come into effect, if it happens at all.
Meta CFO Susan Li said in July that there are, “broadly speaking, increasing legal and regulatory headwinds in the EU and the US that could significantly impact our business and our financial results.” But so far, those regulatory headwinds haven’t had much impact on the company’s sales, and investors have been mostly pleased with cost-cutting efforts that helped Meta’s shares reach a record high earlier in January.