Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying.
These are just some of the issues young people are grappling with on social media, but children’s advocates and lawmakers say companies aren’t doing enough to protect them.
This morning, the CEOs of Meta, TikTok, X, and other social media companies testified before the Senate Judiciary Committee as lawmakers and parents grow increasingly concerned about the impact of social media on young people’s lives.
The hearing began with recorded testimony from children and parents who said they or their children had been exploited on social media. Throughout the hours-long session, parents who had lost children to suicide silently held up photos of their deceased children.
Social media companies “are responsible for many of the dangers children face online,” Senate Majority Whip Dick Durbin, the committee’s chairman, said in his opening remarks. “Their design choices, their failure to properly invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our children and grandchildren at risk.”
During a heated Q&A, Republican Missouri Sen. Josh Hawley asked Meta CEO Mark Zuckerberg whether he had personally compensated victims and their families for what they went through.
Zuckerberg replied, “I don’t think so.”
“The families of the victims are here,” Hawley said. “Do you want to apologize to them?”
As the parents stood up and held up pictures of their children, Zuckerberg turned to them and apologized for what they had gone through.
Hawley continued to press Zuckerberg, asking whether he would take personal responsibility for the harm his company has caused. Zuckerberg stayed on message, reiterating that Meta’s job is to “build industry-leading tools” and empower parents.
“It’s about making money,” Hawley interjected.
South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary Committee, echoed Durbin’s sentiments and said he was ready to work with Democrats to resolve the issue.
“After years of working on this issue with you and others, I have come to the following conclusion: Social media companies, as currently designed and operated, are dangerous products,” Graham said.
He told the executives that their platforms had enriched lives, but said it was time to address the “dark side.”
Discord’s Jason Citron and other executives touted their platforms’ existing safety tools and the work they have done with nonprofits and law enforcement to protect minors.
Ahead of the hearing, Snap threw its weight behind a federal bill that would impose legal liability on apps and social platforms that promote content harmful to minors. Snap CEO Evan Spiegel today reiterated the company’s support and called on the rest of the industry to back the bill.
TikTok CEO Shou Zi Chew said TikTok is vigilant in enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, is not for children.
“We don’t have a dedicated children’s line of business,” Yaccarino said. She said the company also supports the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.
But child health advocates say social media companies repeatedly fail to protect minors.
“When faced with really important decisions about safety and privacy, the bottom line should not be the first factor these companies consider,” said Zaman Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had the opportunity to do this before and they have failed, so independent regulation needs to step in.”
Republican and Democratic senators showed unusual unity throughout the hearing.
Meta is facing lawsuits from dozens of states alleging that it intentionally designed features on Instagram and Facebook to addict children to its platforms and that it failed to protect children from online predators.
Meta has strengthened its child safety features in recent weeks, announcing earlier this month that it would start hiding inappropriate content, including posts about suicide, self-harm and eating disorders, from teenagers’ Instagram and Facebook accounts.
It also restricted minors from receiving messages from anyone they don’t follow or aren’t connected to on Instagram and Messenger, and added new “nudges” to discourage teens from browsing Instagram videos and messages late at night. The nudges prompt kids to close the app, but don’t force them to.
But child safety advocates say the companies’ measures are not enough.