Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are grappling with on social media, but children’s advocates and lawmakers say companies aren’t doing enough to protect them.
The CEOs of social media companies including Meta, TikTok and X testified before the Senate Judiciary Committee on Wednesday, as lawmakers and parents grow increasingly concerned about the impact of social media on young people’s lives.
The hearing began with recorded testimony from children and parents who said they or their children had been exploited on social media. Throughout the several-hour event, parents who lost children to suicide silently held up photos of their deceased children.
“Tech companies are responsible for many of the dangers our children face online,” Senate Majority Whip Dick Durbin, the committee’s chairman, said in his opening remarks. “Their design choices, failure to properly invest in trust and safety, and constant pursuit of engagement and profit over basic safety are all putting our children and grandchildren at risk.”
During a heated exchange with Mark Zuckerberg, Republican Missouri Sen. Josh Hawley asked the Meta CEO whether he had personally compensated victims and their families for what they went through.
Zuckerberg replied, “I don’t think so.”
“The families of the victims are here,” Hawley said. “Do you want to apologize to them?”
Parents attending the hearing stood up and held up pictures of their children. Zuckerberg stood as well, turning away from the microphone and the senators to address the families directly.
WATCH: States sue Meta, accusing it of manipulating its apps to poison children
“We’re sorry for everything you’ve been through. No one should have to go through what your families have suffered,” he said, adding that Meta continues to support “industry-wide efforts” to protect children.
But child advocates and parents have repeatedly stressed that no company is doing enough.
“Meta’s general approach is, ‘Trust us, we’ll do the right thing,’ but how can you trust Meta? The way they talk about these issues… it feels like they’re trying to gaslight the world,” said Arturo Bejar, a former engineering director at the social media giant known for his expertise in curbing online harassment, who recently testified before Congress about child safety on Meta’s platforms.
Mr. Hawley continued to press Mr. Zuckerberg, asking whether he would take personal responsibility for the damage his company caused. Zuckerberg stayed on message, reiterating that Meta’s job is to “build industry-leading tools” and empower parents.
“It’s about making money,” Hawley interjected.
South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary Committee, echoed Durbin’s sentiments and said he was ready to work with Democrats to resolve the issue.
“After years of working on this issue with you and others, I have come to the following conclusion: Social media companies, as currently designed and operated, are dangerous products,” Graham said.
He told the executives that their companies’ platforms had enriched lives, but that it was now time to address the “dark side.”
READ MORE: Thousands of fake Facebook accounts shut down by Meta were primed to polarize voters heading into 2024
Discord’s Jason Citron and other executives touted their platforms’ existing safety tools and the work they have done with nonprofits and law enforcement to protect minors.
Snapchat broke ranks ahead of the hearing, backing a federal bill that would create legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel on Wednesday reiterated the company’s support and called on the industry to back the bill.
TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. CEO Linda Yaccarino said X (formerly Twitter) is not for children.
“We don’t have a dedicated children’s line of business,” Yaccarino said. She said the company also supports the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.
But child health advocates say social media companies repeatedly fail to protect minors.
“When faced with really important decisions about safety and privacy, the bottom line should not be the first factor these companies consider,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies had opportunities to do this before and they failed, so independent regulation needs to step in.”
Republican and Democratic senators showed unusual agreement during the hearing, though it’s not yet clear whether that will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sen. Richard Blumenthal of Connecticut and Sen. Marsha Blackburn of Tennessee.
Meta is facing lawsuits from dozens of states alleging it intentionally designed features on Instagram and Facebook that addict children to its platforms and failed to protect children from online predators.
New internal emails between Meta executives released by Blumenthal’s office show global affairs president Nick Clegg and others lobbying Zuckerberg to hire more people to strengthen the company’s work on well-being.
“From a policy perspective, this work has gained increasing urgency in recent months. Politicians in the U.S., U.K., EU and Australia have publicly and privately raised concerns about the impact of our products on the mental health of young people,” Clegg wrote in an August 2021 email.
The emails released by Blumenthal’s office do not appear to include a response, if there was one, from Zuckerberg. In September 2021, The Wall Street Journal published “The Facebook Files,” a report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.
Meta has been ramping up its child safety features in recent weeks. Earlier this month, it announced it would begin hiding inappropriate content, including posts about suicide, self-harm and eating disorders, from teenagers’ Instagram and Facebook accounts. It also restricts minors from receiving messages on Instagram or Messenger from people they don’t follow or aren’t connected to, and added new “nudges” to discourage teens from browsing Instagram videos and messages late at night. The nudges prompt kids to close the app, but don’t force them to.
Google’s YouTube was conspicuously absent from the list of companies summoned before the Senate on Wednesday, even though more children use YouTube than any other platform, according to the Pew Research Center. Pew found that 93 percent of American teens use YouTube, with TikTok a distant second at 63 percent.
Associated Press writer Mary Clare Jalonick contributed to this report.