Meta, TikTok and Other Social Media CEOs Testify in Heated Senate Hearing on Child Exploitation

Written by Associated Press on February 1, 2024

Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media — and children’s advocates and lawmakers say companies are not doing enough to protect them.

On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify at a time when lawmakers and parents are growing increasingly concerned about the effects of social media on young people’s lives.

The hearing began with recorded testimony from kids and parents who said they or their children were exploited on social media. Throughout the hourslong event, parents who lost children to suicide silently held up pictures of their dead kids.

“They’re responsible for many of the dangers our children face online,” Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”

In a heated question and answer session with Mark Zuckerberg, Republican Missouri Sen. Josh Hawley asked the Meta CEO if he has personally compensated any of the victims and their families for what they have been through.

“I don’t think so,” Zuckerberg replied.

“There’s families of victims here,” Hawley said. “Would you like to apologize to them?”

Zuckerberg stood, turned away from his microphone and the senators, and directly addressed the parents in the gallery.

Meta CEO Mark Zuckerberg turns to address the audience during a Senate Judiciary Committee hearing on Capitol Hill in Washington, Jan. 31, 2024, to discuss child safety. X CEO Linda Yaccarino watches at left.

“I’m sorry for everything you have all been through. No one should go through the things that your families have suffered,” he said, adding that Meta continues to invest and work on “industrywide efforts” to protect children.

But time and time again, children’s advocates and parents have stressed that none of the companies are doing enough.

One of the parents who attended the hearing was Neveen Radwan, whose teenage daughter got sucked into a “black hole of dangerous content” on TikTok and Instagram after she started looking at videos on healthy eating and exercise at the onset of the COVID lockdowns. She developed anorexia within a few months and nearly died, Radwan recalled.

“Nothing that was said today was different than what we expected,” Radwan said. “It was a lot of promises and a lot of, quite honestly, a lot of talk without them really saying anything. The apology that he made, while it was appreciated, it was a little bit too little, too late, of course.”

But Radwan, whose daughter is now 19 and in college, said she felt a “significant shift” in the energy as she sat through the hearing, listening to the senators grill the social media CEOs in tense exchanges.

“The energy in the room was very, very palpable. Just by our presence there, I think it was very noticeable how our presence was affecting the senators,” she said.

Hawley continued to press Zuckerberg, asking if he’d take personal responsibility for the harms his company has caused. Zuckerberg stayed on message and repeated that Meta’s job is to “build industry-leading tools” and empower parents.

“To make money,” Hawley cut in.

South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin’s sentiments and said he’s prepared to work with Democrats to solve the issue.

“After years of working on this issue with you and others, I’ve come to conclude the following: Social media companies as they’re currently designed and operate are dangerous products,” Graham said.

The executives touted existing safety tools on their platforms and the work they’ve done with nonprofits and law enforcement to protect minors.

Snapchat broke ranks ahead of the hearing and is backing a federal bill that would create a legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel reiterated the company’s support on Wednesday and asked the industry to back the bill.

TikTok CEO Shou Zi Chew said the company is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, doesn’t cater to children.

“We do not have a line of business dedicated to children,” Yaccarino said. She said the company will also support the Stop CSAM Act, a federal bill that makes it easier for victims of child exploitation to sue tech companies.

Yet child health advocates say social media companies have failed repeatedly to protect minors.

Profits should not be the primary concern when companies are faced with safety and privacy decisions, said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had opportunities to do this before. They failed to do that. So independent regulation needs to step in.”

Protesters stand up during a Senate Judiciary Committee hearing with social media platform heads on Capitol Hill in Washington, Jan. 31, 2024, to discuss child safety.

Republican and Democratic senators came together in a rare show of agreement throughout the hearing, though it’s not yet clear if this will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sens. Richard Blumenthal of Connecticut and Marsha Blackburn of Tennessee.

“There is pretty clearly a bipartisan consensus that the status quo isn’t working,” said New Mexico Attorney General Raúl Torrez, a Democrat. “When it comes to how these companies have failed to prioritize the safety of children, there’s clearly a sense of frustration on both sides of the aisle.”

Meta is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms. New Mexico filed a separate lawsuit saying the company has failed to protect children from online predators.

New internal emails between Meta executives released by Blumenthal’s office show Nick Clegg, the company’s president of global affairs, and others asking Zuckerberg to hire more people to strengthen “wellbeing across the company” as concerns grew about effects on youth mental health.

“From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.

The emails released by Blumenthal’s office don’t appear to include a response, if there was any, from Zuckerberg.

In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate. Clegg followed up on the August email in November with a scaled-down proposal, but it does not appear that anything was approved.

“I’ve spoken to many of the parents at the hearing. The harm their children experienced, all that loss of innocent life, is eminently preventable. When Mark says ‘Our job is building the best tools we can,’ that is just not true,” said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment who recently testified before Congress about child safety on Meta’s platforms.

“They know how much harm teens are experiencing, yet they won’t commit to reducing it, and most importantly to be transparent about it. They have the infrastructure to do it, the research, the people, it is a matter of prioritization.”

Béjar said the emails and Zuckerberg’s testimony show that Meta and its CEO “do not care about the harm teens experience” on their platforms.

“Nick Clegg writes about profound gaps with addiction, self-harm, bullying and harassment to Mark. Mark did not respond, and those gaps are unaddressed today. Clegg asked for 84 engineers of 30,000,” Béjar said. “Children are not his priority.”

 

By Associated Press
