CEOs of Meta, X, TikTok grilled about online child safety at US hearing

Parents and lawmakers say leaders are not doing enough to address dangers such as sexual exploitation and bullying.


On Wednesday, the executives testified before the Senate Judiciary Committee amid a wave of anger from parents and lawmakers that companies are not doing enough to thwart online threats to children, such as blocking sex offenders and preventing suicides in teenagers.

“They are responsible for many of the dangers our children face online,” said Senate Majority Whip Dick Durbin, the committee’s chairman, in his opening remarks. “Their design decisions, their failure to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have put our children and grandchildren at risk.”

Durbin cited statistics from the nonprofit National Center for Missing and Exploited Children that showed financial “sextortion,” in which a predator tricks a minor into sending explicit photos and videos, has skyrocketed in the past year.

The committee also played a video in which children spoke about their victimization experiences on social media platforms. “I was sexually exploited on Facebook,” said a child in the video, who appeared in the shadows.

“Mr. Zuckerberg, you and the companies before us, I know you don’t mean it, but you have blood on your hands,” said Senator Lindsey Graham, addressing Mark Zuckerberg, CEO of Meta, the company that owns Facebook and Instagram. “You have a product that is killing people.”

Zuckerberg testified alongside X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew and Discord CEO Jason Citron.

Mark Zuckerberg, CEO of Meta, looks at Linda Yaccarino, CEO of X [Evelyn Hockstein/Reuters]

Several bills addressing child safety have been introduced in Congress; none has become law.

X, formerly Twitter, has been heavily criticized since Elon Musk, CEO of Tesla and SpaceX, bought the platform and relaxed its moderation guidelines. This week, the company blocked searches for pop singer Taylor Swift after fake, sexually explicit images of Swift were shared on the platform.

Wednesday also marked TikTok CEO Chew’s first appearance before U.S. lawmakers since March, when the Chinese-owned short-video app company faced sharp questions, some suggesting the app was harming children’s mental health.

“We make careful product design decisions to make our app inhospitable to those who seek to harm teens,” Chew said, adding that TikTok’s community guidelines strictly prohibit anything that “puts teens at risk of exploitation or other harm” and that the company enforces them vigorously.

At the hearing, the executives touted the safety tools in place on their platforms and the work they have done with nonprofits and law enforcement to protect minors.

Meta and X also announced new safety measures ahead of the contentious hearing.

But child health advocates say social media companies have repeatedly failed to protect minors.

“When you’re faced with really important safety and privacy decisions, profit shouldn’t be the first factor these companies consider,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition that advocates for safer social media.

“These companies have had the opportunity to do this before. They have not succeeded, so independent regulation must intervene.”
