CEOs from Meta, TikTok, Snap, X and Discord head to Congress for kids' online safety hearing

Image Credits: Bryce Durbin/TechCrunch

CEOs from some of the biggest social platforms will appear before Congress on Wednesday to defend their companies against mounting criticism that they have done too little to protect kids and teens online.

The hearing, set to begin at 10 a.m. ET, is the latest in a long string of congressional tech hearings stretching back for years, with little in the way of new regulation or policy change to show for the efforts.

The Senate Judiciary Committee will host the latest hearing, which is notable mostly for dragging five chief executives across the country to face a barrage of questions from lawmakers. Tech companies often placate Congress by sending legal counsel or a policy executive, but this hearing will feature a slate of CEOs: Meta’s Mark Zuckerberg, X (formerly Twitter) CEO Linda Yaccarino, TikTok’s Shou Chew, Discord’s Jason Citron and Snap’s Evan Spiegel. Zuckerberg and Chew are the only executives who agreed to appear voluntarily, without a subpoena.

While Zuckerberg is a veteran of these often lengthy, meandering attempts to hold tech companies to account, Wednesday’s televised hearing will be a first for Yaccarino, Spiegel and Citron. Snap and X have sent other executives (or their former chief executive) in the past, but Discord — a chat app originally designed for gamers — is making its first appearance in the hot seat. All three first-timers could produce some interesting off-script moments, particularly Yaccarino. In recent interviews as X’s top executive, Elon Musk’s pick to lead the company has appeared flustered and combative — a world apart from her heavily media-trained peers like Zuckerberg and Chew.

Discord is a very popular app among young people, but it’s still an unusual name to come up in one of these hearings. The committee’s decision to include Discord is likely a result of a report last year from NBC News exploring sextortion and child sexual abuse material (CSAM) on the chat platform. The company’s inclusion is notable, particularly in light of the absence of more prominent algorithm-powered social networks like YouTube — often inexplicably absent from these events — and the absence of Amazon-owned livestreaming giant Twitch.

Wednesday’s hearing, titled “Big Tech and the Online Child Sexual Exploitation Crisis,” will cover much more ground than its narrow title would suggest. Lawmakers will likely dig into an array of concerns — both recent and ongoing — about how social platforms fail to protect their young users from harmful content. That includes serious concerns around Instagram openly connecting sexual predators with sellers advertising CSAM, as the WSJ previously reported, and the NBC News investigation revealing that Discord has facilitated dozens of instances of grooming, kidnapping and other forms of sexual exploitation in recent years.

Beyond concerns that social platforms don’t do enough to protect kids from sexual predation, expect lawmakers to press the five tech CEOs on other online safety issues, like fentanyl sellers on Snapchat, booming white supremacist extremism on X and the prevalence of self-harm and suicide content on TikTok. And given the timing of X’s embarrassing failure to prevent a recent explosion of explicit AI-generated Taylor Swift imagery and the company’s amateurish response, expect some Taylor Swift questions too.

The tech companies are likely to push back, pointing lawmakers to platform and policy changes, in some cases designed to make these apps safer and in others engineered mostly to placate Congress in time for this hearing. In Meta’s case, that looks like an update to Instagram and Facebook last week that prevents teens from receiving direct messages from users they don’t know. As with many such changes, it raises the question of why these safeguards continue to be added on the fly instead of being built into the product before it was offered to young users.

KOSA looms large

This time around, the hearing is part of a concerted push to pass the Kids Online Safety Act (KOSA), a controversial piece of legislation that ostensibly forces tech platforms to take additional measures to shield children from harmful content online. In spite of some revisions, the bill’s myriad critics caution that KOSA would aggressively sanitize the internet, promote censorship and imperil young LGBTQ people in the process. Some of the bill’s conservative supporters — including co-sponsor Sen. Marsha Blackburn — have stated outright that KOSA should be used to effectively erase transgender content for young people online.

The LGBTQ advocacy group GLAAD expressed its concerns about the hearing and related legislation in a statement provided to TechCrunch, urging lawmakers to ensure that “proposed solutions be carefully crafted” to avoid negatively impacting the queer community.

“The US Senate Judiciary Committee’s hearing is likely to feature anti-LGBTQ lawmakers baselessly attempting to equate age-appropriate LGBTQ resources and content with inappropriate material,” GLAAD said. “… Parents and youth do need action to address Big Tech platforms’ harmful business practices, but age-appropriate information about the existence of LGBTQ people should not be grouped in with such content.”

The ACLU and the digital rights organization EFF have also opposed the legislation, as have other groups concerned about the bill’s implications for encryption. Similar concerns have followed the Children and Teens’ Online Privacy Protection Act (now known as “COPPA 2.0”), the STOP CSAM Act and the EARN IT Act, adjacent bills purporting to protect children online.

The bill’s proponents aren’t all conservative. KOSA enjoys bipartisan support at the moment and the misgivings expressed by its critics haven’t broken through to the many Democratic lawmakers who are on board. The bill is also backed by organizations that promote children’s safety online, including the American Academy of Pediatrics, the National Center on Sexual Exploitation and Fairplay, a nonprofit focused on protecting kids online.

“KOSA is a needed corrective to social media platforms’ toxic business model, which relies on maximizing engagement by any means necessary, including sending kids down deadly rabbit holes and implementing features that make young people vulnerable to exploitation and abuse,” Josh Golin, executive director of Fairplay, said in a statement provided to TechCrunch. Fairplay has also organized a pro-KOSA coalition of parents who have lost children in connection with cyberbullying, drugs purchased on social platforms and other online harms.

As of last week, KOSA’s unlikeliest supporter is one of the companies that the bill seeks to regulate. Snap split from its peers last week to throw its support behind KOSA, a move likely intended to endear the company to regulators that could steer its fate — or perhaps more importantly, the fate of TikTok, Snap’s dominant rival, which sucks up the lion’s share of screen time among young people.

Snap’s decision to break rank with its tech peers and even its own industry group on KOSA echoes a similar move by Meta, then Facebook, to support a controversial pair of laws known as FOSTA-SESTA back in 2018. That legislation, touted as a solution to online sex trafficking, went on to become law, but years later FOSTA-SESTA is better known for driving sex workers away from safe online spaces than it is for disrupting sex trafficking.
