Protecting kids online is a bipartisan cause for senators

Lawmakers accuse Big Tech of blocking regulation.

February 15, 2023, 10:47 AM

Democrats and Republicans joined together on Tuesday in a renewed push to pass federal legislation aimed at safeguarding young people using the internet, as a mother testified her son was driven to suicide by bullying on social media.

With rare bipartisan unity, lawmakers on the Senate Judiciary Committee used a hearing to blame social media companies -- "Big Tech" -- for blocking attempts by Congress to regulate their platforms.

"I don't know if any or all of you realize what you witnessed today. But this Judiciary Committee crosses the political spectrum, not just from Democrats to Republicans, but from real progressives to real conservatives. And what you heard was the unanimity of purpose," the panel's chair, Sen Dick Durbin, D-Ill., said at the end of the hearing -- the first time the 118th Congress has addressed the topic following years of policy proposals hitting snags.

PHOTO: Senate Judiciary Committee Chairman Dick Durbin takes part in the Senate Judiciary Committee hearing on President Joe Biden's judicial nominees on Capitol Hill in Washington, Jan. 25, 2023. (Leah Millis/Reuters, FILE)

Durbin said committee members would consider the many bills aimed at tech regulation amid the privacy, sexual exploitation and mental health crises among young Americans.

The hearing, called by Durbin and the committee's top Republican, Lindsey Graham of South Carolina, comes after President Joe Biden renewed calls for more restrictions on Big Tech in his State of the Union address.

"And it's time to pass bipartisan legislation to stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us," Biden said last week.

PHOTO: Kristin Bride, Survivor Parent and Social Media Reform Advocate, sits next to a photo of her 16-year-old son Carson Bride as she appears before a Senate Committee on the Judiciary hearing to examine protecting our children online, in the Hart Senate Office Building in Washington, D.C., Feb. 14, 2023. (Rod Lamkey/CNP/Polaris)

While the committee heard testimony from child safety advocates and industry leaders, including the president of the National Center for Missing and Exploited Children and the chief science officer of the American Psychological Association, it did not hear witnesses from tech companies.

One witness, Kristin Bride of Portland, Oregon, detailed how she became an activist after losing her teenage son, Carson, to suicide in 2020 after she said he was the target of floods of anonymous cyberbullying messages on Snapchat.

She described one summer night when Carson, then 16, excitedly told his family about his day as a new pizza shop employee.

PHOTO: Sen. Josh Hawley (R-MO) listens during the Senate Judiciary Committee hearing on President Joe Biden's judicial nominees on Capitol Hill in Washington, Jan. 25, 2023. (Leah Millis/Reuters)

"The next morning, I woke to the complete shock and horror that Carson had hung himself in our garage while we slept," Bride said, mentioning that investigators later found evidence of the cyberbullying on his cell phone.

Bride has made multiple trips to Washington to testify in hopes of moving the needle on tech regulation for children, specifically the Kids Online Safety Act, being pushed by Democratic Sen. Richard Blumenthal of Connecticut and Republican Sen. Marsha Blackburn of Tennessee.

Among other provisions, the measure requires social media platforms to give minors options to protect their information, disable addictive product features and opt out of algorithmic recommendations; gives parents controls to monitor harmful behaviors; and places responsibility on the platforms to prevent and mitigate harms to minors, such as the promotion of self-harm, suicide, eating disorders, substance abuse and sexual exploitation.

On Tuesday, Bride was asked how she felt about repeatedly reliving her experience.

"It is so difficult to tell our stories of the very worst day of our lives over and over and over again and then not see change … we really are looking to call for action, and I am confident that you can all come together and do this for us and for America's children."

In addition to the 2023 Kids Online Safety Act, the dozens of proposals to restrict Big Tech in Congress include a measure from Graham and Democratic Sen. Elizabeth Warren of Massachusetts that would create a new consumer protection agency to regulate the tech industry and a proposal from committee member GOP Sen. Josh Hawley of Missouri to raise the social media age requirement to 16.

Durbin said during the hearing that he is circulating a discussion draft of a bill he has long worked on to curb the spread of child sexual abuse material.

"There are more bills being introduced in this area than any subject matter that I know of. All of them are bipartisan," said Blumenthal.

Bride, whose lawsuit against Snapchat over her son's death was dismissed, said social media companies have long been shielded from liability for third-party content by Section 230 of the Communications Decency Act.

"It is almost as if these social media platforms are operating in the days of the Wild West," Blackburn said Tuesday.

"It's big tobacco's ... playbook all over again," Blumenthal said.

In a statement provided to ABC News, a spokesperson for Snap Inc. -- the parent company of Snapchat -- said it's "constantly evaluating how we continue to make our platform safer, including through new education, features and protections."

"Nothing is more important to us than the wellbeing of our community. At Snapchat, we curate content from known creators and publishers and use human moderation to review user generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content. We also work closely with leading mental health organizations to provide in-app tools for Snapchatters and resources to help support both themselves and their friends."