U.S. Senator Richard Blumenthal (D-CT), Chair of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, delivered opening remarks at a hearing titled “Protecting Kids Online: Snapchat, TikTok, and YouTube.” Witnesses from the three companies appeared at the hearing, with TikTok and Snap appearing before Congress for the first time. The hearing was the fourth in a series of bipartisan hearings spearheaded by Blumenthal and Ranking Member U.S. Senator Marsha Blackburn (R-TN) to inform legislation and prompt action by social media companies to address harms and dangers faced by children online.
“Our hearing with the Facebook whistleblower, Frances Haugen, was a searing indictment along with her documents of a powerful, gigantic corporation that put profits ahead of people — especially our children,” Blumenthal said. “We’re hearing the same stories and reports of the same harms about the tech platforms that are represented here today. I’ve heard from countless parents and medical professionals in Connecticut and elsewhere around the country about the same phenomenon on Snapchat, YouTube, and TikTok. In effect, that business model is the same – more eyeballs means more dollars. Everything that you do is to add users, especially kids, and keep them on your apps for longer.”
Blumenthal emphasized the dangers of algorithms in driving extreme content to children, recounting stories shared by Connecticut constituents, including Nora from Westport “who allowed her 11 year old daughter, Avery, on TikTok because she thought it was ‘just girls dancing.’ Avery wanted to exercise more with the shutdown of school and sports, so like most people, she went online. Nora wrote about the rabbit hole that TikTok and YouTube’s algorithms pulled her daughter into. She began to see ever more extreme videos about weight loss. Avery started exercising compulsively and ate only one meal a day. Her body weight dropped dangerously low and she was diagnosed with anorexia. Avery is now luckily in treatment, but the financial cost of care is an extreme burden and her education has suffered.”
After hearing from other parents who had the same experience, Blumenthal’s office created TikTok and YouTube accounts posing as teenagers. On YouTube, “Like Avery, we watched a few videos about extreme dieting and eating disorders. They were easy to find. YouTube’s recommendation algorithm began to promote extreme dieting and eating disorder videos each time we opened the app,” Blumenthal said. “We also received these recommendations each time we watched other videos. It’s mostly eating disorder content. There was no way out of this rabbit hole.”
Blumenthal stressed that Big Tech must do better, saying: “Big Tech cannot say to parents, ‘You must be the gatekeepers. You must be social media copilots, you must be the app police.’ Because parents should not have to bear that burden alone. We need stronger rules to protect children online, real transparency, real accountability. I want a market where the competition is to protect children, not to exploit them. Not a race to the bottom, but competition for the top.”
Blumenthal concluded his opening remarks by comparing Big Tech to Big Tobacco, emphasizing a key difference: “Big Tech is not irredeemably bad, like Big Tobacco. Big Tobacco’s products, when used by the customer in the way the manufacturer intended, can actually kill the customer. As Ms. Haugen said, ‘Our goal is not to burn Facebook to the ground, it’s to bring out the best to improve and impose accountability.’ As she said, we can have social media we enjoy that connects us without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. We can do better. And I agree.”
Video of Blumenthal’s opening remarks can be found here, and the transcript is copied below.
U.S. Senator Richard Blumenthal (D-CT): Welcome to this hearing on protecting kids on social media. I thank the ranking member, Senator Blackburn, for being a close partner in this work, as well as Chair Cantwell and Ranking Member Wicker for their support.
I want to note that this is the first time TikTok and Snap have appeared before Congress, so I appreciate you – and YouTube – for your testimony this morning. It means a lot.
Our hearing with the Facebook whistleblower, Frances Haugen, was a searing indictment along with her documents of a powerful, gigantic corporation that put profits ahead of people — especially our children. There has been a deafening drumbeat of continuing disclosures about Facebook. They have deepened America’s concern and outrage and have led to increasing calls for accountability. And there will be accountability. This time is different. Accountability to parents and the public, accountability to Congress, accountability to investors and shareholders, and accountability to the Securities and Exchange Commission and other federal agencies because there is ample, credible evidence to start an investigation here.
But today we are concerned about continuing to educate the American public and ourselves about how we can face this crisis. What we learned from Ms. Haugen’s disclosures, and from reporting since then, about Instagram’s algorithms is absolutely repugnant and abhorrent: they create a “perfect storm,” in the words of one of Facebook’s own researchers. As that person said, it “exacerbates downward spirals, it’s harmful to teens, it fuels hate and violence, it prioritizes profits over the people it hurts.” In effect, the algorithms push emotional and provocative content, toxic content, that amplifies depression, anger, hate, anxiety, because those emotions attract and hook kids and others to their platforms. In effect, more of that content, and more extreme versions of it, are pushed to children who have expressed an interest in online bullying, eating disorders, self-harm, even suicide. And that’s why we now have a drumbeat of demands for accountability along with the drumbeats of disclosure.
But we’re hearing the same stories and reports of the same harms about the tech platforms that are represented here today. I’ve heard from countless parents and medical professionals in Connecticut and elsewhere around the country about the same phenomenon on Snapchat, YouTube, and TikTok. In effect, that business model is the same – more eyeballs means more dollars. Everything that you do is to add users, especially kids, and keep them on your apps for longer.
I understand from your testimony that your defense is: we’re not Facebook, we’re different, and we’re different from each other. Being different from Facebook is not a defense; that bar is in the gutter. It is not a defense to say that you are different. What we want is not a race to the bottom, but really a race to the top.
So we’ll want to know from you what specific steps you’re taking to protect children, even if it means forgoing profits. We’re going to want to know what research you’ve done, similar to the studies and data that have been disclosed about Facebook, and we’re going to want to know whether you will support real reform, not just the tweaks and minor changes that you suggested and recounted in your testimony.
The picture that we’ve seen from an endless stream of videos automatically selected by sophisticated algorithms shows that you too strive to find something that teens will like, and then drive more of it to them. If the algorithm learns that a teen feels insecure about his or her body, it’s a recipe for disaster: the teen starts out on dieting tips, the algorithm raises the temperature and floods that individual with more and more extreme messages, and after a while, all of the videos are about eating disorders. It’s the same rabbit hole: driving kids down those rabbit holes created by algorithms leads to dark places and encourages more destructive hate and violence.
We’ve done some listening. I heard from Nora in Westport who allowed her 11 year old daughter, Avery, on TikTok because she thought it was “just girls dancing.” Avery wanted to exercise more with the shutdown of school and sports, so like most people, she went online. Nora wrote about the rabbit hole that TikTok and YouTube’s algorithms pulled her daughter into. She began to see ever more extreme videos about weight loss. Avery started exercising compulsively and ate only one meal a day. Her body weight dropped dangerously low and she was diagnosed with anorexia. Avery is now luckily in treatment, but the financial cost of care is an extreme burden and her education has suffered.
We heard from parents who had the same experience, so we not only listened, but we checked ourselves. On YouTube, my office created an account as a teenager. Like Avery, we watched a few videos about extreme dieting and eating disorders. They were easy to find. YouTube’s recommendation algorithm began to promote extreme dieting and eating disorder videos each time we opened the app. These were often videos about teens starving themselves, as you can see from this poster. The red is eating disorder related content, the green is all the other videos. One is before and the other is after the algorithm kicked in. You can see the difference. We also received these recommendations each time we watched other videos. It’s mostly eating disorder content. There was no way out of this rabbit hole.
Another parent in Connecticut wrote to me about how their son was fed a constant stream of videos on TikTok related to disordered eating and calorie counting after looking up athletic training. As scientific research has shown, eating disorders and body comparison also significantly affect young men on social media. Young men often feel compelled to bulk up to look a certain way.
Again, I heard about this pressure all too often. We created an account on TikTok. Troublingly, searching TikTok, it was easy to go from men’s fitness to steroid use. It took us only one minute, one minute, to find TikTok accounts openly promoting and selling illegal steroids. We all know the dangers of steroids, and they are illegal.
All of this research and facts and disclosures send a message to America’s parents. You cannot trust Big Tech with your kids. Parents of America cannot trust these apps with their children, and Big Tech cannot say to parents, “You must be the gatekeepers. You must be social media copilots, you must be the app police.” Because parents should not have to bear that burden alone. We need stronger rules to protect children online, real transparency, real accountability. I want a market where the competition is to protect children, not to exploit them. Not a race to the bottom, but competition for the top.
We have said that this is, for Big Tech, a Big Tobacco moment. And I think there’s a lot of truth to that contention, because it is a moment of reckoning. The fact is that, like Big Tobacco, Big Tech has lured teens despite knowing its products can be harmful; it has sought to associate itself with celebrities, fashion, beauty, everything that appeals to young audiences; and like Big Tobacco, Facebook hid from parents and the public the substantial evidence that Instagram could have a negative effect on teen health.
But the products are different. Big Tech is not irredeemably bad, like Big Tobacco. Big Tobacco’s products, when used by the customer in the way the manufacturer intended, can actually kill the customer. As Ms. Haugen said, “Our goal is not to burn Facebook to the ground, it’s to bring out the best to improve and impose accountability.” As she said, we can have social media we enjoy that connects us without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. We can do better. And I agree.