Senate Hearing on Protecting Kids Online: TikTok, Snapchat and YouTube

Today the Senate hearing “Protecting Kids Online” turned the spotlight from Facebook and Instagram onto TikTok, Snapchat, and YouTube and the content on their platforms. Senators displayed a sense of urgency and impatience with the platforms.

Senate Subcommittee members came prepared, with several setting up teen accounts on these platforms to experience what the algorithms would deliver; they were served shocking content on suicide, self-harm, body image, and sex. They saw first-hand how dangerous these platforms can be.

The three Big Tech leaders started strong, saying they welcomed stronger regulation, transparency, and accountability, but they gave noncommittal answers when asked about specifics and tried to distance themselves from the policies of Facebook and Instagram. It was Snapchat’s and TikTok’s first time testifying at a congressional hearing, and at times it was comical to hear them explain how much “less bad” they were than Facebook.

The tech leaders pointed to new safety measures designed to keep teens safer, many of which were only enacted over the summer, as it became apparent that regulation was a real possibility.

All three had a hard time committing to current proposed legislation, repeatedly saying they liked the goal but needed to have more discussions about the details. And all three overstated the tools parents actually have at their disposal today to keep their kids safe.

Highlights from Jennifer Stout, Vice President of Global Public Policy, Snap Inc.

Ms. Stout called Snapchat “the antidote to social media” and differentiated the platform in these ways:

  • requiring a bi-directional friendship (instead of one-way “followers”), which encourages users to actually know each other

  • opening the app to a camera (instead of a news feed) to spur creativity

  • deleting user-created content by default rather than keeping it long-term.

Sidenote: One of the vexing things about Snapchat is its disappearing messages, combined with its policy of not allowing parental monitoring software to alert parents when their child might be in trouble on the app.

Sidenote #2: The Family Center mentioned today as a tool for parents hasn’t been rolled out yet.

Sidenote #3: Their partnerships with third-party apps like Yolo and LMK, which let teens poll and post anonymously, have caused teen suicides and mental anguish.

Highlights from Michael Beckerman, Vice President and Head of Public Policy, Americas, TikTok

Mr. Beckerman wants us to see TikTok as a place for “joyful, authentic content” and touted new rules for users under 16: direct messages are disallowed, and accounts are set to “private” by default.

Sidenote: Most parents I’ve spoken with were shocked to learn these were NEW rules. They seem like such obvious safety rules. Why hasn’t TikTok always protected kids’ safety and privacy?

Sidenote #2: To use the Family Pairing feature discussed today, a parent needs to set up their own TikTok account and then, with their teen’s phone in hand, pair the two accounts to manage the teen’s digital wellbeing settings.

Highlights from Leslie Miller, Vice President, Government Affairs and Public Policy, YouTube

Ms. Miller wants us to know that:

  • YouTube works with child experts to develop the features in its apps

  • it was proactive in creating YouTube Kids in 2015 so its users under 13 years old could have a safer experience

  • it has voluntarily published the Violative View Rate (VVR), the percentage of views that come from content that violates its policies.
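
To put that metric in concrete terms, using a purely illustrative number: a VVR of 0.2% would mean that roughly 20 of every 10,000 views went to videos that violated YouTube’s policies.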

Sidenote: The “Supervised Experience” on YouTube with the restricted options mentioned today is available for kids under 13 in the U.S. and, according to YouTube’s website, is still in beta testing. To access it, parents need to set up a Google Family Link account, link their child’s account to their own, and change the settings. Once the child turns 13, this solution no longer works.

From their site: “The YouTube experience — managed by you — is for parents who decide their kids are ready to explore the vast universe of YouTube videos. This supervised experience comes with content settings for pre-teens and older, limited features, and features that help build healthy digital habits.”

Senator Blumenthal’s closing remark was strong: “I leave it to the parents of America and the world whether or not they find your answers sufficient.”

Next steps: our recommendations

1. Make social media supervisable by regular parents without an IT degree

Social media platforms need to give parents an easy way to “see” what’s going on online and to parent in the online space. Right now it is nearly impossible for a parent to get involved. Parents can’t see who their kids are interacting with, what they are viewing, what the algorithm is delivering to them, where they are exploring, or where they are hurting or confused.

Give parents tools to make it easy to supervise their teens online.

2. Create regulations to protect teens’ privacy

Look to the UK’s Age Appropriate Design Code for strong privacy protections for all platforms that teens and children are likely to use. These companies are already following these rules in the UK. Let’s make them apply in the US.

3. Focus on Youth - Take away the Section 230 liability shield for these platforms

Section 230 grants platforms legal immunity for content posted by their users. The problem is that social media companies elevate certain posts through their algorithms. The algorithms and business decisions of these platforms have caused massive tragedy for US families.

Let’s focus on reforming Section 230 for youth safety and protection. Teens and children have died by suicide, died accidentally through online challenges and easy drug availability, been harmed by predators, learned to self-harm, developed eating disorders, and more.

Social media platforms need to be held accountable for promoting dangerous and deadly content to children and for their algorithms that spread this content.

4. Require platforms to only ask for bare minimum permissions

Several senators asked about TikTok’s current permissions to collect biometric data, including faceprints and voiceprints, as well as geolocation, the objects that appear in users’ videos, and even smart-speaker audio. Mr. Beckerman assured them TikTok isn’t currently using biometric data and would ask users’ permission before actually collecting it.

No!

TikTok shouldn’t be allowed to even ask for this information.

We need to reassess what data platforms really NEED from any of us, especially our kids.

5. Teens shouldn’t be able to turn off safety features inside the apps

We heard that some of the safety settings are now defaulting to “on”. That’s a step in the right direction.

But kids can easily turn safety features off. Why is this allowed?

6. Get kids under 13 off all social media platforms

No, we don’t mean creating more platforms for kids under 13, like Facebook’s proposal for an Instagram for Kids.

We mean age-verifying the kids on current platforms. Kids under 13 should be allowed to be children, without the pressure to be online socially.


Some legislation discussed today in the Senate Hearing:

  • Kids Internet Design and Safety Act (KIDS Act): prohibits amplifying harmful content; bans “auto-play” features, “nudges,” and push alerts; and removes “like” buttons, which quantify popularity or signal rejection.

  • Platform Accountability and Consumer Transparency Act (PACT Act): holds social media companies accountable for content that violates their own policies or is illegal.

  • Children and Media Research Advancement Act (CAMRA Act): requires the National Institutes of Health to fund research on the effects of media on infants, children, and adolescents.

  • Protecting the Information of Our Vulnerable Children and Youth Act (Kids PRIVCY Act): bans targeted ads to children and teens, requires opt-in consent for minors, requires considering the best interests of kids, protects biometric data and other private information, and more.

  • 2021 COPPA update: prohibits internet companies from collecting personal information from anyone 13 to 15 years old without the user’s consent; creates an online “Eraser Button” by requiring companies to let users delete a child’s or teen’s personal information; and implements a “Digital Marketing Bill of Rights for Minors” that limits the collection of personal information from teens.

  • Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT Act): revises the framework governing the prevention of online sexual exploitation of children.

  • Protecting Americans from Dangerous Algorithms Act (PADAA): holds social media companies accountable for algorithms that promote harmful and dangerous content leading to offline violence.
