Senate hearing on protecting kids online with Instagram’s Adam Mosseri
This week the US Senate continued to shine a spotlight on social media’s dangers to kids by inviting the head of Instagram, Adam Mosseri, to testify about Instagram’s impacts on young users, its commitment to reform and potential legislative solutions.
“Too little, too late” was a sentiment that came up again and again from Senators at the “Protecting Kids Online: Instagram and Reforms for Young Users” hearing as they listened to Mr. Mosseri describe Instagram’s attempts to add long-overdue safety features and protect kids from the harms its own researchers have documented.
Parents should be deeply concerned by Instagram’s pride in its proposed safety features for teens, for two reasons.
First, Instagram is 11 years old and the company is just now realizing that its teen users need robust safety standards. Mr. Mosseri promised parental controls will be live in March of 2022 and seemed to blame us, the people who shut down his Instagram for Kids idea, for the dangers on Instagram. Several times he said all the parental controls parents needed were going to be built into Instagram for Kids, but people pushed back against it, so now he’s migrating those controls into regular Instagram. Not once did he say that Instagram for Kids was for an entirely different market, kids 8-12 years old, who SHOULD be playing outside and not scrolling social media. Not once did he acknowledge that he was throwing teens to the wolves by leaving parental control features out of regular Instagram for young users ages 13-17.
Second, many of the safety features aren’t live yet and will be optional, not mandatory. Designing safety features but letting kids turn them off is like installing seatbelts in cars to protect kids but not requiring anyone to use them. Ridiculous! Parents need to know that kids are safe on social media platforms. Period.
Senators’ test accounts on Instagram led to insightful questions
Kudos to the several Senators who set up fake teen Instagram accounts to see what it’s like out there as they prepared for this hearing. One compelling story came from Senator Lee (Utah), who set up an account for a 13-year-old girl. The Explore page, the area that suggests what else you might like on Instagram, was filled with normal content for a 13-year-old: funny videos, hairstyles and trends. Then the test account followed a super-famous female celebrity, and the Explore page went toxic. All of a sudden it was filled with plastic surgery and extreme dieting content. Senator Lee wanted to understand the Instagram algorithm that could take a young teen from appropriate content to dangerous content just by following one account.
What the head of Instagram thinks we need next
Mr. Mosseri agreed with Senators that there should be a group that oversees safety for kids on social media and suggested the industry as a whole should change. What was lacking was his commitment to an INDEPENDENT body to oversee these standards. What we need are child development experts, educators, teens, online safety experts and the parents who have to use these complicated parental controls: a group of people with diverse backgrounds who are eager to see progress.
He cited three things this group would need to tackle:
1. How to verify the age of young users who don’t have government-issued identification. The presence of kids under 13 on social media is a big problem and one that hasn’t been addressed effectively. Technically, social media companies aren’t supposed to let anyone under 13 on their platforms under an outdated 1998 law called the Children's Online Privacy Protection Act (COPPA). COPPA restricts websites from collecting data on children under 13 without parental consent, which is why most apps do not want kids younger than 13 to join. It’s not that 13 is a magic number and teens who are 13 are instantly ready for social media. It’s the business model. Social media companies can’t make money as easily from 12-year-olds because of COPPA, so they aren’t welcome there. A recent report out of the University of Michigan said a third of 7-to-9-year-olds and half of 10-to-12-year-olds were already on social media. As for solutions to the age verification problem, we can look to the UK, which is already working through them. We can figure this out.
2. How to give teens an age-appropriate experience. Teens need a safe, fun experience. The adult stuff should be kept far away from them.
3. How to set up parental controls. I’d like to add “effective” parental controls. Controls that cannot be turned off by a teen. Controls that are easy to set up and easy to use. Controls that actually work.
We agree with these priorities and would add that an independent group needs to oversee the work and that federal guidelines, through regulation, need to be part of it.
Instagram’s new “Take a Break” feature
Just a day before the hearing, Instagram released a new “Take a Break” tool, which encourages users to stop scrolling and spend some time away from the platform after they've been scrolling for a certain period. Users need to turn the feature on (it’s not on by default) by going to Settings and selecting whether they want to be alerted after using the platform for 10 minutes, 20 minutes or 30 minutes. When their time is up, they get a full-screen alert telling them to close out of the app and suggesting they take a deep breath, write something down, check a to-do list or listen to a song.
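To make the mechanics concrete, here is a minimal sketch, in Python, of how an opt-in reminder like this could work. This is not Instagram’s actual code; the names and structure are hypothetical. It only illustrates the two details described above: the feature stays off unless a user turns it on, and the alert fires only after the interval the user picked (10, 20 or 30 minutes).

```python
# Hypothetical sketch of an opt-in "Take a Break" reminder.
# Not Instagram's real implementation; names and defaults are illustrative.

from dataclasses import dataclass

ALLOWED_INTERVALS_MIN = (10, 20, 30)  # the choices described above


@dataclass
class TakeABreakSettings:
    enabled: bool = False          # off by default -- the core criticism
    interval_minutes: int = 20     # only meaningful once enabled


def should_show_break_alert(settings: TakeABreakSettings,
                            minutes_scrolled: float) -> bool:
    """Return True when the full-screen 'take a break' alert should appear."""
    if not settings.enabled:
        return False               # users who never opt in are never nudged
    if settings.interval_minutes not in ALLOWED_INTERVALS_MIN:
        return False
    return minutes_scrolled >= settings.interval_minutes


# A teen who never opens Settings gets no reminder at all, however long they scroll.
print(should_show_break_alert(TakeABreakSettings(), minutes_scrolled=90))   # False
print(should_show_break_alert(
    TakeABreakSettings(enabled=True, interval_minutes=10),
    minutes_scrolled=12))                                                   # True
```

The sketch makes the same point the hearing kept circling: a protection that is off by default only protects the users who go looking for it.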
Will it change anyone’s behavior? It’s doubtful.
Here are my thoughts on this feature on KOAT News.
Parental controls are coming… March 2022
Mr. Mosseri says the planned parental controls will allow parents and guardians to see how much time their teenagers spend on Instagram, set time limits, and get notified if their child reports someone (if the teen agrees). (Again, there’s that pesky non-feature that can be turned off by the teen.)
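Here is a small sketch, again hypothetical and in Python, of how supervision settings like these might be wired up. The names are my own, not Instagram’s; the part worth noticing is the consent flag: as described above, the parent is only notified about a report if the teen agrees to share it.

```python
# Hypothetical sketch of the supervision features described above.
# Not Instagram's real design; fields and names are illustrative only.

from dataclasses import dataclass
from typing import Optional


@dataclass
class SupervisionSettings:
    daily_time_limit_minutes: Optional[int] = None  # parent-set time limit, if any
    teen_shares_reports: bool = False               # the teen's consent toggle


def notify_parent_of_report(settings: SupervisionSettings) -> bool:
    """Parents hear about a report only when the teen has agreed to share it."""
    return settings.teen_shares_reports


def minutes_remaining(settings: SupervisionSettings, minutes_used: int) -> Optional[int]:
    """Screen time left today, or None if the parent never set a limit."""
    if settings.daily_time_limit_minutes is None:
        return None
    return max(0, settings.daily_time_limit_minutes - minutes_used)


# A teen who never flips the sharing toggle keeps reports invisible to parents.
print(notify_parent_of_report(SupervisionSettings()))                            # False
print(minutes_remaining(SupervisionSettings(daily_time_limit_minutes=60), 45))   # 15
```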
In other future plans, Instagram also says it’s developing a feature that will stop people from tagging or mentioning teens who don’t follow them, a “nudge” feature that will steer young users toward other things if they have been focused on a potentially harmful topic for a while, and stricter limits on which posts, hashtags and accounts it recommends, to try to cut down on potentially harmful or sensitive content.