TikTok and others change platforms to protect kids. Advocates say it's just a start

Amid growing concern about children's use of social media, the United Kingdom implemented rules designed to keep kids safer and limit their screen time. The U.S. is weighing similar legislation. (Matt Cardy / Getty Images)

Social media companies have collectively made nearly 100 tweaks to their platforms to comply with new standards in the United Kingdom to improve online safety for kids. That's according to a new report by the U.S.-based nonprofit Children and Screens: Institute of Digital Media and Child Development.

The U.K.'s Children's Code, also known as the Age Appropriate Design Code, went into effect in 2020, and social media companies were given a year to comply with the new rules. The changes highlighted in the report are those that social media companies, including the platforms most popular among kids, such as TikTok, YouTube, Instagram and Snapchat, have publicized themselves. The changes also apply to the platforms as they are used in the United States.

The companies are members of the industry group NetChoice, which has been fighting online safety legislation in the U.S. by filing lawsuits.

The analysis "is a great first step in identifying what changes were required [and] how the companies have started to announce their changes," says Kris Perry, executive director of Children and Screens.

"It's promising that despite the protests of the various platforms, they are actually taking the feedback from [researchers] and, obviously, policymakers," says Mary Alvord, a child and adolescent psychologist and the co-author of a new book, The Action Mindset Workbook for Teens.

The design changes addressed four key areas: 1) youth safety and well-being, 2) privacy, security and data management, 3) age-appropriate design and 4) time management.

For example, there were 44 changes across platforms to improve youth safety and well-being. Those included Instagram announcing that it would filter comments considered to be bullying. It is also using machine learning to identify bullying in photos. Similarly, YouTube alerts users when their comments are deemed offensive, and it detects and removes hate speech.

In the area of privacy, security and data management, there were 31 changes across platforms. For example, Instagram says it will notify minors when they are interacting with an adult flagged for suspicious behavior, and it doesn't allow adults to message minors who are more than two years younger than they are.

The report found 11 changes across platforms to improve time management among minors. For example, autoplay is turned off by default in YouTube Kids, and for kids ages 13 to 17, the platform's default settings include regular reminders to turn it off.

"The default settings would make it easier for them to stop using the device," notes Perry.

"From what we know about the brain and what we know about adolescent development, many of these are the right steps to take to try and reduce harms," says Mitch Prinstein, a neuroscientist at the University of North Carolina at Chapel Hill and chief science officer at the American Psychological Association.

"We don't have data yet to show that they, in fact, are successful at making kids feel safe, comfortable and getting benefits from social media," he adds. "But they're the right first steps."

Research also shows how addictive the platforms' designs are, says Perry. And that is particularly bad for kids' brains, which aren't fully developed yet, adds Prinstein.

"When we look at things like the infinite scroll, that's something that's designed to keep users, including children, engaged for as long as possible," Prinstein says. "But we know that that's not OK for kids. We know that kids' brain development is such that they don't have the fully developed ability to stop themselves from impulsive acts and really to regulate their behaviors."

He's also heartened by some other design tweaks highlighted in the report. "I'm very glad to see that there's a focus on removing dangerous or hateful content," he says. "That's paramount. It's important that we're taking down information that teaches kids how to engage in disordered behavior like cutting or anorexia-like behavior."

The report notes that several U.S. states are also pursuing legislation modeled after the U.K.'s Children's Code. In fact, California passed its own Age-Appropriate Design Code last fall, but a federal judge has temporarily blocked it.

At the federal level, the U.S. Senate is soon expected to vote on a historic bipartisan bill called the Kids Online Safety Act, sponsored by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn. The bill would require social media platforms to reduce harm to kids. It's also aiming to "make sure that tech companies are keeping kids' privacy in mind, thinking about ways in which their data can be used," says Prinstein.

But as families wait for lawmakers to pass laws and for social media companies to make changes to their platforms, many are "feeling remarkably helpless," Prinstein says. "It's too big. It's too hard — kids are too attached to these devices."

But parents need to feel empowered to make a difference, he says. "Go out and have conversations with your kids about what they're consuming online and give them an opportunity to feel like they can ask questions along the way." Those conversations can go a long way in improving digital literacy and awareness in kids, so they can use the platforms more safely.

Legislation in the U.S. will likely take a while, he adds. "We don't want kids to suffer in the interim."

Copyright 2024 NPR. To see more, visit https://www.npr.org.

Rhitu Chatterjee is a health correspondent with NPR, with a focus on mental health. In addition to writing about the latest developments in psychology and psychiatry, she reports on the prevalence of different mental illnesses and new developments in treatments.