As concerns about social media's harmful effects on teens continue to grow, platforms from Snapchat to TikTok to Instagram have been adding new features they say will make their services safer and more age appropriate. But the changes rarely address the elephant in the room: algorithms that push endless content and can drag anyone, not just teens, down toxic rabbit holes.
The tools offer some help, such as blocking strangers from messaging your child. But they also share some deeper flaws, starting with the fact that teens can circumvent the restrictions if they lie about their age. The platforms also place the burden of enforcement on parents. And they do little or nothing to screen out the inappropriate and harmful material served up by algorithms that can affect teens' mental and physical health.
"These platforms know that their algorithms can amplify harmful content, and they're not taking steps to stop it," said Irene Ly, privacy counsel for the nonprofit Common Sense Media. The more teens keep scrolling, the more engaged they become; and the higher the engagement, the more profitable they are to the platforms, she said. "I don't think they have much incentive to change that," she said.

Take Snapchat, which on Tuesday introduced new parental controls in what it calls "Family Center," a tool that lets parents see who their teens are messaging, though not the content of the messages themselves. One catch: both parents and their children must opt in to the service.
Nona Farahnik Yadegar, Snap's director of platform policy and social impact, likens the tool to parents wanting to know who their kids are going out with.
When kids head to a friend's house or meet up at the mall, she said, parents will usually ask, "Hey, how do you know them?" The new tool, she said, aims to give parents "the insight parents really want to have these conversations with their teens while maintaining their teens' privacy and autonomy."
Experts agree that these conversations are important. In an ideal world, parents would regularly sit down with their children and have candid discussions about the dangers and pitfalls of social media and the online world.
But many children use a bewildering array of platforms, all of them constantly evolving, which raises the odds that parents will be expected to master and monitor the controls on multiple platforms, said Josh Golin, executive director of the children's digital advocacy group Fairplay.
"It would be far better to require platforms to make their services safer by design and by default, rather than increasing the workload of already overburdened parents," he said.
The new controls, Golin said, also fail to address myriad existing problems with Snapchat. These range from kids misrepresenting their ages, to the "compulsive use" encouraged by the app's Snapstreak feature, to the cyberbullying made easier by the disappearing messages that remain Snapchat's claim to fame.
Farahnik Yadegar said Snapchat has "strong measures" to deter children from falsely claiming to be over the age of 13. Teens who are over 13 but pretend to be older are given one chance to correct their age.
Detecting such lies is not easy, but the platforms have several ways of uncovering the truth. If most of a user's friends are in their early teens, for example, the user probably is too, even if they said they were born in 1968 when they signed up. Companies also use artificial intelligence to look for age discrepancies; a person's interests can give away their actual age. And, Farahnik Yadegar pointed out, parents might discover that their children fudged their birth date when they try to enable parental controls only to find that their teens aren't eligible.
Child safety and teens' mental health are at the center of criticism of tech companies from both Democrats and Republicans. The states, which have been more aggressive than the federal government in regulating tech companies, are also turning their attention to the issue. In March, several state attorneys general launched a nationwide investigation into TikTok and its possible harmful effects on the mental health of young people.
TikTok is the most popular social app among U.S. teens, according to a new report released Wednesday by the Pew Research Center. The company has said it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. It says features such as time-management tools help young people and parents moderate how much time children spend on the app and what they see. But critics note that such controls are leaky at best.
"It's really easy for kids to get past these features and go off on their own," said Ly of Common Sense Media.
Instagram, owned by Facebook parent company Meta, is the second most popular app among teens, with 62% saying they use it, followed by Snapchat at 59%. Not surprisingly, only 32% of teens reported ever having used Facebook, down from 71% in 2014 and 2015.
Last fall, Frances Haugen, a former Facebook employee turned whistleblower, released internal research concluding that the social network's attention-grabbing algorithms contributed to mental health and emotional problems among teens who use Instagram, especially girls. The revelations brought about some changes: Meta, for example, scrapped plans for a version of Instagram aimed at children under 13. The company also introduced new parental controls and teen well-being features, such as prompting teen users to take a break if they scroll for too long.
Such solutions "are sort of getting at the problem, but they're basically working around it and not getting to its root cause," Ly said.