As concerns grow about the harmful effects that social media can have on teenagers, platforms like Snapchat, TikTok and Instagram are introducing measures that they say will make their services safer and more age-appropriate.
But those new features rarely address the underlying problem: the algorithms that push content that can drag anyone, not just teenagers, toward harmful material.
The tools in question do help to some extent. For example, they prevent strangers from sending messages to minors. But they also share serious shortcomings, starting with the fact that teens can get around the restrictions by lying about their age.
The platforms, meanwhile, leave monitoring up to parents. And they do little or nothing to screen out the inappropriate or harmful material served up by algorithms that can affect teens' mental and physical well-being.
“These platforms know that their algorithms can sometimes magnify harmful content, and they don’t take steps to prevent it,” said Irene Ly, a privacy advisor for the nonprofit Common Sense Media.
The more time teens spend online, the more hooked they get, and the more hooked they are, the more money the platforms make, she said. “I don’t think they have much incentive to change that.”
As an example, consider Snapchat, which on Tuesday introduced new parental controls through what it calls a “Family Center,” a tool that lets parents see who their children are messaging, though not the content of those messages. One catch: both parents and children have to sign up for the service.
Nona Farahnik Yadegar, director of platform policy and social impact at Snap, says it’s like when parents want to know who their child is dating.
If kids go to a friend’s house or meet friends at a mall, she said, parents ask, “Who are you going to see?” or “How do you know each other?” The new tool, she said, “allows parents to have these types of conversations, while preserving the privacy and autonomy of the adolescent.”
Those conversations are important, experts say. In an ideal world, parents would have ongoing chats with their children about social media and the dangers lurking on the internet.
But many kids use a surprising number of platforms, all of which are constantly evolving, and parents can hardly be knowledgeable enough to navigate that world, according to Josh Golin, CEO of Fairplay, an organization that advocates for children’s safety online.
“It would be much better to require platforms to take action rather than add to the burden on parents who are already overburdened,” Golin noted.
The new controls, he added, also don’t solve Snapchat’s many existing problems, from the fact that kids can lie about their age to the “compulsive use” the platform encourages. Nor does it help that messages disappear after a short time, which facilitates cyberbullying.
Farahnik Yadegar said Snapchat has taken “strong measures” to prevent kids under 13 from claiming to be older. Those caught lying about their age are banned from the platform immediately.
Teens who are 13 or older but exaggerated their age are given a chance to correct it.
Detecting those lies is not easy, but the platforms have several ways to do it. If all of a user’s friends are much younger, that user has likely exaggerated their age. Companies also use artificial intelligence to detect such inconsistencies.
The interests a user expresses may also reveal their true age. And parents can catch their children lying about their age if they try to activate the available controls and cannot find their child when entering the child’s true age.
In March, state attorneys general launched a nationwide investigation of TikTok to study the platform’s possible harmful impact on children’s mental health.
TikTok is the most popular app among US teens, according to a report released Wednesday by the Pew Research Center, which found that 67% use the Chinese platform.
The company says it promotes age-appropriate user experiences and notes that some features, like direct messaging, are not available to younger users. It says it offers a tool that lets parents monitor how much time a child spends on the platform and what they see.
But there are those who say that these measures are not enough. “It’s easy for kids to get around these controls and do whatever they want,” said Common Sense Media’s Ly.
Instagram, owned by Facebook parent company Meta, is the second most popular app among kids, according to Pew: 62% use it, followed by Snapchat at 59%. Only 32% said they use Facebook, down from 71% in 2014 and 2015.
Last year, former Facebook employee Frances Haugen revealed that the company knew its algorithms were contributing to mental health problems among many children who use Instagram, especially girls.
That revelation prompted several measures, but Ly said they “dance around the issue without attacking the root of the problem.”