Liv McMahon
Technology reporter
Meta is extending teen accounts – what it calls its age-appropriate experience for under-18s – to Facebook and Messenger.
The system places younger teens on the platforms under stricter default settings, with parental permission required to livestream or to turn off image protections for messages.
It was first introduced on Instagram last September, which Meta says has "fundamentally changed the experience of teens" on the platform.
But campaigners say it is unclear what difference teen accounts have actually made.
"Eight months after the rollout of teen accounts on Instagram, we have heard nothing from Mark Zuckerberg about whether they have been effective, or even what sensitive content they address," said Andy Burrows, chief executive of the Molly Rose Foundation.
He added it was "appalling" that parents still did not know whether the settings were preventing their children from being exposed to "inappropriate or harmful content".
But Drew Benvie, chief executive of social media consultancy Battenhall, said it was a step in the right direction.
"For once, big social is fighting for the leadership position not for the most engaged, but for the safest, teen user base," he said.
However, he also noted there was a risk, as with all platforms, that teens could "find a way around safety settings".
The wider rollout of teen accounts begins in the UK, US, Australia and Canada from Tuesday.
Companies offering services popular with children have faced pressure to introduce parental controls or safety mechanisms to make their experiences safer.
In the UK, they also face legal requirements under the Online Safety Act to prevent children from encountering harmful and illegal content on their platforms.
Roblox recently enabled parents to block specific games or experiences on the hugely popular platform as part of its controls.
What are teen accounts?
How teen accounts work depends on the age a user declares.
Users aged 16 to 18 will be able to toggle off default safety settings, such as having their account set to private.
But 13- to 15-year-olds must get parental permission to turn off these settings – which can only be done by adding a parent or guardian to their account.
Meta says it has moved at least 54 million teens worldwide into teen accounts since their introduction in September.
It says 97% of 13- to 15-year-olds have also kept its built-in restrictions.
The system relies on users being truthful about their age when they set up accounts – though Meta uses methods such as video selfies to verify their information.
It said in 2024 it would begin using artificial intelligence (AI) to identify teens who may be lying about their age, in order to move them back into teen accounts.
Findings published by UK media regulator Ofcom in November 2024 suggested 22% of eight- to 17-year-olds lie that they are 18 or over on social media apps.
Some teenagers have told the BBC it is still "so easy" to lie about their age on platforms.
Meta will inform under-18s on Facebook and Messenger that their account will become a teen account via in-app notifications.
In the coming months, younger teens will also need parental permission to go live on Instagram or to turn off nudity protection – which blurs suspected nude images in direct messages.
Concerns about children and teens receiving unwanted nude or sexual images, or feeling pressured into sharing them in potential sextortion scams, have prompted calls for Meta to take tougher measures.
Professor Sonia Livingstone, director of the Digital Futures for Children centre, said Meta's expansion of teen accounts could be a welcome move amid "a growing appetite among parents and children for age-appropriate social media".
But she said questions remained over the company's broader protections for young people, "as well as its own data and commercial practices".
"Meta must be held accountable for its effects on young people, whether or not they use a teen account," she added.