BBC News investigation

Young Instagram users could still be exposed to "serious risks" even if they use the new teen accounts brought in to provide more protection and control, an investigation by campaigners suggests.
Researchers behind a new report said they were able to set up accounts using false birthdays, and that they were then shown sexualised content and hateful comments, and recommended adult accounts to follow.
Meta, which owns Instagram, says its new accounts have "built-in protections" and that it shares "the goal of keeping teens safe online".
The research, by online child safety charity the 5Rights Foundation, was released as Ofcom, the UK regulator, prepares to publish its children's safety codes.
The codes will set out the rules platforms will have to follow under the Online Safety Act. Platforms will then have to demonstrate that they have systems in place to protect children.
That includes robust age checks, safer algorithms that do not recommend harmful content, and effective content moderation.
Instagram teen accounts were introduced in September 2024 to offer new protections for children and create what Meta called "peace of mind for parents".
The new accounts were designed to limit who could contact users and to reduce the amount of content young people could see.
Existing users would be moved onto the new accounts, and those signing up for the first time would automatically be given one.
But the 5Rights Foundation researchers were able to set up a series of fake teen accounts using false birthdays, with no additional checks from the platform.
They found that on signing up they were immediately suggested adult accounts to follow and message.
Instagram's algorithms, they claim, "still promote sexualised imagery, harmful beauty ideals and other negative stereotypes".
The researchers said their teen accounts were also recommended posts "filled with significant amounts of hateful comments".
The charity also raised concerns about the addictive nature of the app and teens' exposure to commercial and sponsored content.
Baroness Beeban Kidron, founder of the 5Rights Foundation, said: "This is not a teen environment."
"They are not checking age, they are recommending adults, they are putting them in commercial situations without letting them know and it's deeply sexualised."
Meta said the accounts "provide built-in protections for teens limiting who's contacting them, the content they can see, and the time spent on our apps".
"Teens in the UK have automatically been moved into these enhanced protections and under-16s need a parent's permission to change them," it added.

In a separate development, BBC News has also learned of the existence of groups dedicated to self-harm on X.
The groups, or "communities" as they are known on the platform, contain tens of thousands of members sharing graphic images and videos of self-harm.
Some of the users involved in the groups appear to be children.
Becca Spinks, an American researcher who discovered the groups, said: "I was absolutely floored to see a community of 65,000 members."
"It was so graphic, there were people on there taking polls on where they should cut next."
X was approached for comment, but did not respond.
However, in a submission to an Ofcom consultation last year, X said: "We have clear rules in place to protect the safety of the service and the people using it."
"In the UK, X is committed to complying with the Online Safety Act," it added.