Protecting Kids Online: Social Media Platforms Face Tougher Age Checks

UK regulators demand that popular social media apps like Instagram, Snapchat, TikTok, YouTube, and Roblox strengthen age verification to prioritize child safety.
UK regulators are calling on leading social media platforms to implement more robust age verification systems to keep users under the age of 13 off their services. Instagram, Snapchat, TikTok, YouTube, and Roblox are among the tech giants targeted by the new directive, with authorities saying these companies are not doing enough to put children's wellbeing at the heart of their product design.
The push for tougher age checks comes amid growing concerns over the impact of social media on young users. Researchers have linked excessive social media use to increased anxiety, depression, and body image issues in children and adolescents. Additionally, predators and cyberbullies often target minors on these platforms, exposing them to harmful content and interactions.
Source: BBC News
