Meta has announced it will deploy artificial intelligence to scan photos and videos on Facebook and Instagram for visual clues that a user may be under 13 years old, as the company faces mounting legal and regulatory pressure over child safety on its platforms.
The system analyses general visual characteristics such as height and bone structure to estimate a user’s approximate age. Meta was explicit that the technology does not constitute facial recognition. “Our AI looks at general themes and visual cues, for example height or bone structure, to estimate someone’s general age; it does not identify the specific person in the image,” the company said in a blog post. The visual analysis operates alongside existing tools that scan text across profiles, including posts, comments, bios, and captions, for contextual signals such as birthday celebrations or references to school grades.
If the system determines that a user may be underage, their account will be deactivated. The user must then complete Meta’s age verification process to prevent permanent deletion. The visual analysis system is currently operating in select countries, with a broader rollout planned. Meta also said it intends to expand the technology to Instagram Live and Facebook Groups in the future.
The announcement also included an expansion of Meta’s Teen Accounts feature, which places younger users in a more restrictive account experience with additional safeguards: direct messages allowed only from people they follow or are already connected to, harmful comments hidden, and accounts set to private by default. On Instagram, Teen Accounts are being expanded to all 27 European Union countries and to Brazil; they are also coming to Facebook in the United States for the first time, followed by the UK and EU in June.
The timing of the announcement is notable. It comes weeks after a New Mexico jury ordered Meta to pay $375 million in civil penalties for misleading consumers about platform safety and putting children at risk, and required fundamental changes to how its platforms operate. Meta has since threatened to shut down its social media services in New Mexico in response to the ruling. The company faces numerous additional child-safety lawsuits across the country, making Tuesday’s announcement part of a broader pattern of reactive measures as legal and public pressure intensifies.