Meta to blur Instagram messages containing nudity in latest move for teen safety

April 11, 2024 – 1:18 AM PDT

A logo of mobile application Instagram is seen on a mobile phone, during a conference in Mumbai, India, September 20, 2023. REUTERS/Francis Mascarenhas/File Photo

(Reuters) – Instagram will test features that blur messages containing nudity to safeguard teens and prevent potential scammers from reaching them, its parent Meta (META.O) said on Thursday as it tries to allay concerns over harmful content on its apps.


The tech giant is under mounting pressure in the United States and Europe over allegations that its apps are addictive and have fueled mental health issues among young people.

Meta said the protection feature for Instagram’s direct messages would use on-device machine learning to analyze whether an image sent through the service contains nudity.

The feature will be turned on by default for users under 18 and Meta will notify adults to encourage them to turn it on.

“Because the images are analyzed on the device itself, nudity protection will also work in end-to-end encrypted chats, where Meta won’t have access to these images – unless someone chooses to report them to us,” the company said.
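Meta has not published implementation details, but the flow it describes, classifying an image locally and blurring it before display if it is flagged, can be sketched in a few lines. The Python sketch below is purely illustrative: the `nudity_score` stub stands in for Meta's unpublished on-device model, and the threshold value is an arbitrary assumption.

```python
from PIL import Image, ImageFilter

# Arbitrary assumption for illustration; Meta's actual cutoff is not public.
NUDITY_THRESHOLD = 0.8

def nudity_score(image: Image.Image) -> float:
    """Hypothetical stand-in for Meta's on-device classifier.

    In a real client this would run a local ML model shipped with the
    app; no image data would leave the device.
    """
    return 0.0  # stub: always "safe"

def prepare_for_display(path: str) -> Image.Image:
    """Blur the image locally if the on-device classifier flags it."""
    image = Image.open(path)
    if nudity_score(image) >= NUDITY_THRESHOLD:
        # Blurring happens client-side, which is why the check can also
        # work in end-to-end encrypted chats: the server never sees the
        # decrypted image.
        return image.filter(ImageFilter.GaussianBlur(radius=24))
    return image
```

The key design point mirrored here is that both the classification and the blurring run entirely on the recipient's device, so the feature does not require Meta's servers to inspect message contents.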

Unlike Meta’s Messenger and WhatsApp apps, direct messages on Instagram are not encrypted but the company has said it plans to roll out encryption for the service.

Meta also said it was developing technology to help identify accounts potentially engaging in sextortion scams, and that it was testing new pop-up messages for users who might have interacted with such accounts.

In January, the social media giant said it would hide more content from teens on Facebook and Instagram, adding this would make it more difficult for them to come across sensitive topics such as suicide, self-harm and eating disorders.

Attorneys general of 33 U.S. states, including California and New York, sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.

In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.

Reporting by Granth Vanaik in Bengaluru; Editing by Aditya Soni and Alan Barona
