Instagram will automatically blur nude images in direct messages sent to users under 18 by default and encourage adult users through a notification to turn on the feature, the company announced Thursday.
The update aims both to protect users from seeing unwanted nudity in their direct messages and to guard against sextortion scammers who may send nude images to coax others into sending their own in return, Instagram said in a blog post.
It comes as Instagram’s parent company Meta, along with other social media platforms, has faced pressure to put more controls in place to protect teens online from potential harms, including sextortion scams, in which someone threatens to expose sensitive images unless the victim meets certain demands.
The update will automatically blur nude photos behind a warning screen that says the “photo may contain nudity” before a user chooses whether to view it. The app will also send users a message telling them “don’t feel pressured to respond,” along with options to block the sender and report the chat.
When the nudity protection feature is turned on, users who send images containing nudity will also see a message reminding them to “take care when sharing sensitive photos,” and they will be able to “unsend these photos” if they change their mind.
A similar notification will pop up for anyone who tries to forward a nude image they’ve received, urging them to reconsider before they share.
Read more in a full report at TheHill.com.