Instagram tests new ‘Nudity Protection’ feature to protect users


Instagram is testing another way to protect users from unwanted content exposure in DMs, with a new nudity filter that would blur likely nudes in your IG Direct messages.
As outlined in this feature overview, uncovered by app researcher Alessandro Paluzzi, the new ‘nudity protection’ option would enable Instagram to activate the nudity detection element in iOS, released late last year, which scans incoming and outgoing messages on your device to detect potential nudes in attached images.
When detected, the system can then blur the image, as Instagram notes. Significantly, that means Instagram, and parent company Meta, wouldn’t be downloading and analyzing your messages for this feature; it would all be done on your device.
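The on-device flow described here can be sketched in simple terms: a local classifier flags a suspect image, and the blur is applied locally so the original never leaves the device. The sketch below is purely illustrative; `looks_explicit` is a hypothetical stand-in (real detection would use a trained model in the OS), and the image is modeled as a plain 2D grid of brightness values.

```python
def looks_explicit(image):
    """Hypothetical stand-in for the OS-level classifier.

    A real implementation would run a trained on-device model; this
    placeholder just flags images whose mean brightness exceeds an
    arbitrary threshold, to make the control flow testable.
    """
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels) > 200


def box_blur(image):
    """Apply a 3x3 box blur locally (no network involved)."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out


def filter_incoming(image, classifier=looks_explicit):
    """Blur an incoming image on-device when the classifier flags it."""
    return box_blur(image) if classifier(image) else image
```

The key point the sketch captures is architectural: both detection and blurring happen on the recipient's device, so the service never has to receive or inspect the image itself.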

Of course, that still sounds concerning: your OS is reading your messages and filtering them based on their content. Apple has worked to reassure users that it isn’t downloading the actual images, and that this is done via AI and data matching, which doesn’t trace or track the details of your personal interactions.

Still, you’d imagine that, somewhere, Apple is keeping count of how many images it detects and blurs through this process, which could mean it has data on how many nudes you’re likely being sent. Not that that would matter.

Either way, the potential benefit may well outweigh any such concerns (which will probably never surface anyway), and it could be another significant safety measure for Instagram, which has been working to roll out more protections for younger users.
Last October, as part of the Wall Street Journal’s Facebook Files exposé, leaked internal documents were published showing that Meta’s own research had pointed to potential concerns with Instagram and the harmful mental health impacts it can have on teen users.
In response, Meta has rolled out a range of new safety tools and features, including ‘Take a Break’ reminders and updated in-app ‘nudges’, which aim to redirect users away from potentially harmful topics. It has also expanded its sensitive content defaults for young users, with all account holders under the age of 16 now placed into its most restrictive exposure category.

