In context: It’s a sad fact that many women on social media will have unintentionally viewed an unsolicited nude photo in their DMs at some point. It’s a problem that’s especially prevalent on Instagram, but a new tool could help prevent such incidents, and without compromising the receiver’s privacy.
Instagram parent Meta confirmed to The Verge that it is developing a nudity protection feature for the photo-and-video sharing platform. Researcher Alessandro Paluzzi tweeted a screengrab of the technology, which “covers photos that may contain nudity in chat.” People will still be able to view these images if they choose to.
#Instagram is working on nudity protection for chats
ℹ️ Technology on your device covers photos that may contain nudity in chats. Instagram CAN’T access photos. pic.twitter.com/iA4wO89DFd
— Alessandro Paluzzi (@alex193a) September 19, 2022
Meta emphasized that the technology does not allow the company or third parties to access users’ private messages. “We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive,” said Meta spokesperson Liz Fernandez.
Meta compared the technology to its Hidden Words feature, launched last year. That feature automatically moves DMs containing offensive words and phrases chosen by the user, such as harassing or racist content, into a Hidden Folder rather than deleting them outright. It also filters DM requests that are likely to be spam or low-quality.
Last year, a report from the Center for Countering Digital Hate (CCDH) found that Instagram fails to act on 9 out of 10 abusive accounts and that cyberflashers account for a disproportionate amount of image-based abuse of high-profile women on the platform.
Cyberflashing is already illegal in France and Ireland, and it will become a criminal offense in the UK if parliament passes the Online Safety Bill. Most of the US does not consider cyberflashing a crime, though it is a misdemeanor in Texas, and both chambers of California’s legislature voted unanimously last month to pass a bill that would allow those who receive unsolicited sexually graphic material by text, email, app, or other electronic means to sue the sender.
The California bill would allow recipients to recover at least $1,500 and as much as $30,000 from senders of obscene material who are 18 or older, as well as punitive damages and attorney’s fees. Victims could also seek court orders blocking such behavior in the future, reports NBC Los Angeles.