On social media, we open ourselves up to the world. Unfortunately, some people in that world have malicious intent. Image-focused platforms like Instagram are particularly vulnerable to social media scams. To combat the rising issue of image extortion, Instagram is rolling out new safety features as of today.
Instagram Is Adding New Safety Features to Prevent Image Abuse
Announced today via Meta’s Newsroom, as part of a collaboration with the National Center for Missing and Exploited Children (NCMEC) and Thorn, Instagram will now put new measures in place to prevent crimes like “sextortion” and “intimate image abuse.” These scams often target younger, teen users and involve threatening to distribute an intimate photo unless payment is received; in other words, extortion.
Instagram’s new measures include protecting disappearing images from screenshots, monitoring follow requests, and censoring nudity.
Disappearing Image Protection
A feature rolling out “soon” to all Instagram users will block screenshots and screen recordings of ephemeral images. These are photos or videos sent with the “view once” or “allow replay” feature within Instagram DMs or the Messenger app.
Previously, if you sent a disappearing message to someone, they could capture it with a screenshot, and you would be notified that they did so. However, if you try to take a screenshot of a disappearing image now, you’ll just capture a black screen with an Instagram notification letting you know you “can’t screenshot or record this.”
Follow Requests and Follower Lists
Starting today, Instagram will also flag profiles with qualities characteristic of scams, such as newly created accounts. When those profiles send follow requests to teen users, Instagram will block the request or move it to the spam folder. For any accounts that slip through the cracks, Instagram may also send teens an alert, for example when the user they’re talking to is located in a different country.
Additionally, because follower and following lists have previously been leveraged in extortion tactics, Instagram will also hide those lists from accounts it flags for “scammy behavior.”
Nudity Censorship for Teens
Following Instagram’s roll-out of Teen Accounts last month, the platform will now apply one of its beta safety measures to all Teen Accounts and offer it as an optional feature to all users worldwide. The nudity detection feature blurs images in Instagram DMs that its algorithm flags as containing nudity.
Additional Crisis Resources
Safety features can only go so far, and unfortunately, criminals will find ways to work around such protections. For this reason, Instagram is providing additional resources, such as integrating the Crisis Text Line for US users to reach out to when they suspect a scam.
However, a resource like a text line only works when people know to ask for help. To address the embarrassment or avoidance that can keep people from reaching out, Instagram has worked with the NCMEC and Thorn to create an educational campaign, including a video documenting red flags and best practices.
Sadly, it seems this vicious crime is on the rise, with “reports of online enticement increasing by over 300% from 2021 to 2023,” according to the NCMEC. Of course, social media is ripe for scams, but as long as you know what to look for, you can typically avoid them. Education is key, and I’m pleased to see a social media heavyweight like Meta putting effort into not only features but also resources to equip its users with knowledge.