Well, we suppose it's a sign of the times, but anyway… a new app, appropriately called Nude, claims to use technology that can identify nude pictures stored on your smartphone, group them, and hide them from public view, all without your help.
Currently, the app is only available for iOS devices, but an Android version is in development, according to the team behind Nude.
Automating the process of removing potentially embarrassing photos from your phone might sound promising, but is it safe?
Well, you decide.
The app works by examining your photos for sensitive material using algorithms designed just for that purpose. Nude then removes flagged photos from your camera roll and iCloud storage and moves them into a PIN-protected vault stored locally within the app.
For iOS 11 users, the entire process happens on the device: no outside service ever sees your photos, since the app relies on its built-in machine learning.
That is, unless you use iOS 10. On iOS 10 and earlier, the app uses Amazon Rekognition, which means the photos are, if only briefly, sent to the cloud, according to DIY Photography.
So, if you're cool with letting this app's cloud AI potentially look at your candids (or someone else's, for that matter), then this app is probably a solution for you… if you also happen to have so many nudes on your smartphone that you can't possibly be bothered to catalog them manually.
To use Nude, new subscribers will need to sign up for the service. An annual subscription costs $10.
Nude photos are not the app's only specialty – it also protects sensitive documents and materials stored on your phone, such as driver's licenses, credit cards, and other personal information.
The app has a built-in camera just for this functionality. Nude has other security measures, too: if someone tries to access the app with the wrong PIN, it takes a picture of them with the front-facing camera.
Reactions to the app are mixed, with Gizmodo's Melanie Ehrenkranz complaining that the app lacks a basic understanding of human anatomy, seemingly classifying innocent photos as NSFW images in need of archiving.
Ehrenkranz is pretty unequivocal in her criticism, writing “after letting Nude troll through my camera roll, I’m not convinced this algorithm has ever seen a naked body.”
According to Ehrenkranz, she let the app analyze over two thousand images, which took it approximately thirty minutes. While she didn't have many explicit images on her phone, the app nonetheless flagged images seemingly at random – classifying memes and pictures of Pokémon as in need of top-secret treatment.
Addressing this discrepancy, app creators Jessica Chiu and Y.C. Chen told Gizmodo in an email: “When it comes to the sensitivity of nude detection, we tried to play it safe…There will always be some borderline false positive, and we are leaning towards catch-them-all rather than failing to detect some sensitive content. With that being said, we do recommend all our users update their iPhone to iOS 11 before installing our app. CoreML has proven to be the most accurate when running our ML model, unfortunately, Apple makes it so that CoreML would only work on iOS 11.”
Of course, the creators' promise of improvement over time might not be enough to convince a lot of people to fork over a $10 annual subscription while the service is still in its infancy. Still, if you're somebody who stores a lot of personal photos on your smartphone, it might be worth a look.
You can download Nude for iOS or visit the app's website to learn more.