Apple wants to snoop on iPhone photos

By Jefferson Graham

Come this fall, the photos on your iPhone will no longer be private to just you.

The photos are still yours, but Apple’s artificial intelligence will be looking through them constantly.

Apple’s algorithm will be inspecting them, looking out for potential child abuse and nude photo issues. NSA whistleblower Edward Snowden, from his exile in Russia, sees it this way: Apple wants to “transform your iPhone into a spyPhone.”

On one hand, this is a great thing. Just ask former Rep. Katie Hill, who had to resign her post after nude smartphone photos she never wanted made public were shared. Revenge porn is a terrible side effect of the digital revolution.

And anyone using their phones to exploit children and swap child porn is a sick puppy that deserves the book thrown at them.

My take: It’s not good and could lead to more Big Tech and government inspection of our property, our photos. The algorithm will be making decisions about your images, and there’s no way to put a positive spin on that.

That cute little baby picture of your son or daughter residing on your smartphone could land you in trouble, even though Apple says it won’t.

Once Apple starts inspecting photos in the name of protecting children, when do governments and law enforcement start chiming in for their own interests, whether that be political protest or theft? What happens to your privacy then?

Those are just some of the nagging questions that linger from the Apple announcement, which lands with a thud, since it comes from a company that has made such a big deal about being the pro-privacy firm, the anti-Facebook and Google. Those two firms, of course, are known for tracking your every move to help sell more advertising.

The changes become effective with the release of updated operating systems: iOS 15 for iPhones and updates for the iPad, Apple Watch and Mac computers. If the changes concern you, don’t upgrade. But eventually you’ll lose this battle and find that your devices won’t work unless you do the upgrade. Sorry, folks.

Snowden was joined in his criticism of the new Apple policy by the head of Facebook’s WhatsApp message service, who said he has no plans to start snooping on our photos. 


Let’s dive in a little closer:

iMessages

If you send a text message from an iPhone, iPad, Apple Watch or Mac and have a family iCloud account, Apple will have new tools “to warn children and their parents when receiving or sending sexually explicit photos.” Caveat: parents have to sign up for these tools; they’re not automatically inflicted upon us. And Apple defines children as anyone 12 or younger.

Pro: No more bullying when kids do the wrong thing and allow themselves to be photographed in the nude. These images always seem to cause problems beyond the subject and the photographer; too many stories are out there of them being shared and going viral.

Con: Apple is inspecting the contents of the photos on your phone. How does it know the actual age of the participants in a photo? Imagine the fun in the household when your parents start screaming because Apple has told them about the nudes on your phone. And once you start down this slippery slope, where does it go from here? Nudes today, then what? Will foreign governments want the right to inspect photos for other reasons? And let’s face it, if blocked by Apple, the kids will go elsewhere to share the photos.

Solution: This should be obvious, but don’t shoot or share nudes on your iPhone. It can only get you into trouble.

Child Abuse Monitoring

With the software update, photos and videos stored on Apple’s iCloud online backup will be monitored for potential child porn and, if detected, reported to authorities. Apple says it can detect this by using a database of child abuse “image hashes,” as opposed to inspecting the image itself. Apple insists that its system is near foolproof, with “less than a one in one trillion chance per year of incorrectly flagging” a given account.
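For the technically curious, here’s a rough sketch, in Python, of what “matching hashes instead of looking at pixels” means. To be clear, this is my own illustration, not Apple’s code: Apple’s real NeuralHash is a proprietary perceptual hash, the comparison is wrapped in cryptography, and the hash function, folder name, database entries and threshold below are made-up stand-ins.

```python
# A minimal, hypothetical sketch of hash matching, NOT Apple's actual system.
# Apple's "NeuralHash" is a proprietary perceptual hash and the comparison
# happens under cryptographic blinding on the device. The hash function,
# folder name, database entries and threshold here are stand-ins, used only
# to make the idea of "compare fingerprints, not pixels" concrete.

import hashlib
from pathlib import Path

# Hypothetical database of known flagged-image fingerprints (hex digests).
KNOWN_HASHES = {
    "placeholder_digest_1",
    "placeholder_digest_2",
}

# Apple says an account is only flagged after crossing a threshold of
# matches; the exact number below is made up for illustration.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Fingerprint a file's bytes. A real system would use a perceptual
    hash so that resized or recompressed copies still match."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_dir: Path) -> int:
    """Count how many photos in a folder match the known-hash database."""
    return sum(1 for p in photo_dir.glob("*.jpg") if image_hash(p) in KNOWN_HASHES)


if __name__ == "__main__":
    matches = count_matches(Path("photos"))
    if matches >= MATCH_THRESHOLD:
        print("This account would be queued for manual review.")
    else:
        print(f"{matches} match(es); below the review threshold.")
```

The point of the fingerprint approach is that the system never has to “look at” a new photo the way a person would; it only asks whether the photo’s hash already sits in a list of known images.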

Where have I heard that one before? Oh yeah, Face ID, which Apple said would be a safer way to unlock the phone, with the odds of a random stranger instead of you being able to unlock it at approximately 1 in a million. That may be, but all I know is that since the advent of Face ID, the phone rarely, if ever, recognizes me, and I have to type in the passcode all day long instead.

Pro: Smartphones have made it easier for the mentally sick to trade child porn, and Apple taking a stand will make it harder for these people to share the images.

Con: Apple’s announcement is noble, but there are still the Pornhubs and worse of the world. Apple apologists say the company isn’t really inspecting your photos, but using these “hashes” to check the contents instead. Whatever it is, I’m with Snowden on this one: it’s turning your iPhone into a spyPhone, and can only lead to bad things. After a manual review, Apple says it will disable your account and send the info off to authorities. Say you did get flagged: who wants to get a note with a subject header about child abuse? And hear from your local police department as well? Once Apple flags you, you can file an appeal and try to get your account reinstated. Whoa! Oh boy.

Solution: I’m not a fan of iCloud as it is, since there’s a known issue with deleting: if you kill a synced photo on your iPhone or iPad, it says goodbye to iCloud too. I prefer SmugMug, Google Drive and other avenues for safer online backup. With what Apple is doing to inspect photos, whether that be good, bad or indifferent, what good could come of uploading anything there?

My take: These are well-intentioned ideas aimed at solving societal ills, but they are doomed, an example of Big Tech going too far. Technology makes mistakes, innocent people will be tarred (ever hear of someone innocent getting arrested?) and Apple has no right to snoop on my photos. Period. (Apple’s side of the story is explained in a new FAQ the company posted Monday.)

So what to do? Let Apple know what you think of the new program. Scream loudly about it on Twitter. Resist the nag messages from Apple to update your software in the fall. Don’t shoot nudes on your iPhone. Store your photos online and make lots of backups, but not on iCloud.

This column was originally published in Jefferson Graham’s Photowalks newsletter. Subscribe here: http://jeffersongraham.substack.com
