Does iOS 15 Scan Photos? [Explained]


Like clockwork, a new version of iOS has dropped. The latest, iOS 15, not only improves some of iOS 14’s staple features but also brings a bunch of new ones. As always, there has been some controversy surrounding the new OS. This time, it comes in the form of Child Safety, a new addition that effectively lets Apple scan the photos in your gallery and has ruffled quite a few feathers in the industry.

Today, we will explain how this technology works and whether Apple’s iOS 15 really scans your private photos.

Related: iOS 15 Child Safety Explained

Will iOS 15 scan photos in your gallery?

Child Safety in iOS 15 is a newly introduced feature meant to stop the distribution of Child Sexual Abuse Material (CSAM). To make it work, every device running iOS 15 scans for problematic content on the device itself and notifies the authorities if necessary. So, technically, yes, iOS 15 does scan your photos under certain circumstances.

However, the situation, at least from Apple’s perspective, is a lot less dire than most are making it out to be. In fact, Apple believes its CSAM scanning is a lot more secure than the techniques its competitors are using.

Related: The best Android games for kids [Age group-wise]

How does photo scanning work in iOS 15?

If you go by the sensationalized reports, you would think that Apple employees might actually be looking through your private photos one by one. Yes, we understand that these wild reports originated from Apple’s overly simplistic feature announcement, but the company has since added a lot more clarity to its Child Safety tech, or CSAM scanning.

Contrary to perception, Apple does not scan and look through the private photos in your gallery. Instead, it downloads a database of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) and matches it against the photos in your gallery.

The downloaded database comes in the form of numerical hashes (strings of numbers) rather than actual images, which means no one ever sees the offending images on their iPhone. If a match is found, the National Center for Missing and Exploited Children (NCMEC) is notified, and a human then reviews the match personally. Finally, if the reviewer finds the image problematic, appropriate action is taken against the perpetrator.
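To make the matching idea concrete, here is a minimal Swift sketch of hash-based matching. It is purely illustrative and not Apple’s implementation: Apple uses its NeuralHash perceptual hash plus cryptographic blinding rather than SHA-256, and the function names below (photoHash, loadKnownHashDatabase, isFlagged) are hypothetical.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only, not Apple's implementation. Apple's system uses a
// neural perceptual hash (NeuralHash) plus cryptographic blinding; the names
// here are hypothetical and SHA-256 stands in for the real perceptual hash.

typealias PhotoHash = Data

// Placeholder for the on-device hashing step.
func photoHash(forFileAt url: URL) -> PhotoHash? {
    guard let bytes = try? Data(contentsOf: url) else { return nil }
    return Data(SHA256.hash(data: bytes)) // stand-in for a perceptual hash
}

// The database ships as opaque hashes, so no CSAM imagery ever reaches the phone.
func loadKnownHashDatabase() -> Set<PhotoHash> {
    // Hypothetical loader; the real database is bundled with iOS.
    return []
}

// A photo is flagged only if its hash appears in the known-hash set.
func isFlagged(_ photoURL: URL, against knownHashes: Set<PhotoHash>) -> Bool {
    guard let hash = photoHash(forFileAt: photoURL) else { return false }
    return knownHashes.contains(hash)
}
```

The key property the sketch preserves is that only hashes are compared, so neither the user nor the device ever handles the known CSAM imagery itself.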

Apple’s clarification confirms that no Apple employee scans and reviews your photos, at least not right away. The NCMEC hash database acts as the first layer of verification. If, and only if, a match is found is the photo sent over to NCMEC, where it is verified by a human.

Related: Will iPhone 8 Get iOS 15? When Does Apple Support End?

Can you stop iOS 15 from scanning your photos?

Yes, stopping iOS 15 from scanning your photos is rather straightforward. All you have to do is stop uploading your photos to iCloud (that is, turn off iCloud Photos in the Settings app) and you are good to go. Apple has confirmed that the check is performed only if you have opted to upload your photos to iCloud. As of 2021, iCloud has over 170 million premium users, meaning a sizable chunk of the smartphone population is affected by the CSAM scanning functionality.
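For illustration, here is a rough Swift sketch of what "the check only happens on upload" means in practice. Everything here is hypothetical (the Photo type, uploadToICloud, keepLocalOnly); Apple’s internal pipeline exposes no public API, so this only models the behavior Apple has described.

```swift
import Foundation

// Conceptual sketch of the opt-out: per Apple, the hash check runs only on the
// iCloud upload path, so photos that never upload are never matched.

struct Photo { let id: String }

func uploadToICloud(_ photo: Photo, hashMatcher: (Photo) -> Bool) {
    // Matching happens only as part of the upload, per Apple's description.
    let matched = hashMatcher(photo)
    // ...attach the result (a "safety voucher" in Apple's terminology) and upload...
    _ = matched
}

func keepLocalOnly(_ photo: Photo) {
    // No upload, no scan: a local-only library is never checked.
}
```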

Related: Where To Find All the New iOS 15 Features

Is iOS 15 compromising your privacy?

The doubters have their reservations about Apple’s new policy, and rightly so. Apple has always been one of the strongest advocates of privacy, a reputation that seemingly takes a hit with its CSAM scanning implementation. However, if Apple’s justification is to be believed, there is no cause for concern whatsoever. In fact, the company argues that iOS 15’s CSAM implementation is more private than other implementations out there.

Unlike services that scan your entire cloud library on their servers, Apple promises that it only matches your photos against the on-device CSAM hash database. Only if a match is found does your photo leave your device and become viewable by another human. Apple is also making the CSAM database auditable, hoping to put most of the doubters’ minds at ease.
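Here is a simplified Swift sketch of that idea, assuming a basic voucher-and-threshold model. It is not Apple’s actual protocol, which relies on encrypted safety vouchers and threshold secret sharing, but it illustrates why photos that never match never become viewable to anyone.

```swift
import Foundation

// Conceptual sketch of the threshold idea; the types and numbers here are a
// simplified model, not Apple's cryptographic design.

struct MatchVoucher {
    let photoIdentifier: String // which library item matched the hash database
}

// Only matched photos ever produce a voucher; everything else stays on-device.
func vouchers(forMatches matches: [String: Bool]) -> [MatchVoucher] {
    matches.compactMap { id, didMatch in
        didMatch ? MatchVoucher(photoIdentifier: id) : nil
    }
}

// Apple has publicly cited an initial threshold on the order of 30 matches
// before any human review can even begin.
let reviewThreshold = 30

func shouldEscalateForHumanReview(_ collected: [MatchVoucher]) -> Bool {
    collected.count >= reviewThreshold
}
```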

Many still believe that governments could use iOS 15’s CSAM scanning as a backdoor to look for wanted fugitives, target certain demographics, and more. Apple has reassured users that iOS 15’s image scanning is built only for CSAM databases and cannot be configured otherwise.

Apple has landed itself in a tough spot with CSAM scanning, and it is anybody’s guess whether the most valuable company in the world will be able to get out of it.

Related: How To Fix Safari Issues on iOS 15

Have Child Safety features gone live in iOS 15?

Since announcing the features, Apple has faced a lot of criticism over its implementation, so much so that it has not rolled out the Child Safety feature set yet. Apple cannot afford to delay it for long, though, as child safety advocates will eventually start to get antsy. Do not be surprised if Apple holds a special event just to explain the CSAM implementation in detail.


