Like clockwork, a new version of iOS has dropped. The latest, iOS 15, not only improves some of iOS 14’s staple features but also brings a bunch of new stuff. As always, there has been some controversy surrounding the new OS. This time, it comes in the form of Child Safety. This new addition practically paves the way for Apple to scan photos in your gallery, which has ruffled quite a few feathers in the industry.
Today, we will explain how this technology works and whether Apple’s iOS 15 scans your private photos.
Related: iOS 15 Child Safety Explained
Will iOS 15 scan photos in your gallery?
Child Safety in iOS 15 is a newly introduced feature that is meant to stop the distribution of Child Sexual Abuse Material (CSAM). To make it work, iOS 15 devices scan for problematic content on every user’s device and notify the authorities if necessary. So, technically, yes, iOS 15 does scan your photos under certain circumstances.
However, the situation, at least from Apple’s perspective, is a lot less dire than most are making it out to be. Apple believes its CSAM scanning is a lot more secure than the techniques its competitors use.
Related: The best Android games for kids [Age group-wise]
How does photo scanning work in iOS 15?
If you go by the sensationalized reports, you would think that Apple employees are actually looking into your private photos one by one. We understand that these wild reports originated from Apple’s overly simplistic feature announcement, but the company has since clarified how the Child Safety tech, or CSAM scanning, actually works.
Contrary to popular perception, Apple does not look into the private photos in your gallery. Instead, it downloads a database of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) and matches it against the photos in your gallery.
The downloaded database is stored as strings of numbers, which means no one ever sees the actual images on their iPhone. If a match is found, NCMEC is notified, and a human then reviews the flagged photo personally. Finally, if the reviewer finds the image problematic, appropriate action is taken against the perpetrator.
Apple’s clarification confirms that no Apple employee scans and reviews your photos outright. The NCMEC database is the first layer of verification; if and only if a match is found is the photo sent over to NCMEC, where it is verified by a human.
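To make the mechanics a bit more concrete, here is a minimal sketch of the hash-matching idea in Swift. To be clear, this is not Apple’s actual code: Apple’s system uses a perceptual “NeuralHash” and cryptographic matching, while this sketch substitutes plain SHA-256 and a made-up hash list purely to illustrate how photos can be compared against a database of numbers without anyone viewing them.

```swift
import CryptoKit
import Foundation

// A simplified model of on-device hash matching. Apple's real system uses
// a perceptual "NeuralHash" and private set intersection; plain SHA-256 is
// substituted here just to keep the sketch self-contained and runnable.
func digest(of data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical stand-in for the downloaded NCMEC database: the device only
// ever holds opaque number strings, never the images themselves.
let knownFlaggedImage = Data("known-flagged-image-bytes".utf8)
let knownHashes: Set<String> = [digest(of: knownFlaggedImage)]

// A pretend photo library: one matching item, one benign item.
let library: [Data] = [knownFlaggedImage, Data("holiday-photo-bytes".utf8)]

// Each photo is reduced to a hash and compared against the database.
// Only matches would ever be escalated for human review.
for (index, photo) in library.enumerated() {
    if knownHashes.contains(digest(of: photo)) {
        print("Photo \(index): hash matched, would be escalated for human review")
    } else {
        print("Photo \(index): no match, never leaves the device")
    }
}
```

The key property, as the sketch shows, is that the comparison happens on numbers derived from the photos, so neither your iPhone nor anyone else displays the database images, and your own photos stay put unless a match occurs.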
Related: Will iPhone 8 Get iOS 15? When Does Apple Support End?
Can you stop iOS 15 from scanning your photos?
Yes, stopping iOS 15 from scanning your photos is rather straightforward. All you have to do is stop uploading your photos to iCloud and you are good to go. Apple has confirmed that the check is performed only if you have opted to upload your photos to iCloud. As of 2021, iCloud has over 170 million paying users, meaning a sizable chunk of the smartphone population is affected by the CSAM scanning functionality.
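If you want to opt out, photo uploads can be turned off in a few taps: open Settings, tap your name at the top, then go to iCloud > Photos and toggle off iCloud Photos. With uploads disabled, the on-device matching described above simply never runs.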
Related: Where To Find All the New iOS 15 Features
Is iOS 15 compromising your privacy?
The doubters have their reservations about Apple’s new policy, and rightly so. Apple has always been one of the strongest advocates of privacy, a reputation that seemingly goes for a toss with its CSAM scanning implementation. However, if Apple’s justification is to be believed, there is no cause for concern whatsoever. Apple argues that iOS 15’s CSAM implementation is more private than other implementations out there.
Unlike other services that scan your entire iCloud library on their servers, Apple promises that it only matches your photos against the on-device CSAM database. Only if a match is found does your photo leave your device and become viewable by another human. Apple is also making the CSAM databases auditable, hoping to put most of the doubters’ minds at ease.
Many still believe that governments could use iOS 15’s CSAM scanning as a backdoor to look for wanted fugitives, target certain demographics, and more. Apple has given assurances that iOS 15’s image scanning is built only for CSAM databases and cannot be configured otherwise.
Apple has landed itself in a tough spot with CSAM scanning, and it is anybody’s guess whether the world’s most valuable company will be able to get out of it.
Related: How To Fix Safari Issues on iOS 15
Have the Child Safety features gone live in iOS 15?
Since announcing the features, Apple has faced a lot of criticism over the implementation, so much so that it has not rolled out the Child Safety feature set yet. Apple cannot afford to delay it for long, though, as child safety advocates will eventually start to get antsy. Do not be surprised if Apple holds a special event just to explain the CSAM implementation in detail.