
If you’ve read or heard anything about Apple’s recent announcement about increased child safety, you probably fall into one of four camps with respect to how the change affects you, your children, and your privacy: those who support the change wholeheartedly, those who immediately began shopping for any phone not made by Apple, those who are concerned about the privacy ramifications, and those who have no idea how to feel.
Before the internet trolls jump in and start pointing fingers and trying to muddy the water: we 100% support protecting children from those who wish to do them harm. This article presents the facts about the change at a level that will allow our readers to make an informed decision and determine which camp they most closely align with.
“Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.” [1]
One important aspect of the process is that the scanning occurs on the local device rather than on cloud servers. Because the matching is performed on the device, end-to-end encryption remains intact.
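To make the idea of on-device hash matching a bit more concrete, here is a minimal sketch in Swift. It is not Apple’s implementation: Apple uses a proprietary perceptual hash (NeuralHash) and a blinded database that the device cannot read back, while this example uses an ordinary SHA-256 digest and a plain set of known hashes purely to illustrate the concept of comparing a photo against a local list before it is uploaded.

import Foundation
import CryptoKit

// Illustrative only: Apple's real system uses a perceptual hash and a blinded
// database of known CSAM hashes; SHA-256 and a plain Set are stand-ins here
// to show the general shape of on-device matching before an upload.
func matchesKnownHash(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Hypothetical usage: the photo bytes and the hash list are made-up placeholders.
let photo = Data([0x01, 0x02, 0x03])
let knownHashes: Set<String> = []
print(matchesKnownHash(photo, knownHashes: knownHashes)) // prints "false"

The point the sketch illustrates is that the comparison itself happens on the device; the photo is not sent anywhere to be checked.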
There is a lot to consider regarding this change, and whether to move away from Apple devices is your decision to make. If you decide to move away, CIO Services is standing by to help you migrate your data to a new platform and can recommend what to use. If you decide to stay, we can help secure your data and ensure you are set up properly to either opt in to or stay opted out of CSAM detection.
Our opinion is simple. Don’t do illegal stuff!

Here are some FAQs to provide more information. If your question isn’t listed, contact CIO Services to discuss.
Why is Apple making these changes?
Apple’s goal is to “…help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”[1]
What about the privacy of my messages? Will this change allow Apple or others to read my text messages?
“The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.” [1]
How does this technology analyze the pictures in iMessage?
“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.” [1]
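Apple has not published the model or API that Messages uses, so purely as an illustration of what “on-device machine learning” looks like in practice, here is a hedged Swift sketch that runs an image through a locally bundled Core ML classifier via the Vision framework. The classifier model and the “sensitive” label are assumptions made up for the example, not Apple’s actual model; the point is that the entire analysis runs on the device and nothing is uploaded.

import CoreGraphics
import CoreML
import Vision

// Illustrative sketch of on-device image classification. The MLModel passed in
// is assumed to be a locally bundled classifier with a "sensitive" label; it is
// not Apple's Messages model. All of the work below happens on the device.
func isLikelySensitive(_ image: CGImage, using classifier: MLModel,
                       threshold: Float = 0.9) throws -> Bool {
    let model = try VNCoreMLModel(for: classifier)
    let request = VNCoreMLRequest(model: model)
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    guard let results = request.results as? [VNClassificationObservation] else {
        return false
    }
    // Flag the image only if the local model is highly confident.
    return results.contains { $0.identifier == "sensitive" && $0.confidence >= threshold }
}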
Will I, as a parent, be notified each time my child is warned about the image they’re sending or receiving?
It depends on how your child’s account is configured and whether you have properly set up your family in iCloud. Contact CIO Services for a review of your family setup. Parental notification is an opt-in feature, not an opt-out one, and the notices apply only to accounts of children 12 years of age or younger. Children aged 13-17 will still be warned, but no notification is sent to the parent.
How can I turn CSAM detection off?
CSAM scanning only occurs on images destined to be uploaded to iCloud Photos (the scan happens on the local device before the image is sent to iCloud, not in the cloud). Disabling iCloud Photos will disable CSAM detection. Keep in mind that disabling iCloud Photos exposes your pictures to a risk of loss should your phone be lost, damaged, or stolen; CIO Services can assist you with this step and implement a backup solution to protect your precious photos against data loss.
If I take an innocent picture of my child in the bathtub or other innocent pictures involving nudity of my child, will my account be flagged or locked?
Apple provides the following answer, “CSAM detection for iCloud Photos is designed to find matches to known CSAM images. The system uses image hashes that are based on images acquired and validated to be CSAM by at least two child safety organizations. It is not designed for images that contain child nudity that are not known CSAM images.” [2]
[1] Expanded Protections for Children (https://www.apple.com/child-safety/)
[2] Expanded Protections for Children – Frequently Asked Questions v1.1 (https://www.apple.com/child-safety/pdf/Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf)

CIO Services, LLC is a Las Vegas-based MSP and IT consulting firm focusing primarily on small to medium-sized businesses and is a contributor to tech journals and blogs alike. Please see below for links to their website and social media sites.