
Blog: Apple must stand by its image detection safety plan, ignore privacy zealot push for online status quo


Written by the Executive Director of the Canadian Centre for Child Protection

When your life’s work is devoted to improving the safety of children online, you can’t help but greet technology companies’ grand announcements and promises with a great deal of skepticism.

Until recently, cajoling technology companies into giving an inch in the direction of safety and regulation was like trying to push water uphill. However, children, survivors and NGOs working in this space are now beginning to see glimmers of hope that meaningful change can happen.

And so, when Apple, one of the industry laggards on child-safety policies, announced its plan to deploy a tool designed to detect when users upload child sexual abuse imagery onto its servers, I viewed it as another incremental win for the cause. Coming from a company revered by many as a leader and pioneer in the digital space, it was also a symbolically pivotal moment.

Fast forward five weeks... that’s how long it took for Apple to walk back its original plan and pause the rollout indefinitely.

When the new policy was first announced, what followed was a barrage of misinformation and fearmongering about a surveillance state. Critics cried foul with exaggerated predictions of Apple infiltrating your phone’s content and of law enforcement kicking in innocent users’ doors.

It seems that every time entities operating in the digital space conclude they ought to take steps to limit online crimes against children on their platforms, these actions are immediately met with myopic criticism from technologists and libertarian groups alike, all claiming to be great defenders of your right to privacy. No matter how much transparency or assurance Apple provided, the negative headlines would not relent.

Under Apple’s proactive detection plan, images stored locally on a user’s device are scanned for matches against known child sexual abuse imagery that has been vetted by two independent child protection organizations. Even when matches are identified, no further action is taken unless the user uploads at least 30 offending images to iCloud (Apple’s servers). Only when those conditions are met does Apple perform a manual review of the content, in advance of any information being sent to authorities.
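To make the mechanics concrete, here is a minimal sketch of that flow in Swift. It is emphatically not Apple’s implementation: the real design uses a perceptual hash (NeuralHash) and cryptographic safeguards that keep match results hidden from both the device and Apple until the threshold is crossed. Plain SHA-256 and a simple counter stand in for those pieces here, and every name below (knownHashes, didUploadToCloud and so on) is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the database of known-image hashes vetted by
// two independent child protection organizations.
let knownHashes: Set<String> = []

let reviewThreshold = 30   // no action until this many matched uploads
var matchedUploads = 0     // simple tally standing in for threshold cryptography

// SHA-256 used here only for illustration; Apple's system uses a
// perceptual hash (NeuralHash) that tolerates minor image edits.
func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Called for each photo the user uploads to cloud storage in this sketch.
func didUploadToCloud(_ image: Data) {
    guard knownHashes.contains(sha256Hex(image)) else { return }
    matchedUploads += 1
    if matchedUploads >= reviewThreshold {
        // Human review sits between the threshold and any report to authorities.
        print("Threshold met: queue account for manual review")
    }
}
```

The point of the sketch is the ordering of safeguards: a single match triggers nothing, only matches against vetted known imagery count at all, and manual review precedes any referral.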

As it turns out, the type of local device access Apple is proposing is no different from the consensual access the vast majority of smartphone users grant to hundreds of applications without a second thought.

On Facebook, Instagram, WhatsApp, VSCO, Snapchat, TikTok and many more apps, billions of users agree to give these services access to their entire contact lists, real-time locations, microphones and photo galleries.
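For a sense of how routine that kind of grant is, consider the standard Photos-framework call an iOS app makes to request gallery access: one system prompt, and the entire library becomes readable. This is shown purely as an illustration, and assumes the app has declared the required usage string.

```swift
import Photos

// Requires an NSPhotoLibraryUsageDescription entry in the app's Info.plist.
// One tap on "Allow Access to All Photos" and the app can read every image
// in the user's library from then on.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    if status == .authorized {
        print("Full photo gallery access granted")
    }
}
```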

These “privacy” groups’ chief argument is that Apple could feasibly be compelled to abuse this very limited and issue-specific access for the benefit of governments and law enforcement. Why, then, are they not also sounding the alarm over the exact same hypothetical outcome with all the other apps currently siphoning locally stored data from their many users?

The truth is that these groups don’t actually care all that much about privacy. Their true motivation is minimizing the regulation of the internet. At every opportunity, they sow doubt and block anything that challenges the online status quo, one established decades before anyone understood the full spectrum of harm resulting from the emergence of a truly global internet.

These groups argue in hyperbole and largely theoretical outcomes, as though authoritarian governments are standing idly by, waiting for technology companies to be the first mover and hand them opportunities to invade the privacy of their citizens. The slippery slope argument is perpetually invoked as a means to stunt all progress, as though absolute privacy were the only consideration that weighs in the balance. Meanwhile, as organizations looking out for the safety and privacy of children know all too well, thousands of children are exploited and abused online daily, courtesy of a largely unfettered internet.

In working closely with survivors whose lives are forever impacted by the knowledge that images of their childhood abuse circulate online in perpetuity, I have heard one thing again and again: The healing process cannot fully take place until those images cease to exist. What Apple is proposing, while imperfect, supports this need.

Curiously, I have yet to hear any of these critics show great concern, or propose any solutions, for the never-ending invasion of privacy that victims of child sexual abuse material experience every day of their lives. Rather, they insist on reminding us of cherry-picked hypothetical dangers while ignoring the real-world damage happening before their very eyes, all while pressuring companies to abandon any move toward a safer internet.

For the sake of children and survivors, I hope Apple pushes back against the unwarranted criticisms, revives this important initiative and sets an example for the rest of the technology industry to follow.

-30-

About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of child sexual abuse material (CSAM) on the clear and dark web and issue removal notices to industry.
