
DON’T LEAVE CSAM VICTIMS BEHIND IN LEGISLATIVE EFFORTS: STATEMENT FROM THE PHOENIX 11 TO THE FIVE COUNTRY MINISTERIAL


For Immediate Release

As we write to you today, the Phoenix 11 note the somber anniversaries that bring this statement before you. Six years ago, 11 women were brought together by the Canadian Centre for Child Protection and the National Center for Missing & Exploited Children with one thing in common: we survived child sexual abuse that was documented through photographs and videos which were – and continue to be – distributed in real time on the internet.

None of our life stories are exactly the same, but the imagery created of us while we suffered horrific acts as children is the thread that connected us. We met with trepidation and uncertainty, but we emerged from that meeting as the world’s first collective of child sexual abuse material survivor advocates. If we can speak of our shared experience where others are unable, we see it as our moral obligation to do so, in hopes that current and future generations of our world’s children do not experience the same fate we have.

Over the last six years, we have done a tremendous amount of work. We have met with and spoken to governments, child safety organizations and professionals, law enforcement agencies, academics, researchers, and key stakeholders around the globe. We have shared the impacts of our trauma and of the ongoing revictimization we experience, knowing that our most horrendous memories are free to circulate amongst communities of internet pedophiles. We have explained that child sexual abuse material (CSAM) was used to groom us, and the thought that our abuse could be used to groom current child victims haunts us. We have done extensive research and written statements on key issues that impact victims and survivors of CSAM worldwide. Through all of this, our purpose has never wavered: we are here to demand an end to child sexual abuse and exploitation on the internet.

In March 2020, we attended a roundtable at the White House, which concluded with the Five Country Ministerial announcing their Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse. This was an opportunity for the tech industry to prioritize children over profit and proactively enact policies that would reduce harm on their apps, platforms, and storage infrastructures. The Voluntary Principles have been endorsed by tech giants such as Apple, Meta, Snapchat, X, and TikTok.

It has been our experience that the work done to make the internet a safer place for children is continuously undone by the tech industry. For every positive impact made, there are countless children whom tech has failed. We have seen tech exploit loopholes, in most cases without significant legal consequences. The social media companies listed above endorsed the Voluntary Principles, yet these platforms have facilitated online harms against children so egregious that their executives were called to testify before the United States Senate Judiciary Committee in January 2024.

We acknowledge that each of the five countries present with us in 2020 and responsible for forming the Voluntary Principles has made attempts to make the internet safer for children. For children who are growing up in a world that increasingly revolves around the internet and social media, this necessary progress has been a long time coming. However, somewhere along the way, we do feel that victims of child sexual abuse material have been left behind.

We represent the population of victims that no one seems to want to talk about when it comes to enacting internet regulation. We believe this is largely because protecting us requires the proactive detection and removal of known CSAM within encrypted spaces – the very spaces where these crimes thrive and the spaces that no one wants to touch. We are survivors of unspeakable acts of violence that were perpetrated against us as babies, toddlers, and children. Photographs and videos of our abuse were created by the very people who were supposed to keep us safe. That imagery was then sold or traded for pleasure and profit to offenders who, to this day, watch and share it on internet platforms and storage services that continue to make money from the offenders who use them.

This is how the sexual abuse we suffered has been monetized by the internet. There are no parental controls that will address our trauma when parents, family members, and trusted family friends are our abusers. Pedophiles will not use reporting features to flag the abuse they are eagerly consuming. Take It Down tools do not help us because we never had ownership over the imagery of our abuse and did not consent to its creation. Forcing child predators into encrypted spaces by regulating only unencrypted spaces will not help us or the countless current child victims and survivors of CSAM.

It is not outrageous for governments to mandate that tech companies use currently available technology that detects and removes known CSAM while preserving privacy for law-abiding citizens in privately encrypted spaces. It is perhaps an even less outrageous demand when we consider the scanning tech companies already perform on encrypted and unencrypted spaces to protect consumers from malicious links and viruses. Tech companies are already scanning private messages for malware and viruses – yet they refuse to scan against hash values of known CSAM to remove images of children being raped and tortured.

Tech companies have been given generous time and opportunities to discover their own moral obligation to children and act voluntarily. Four years after the Voluntary Principles, we continue to see that tech will not do what is right for children, and if we continue to wait for them to find their moral obligations, history will rightfully judge us harshly for the ways in which we failed to protect our most vulnerable.

Governments and regulators cannot allow tech companies to define their own participation in legislation. We have seen that when tech companies are given the option, they will simply escape regulation by deploying encryption, taking advantage of loopholes, and doing the bare minimum required to check off a box. Tech companies are being allowed to influence how regulation is written and implemented – these are the very same companies that chose not to regulate themselves voluntarily, yet they are still treated as if they have the best interests of children in mind.

The Phoenix 11 acknowledges and commends the passing of the Online Safety Act in the United Kingdom, the introduction of the Online Harms Bill in Canada, and the ongoing work of the Office of the eSafety Commissioner in Australia. However, the issues we have explored in this statement remain consistently problematic in the writing of specific regulation as it applies to victims and survivors of CSAM: a lack of regulation for encrypted spaces, recommendations and expectations for tech companies that remain voluntary, and allowing tech companies to define their own participation in how they will apply regulation to their platforms, apps, and storage services. In the United States, we continue to find that the legislation most are eager to enact caters to tech companies and their preference to put the onus on the minor user and protective parents, which – more often than not – does not apply to victims and survivors of CSAM. There is pending legislation in New Zealand that we are following with hopes that victims and survivors of CSAM will be considered along with other victims of online harms when it is passed.

It is our belief that until governments on a global scale stop deferring to tech companies and their preferences when looking to regulate the internet, current child victims and survivors of child sexual abuse material will continue to be the population that is overlooked.

Signed,

A group of 11 survivors of child sexual abuse who have banded together to challenge inadequate responses to the prevalence of child sexual abuse images on the internet



Media contact:
1 (204) 560-0723
communications@protectchildren.ca

-30-

About the Phoenix 11: The Phoenix 11 is a group of survivors whose child sexual abuse was recorded and, in the majority of cases, distributed online. This group has banded together as a powerful force to challenge the inadequate responses to the prevalence of child sexual abuse images online.

About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s national tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of CSAM on the clear and dark web and issue removal notices to industry.
