
Details on reports related to MindGeek-owned websites presented to federal MP committee

For Immediate Release

Winnipeg, Canada — Details on reports processed by the Canadian Centre for Child Protection (C3P) relating to MindGeek-owned websites were presented today to the Parliamentary Standing Committee on Access to Information, Privacy and Ethics.

Invited by the committee to provide testimony on the state of safety and privacy for children online, C3P representatives outlined the many ways a regulatory vacuum and the lack of a coordinated response in the digital space are harming not only children but also non-consenting adults.

C3P reported that, through the operation of Canada's tipline for reporting online child sexual abuse and exploitation, several reports related to MindGeek-owned websites were actioned between 2015 and 2020. These included incidents involving child sexual abuse material (CSAM) on their platforms, as well as incidents of sexual exploitation or abuse.

Over that timeframe, 90 processed reports were forwarded directly to law enforcement, and in 334 cases a notice was issued to an electronic service provider.

C3P Executive Director Lianna McDonald told committee members that platforms with insufficiently resourced moderation practices, coupled with a deluge of unverified users generating content, are especially problematic.

“While this committee’s focus to date has largely been on the activities of MindGeek and their associated websites, it must be made clear that several mainstream companies operating websites, social media, email, and messaging services that most Canadians interact with daily could just as easily have been put under the microscope,” says McDonald. “We would never accept a standard where broadcasters or publishers were allowed to absolve themselves of responsibility for disseminating illegal child abuse images to the masses, and yet our collective inaction in the digital space has effectively done just that.”

C3P also recommended to the committee the creation of a legal framework that would compel electronic service providers to adopt the following practices:

  • implement and use available tools to combat the flagrant and relentless re-uploading of illegal content;
  • hire, train, and effectively supervise staff to carry out moderation and content removal tasks at scale;
  • keep detailed records of user reports and responses that can be audited by authorities;
  • be accountable, from a legal perspective, for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and
  • build in, by design, features that prioritize the best interests and privacy rights of children.

WATCH: C3P live testimony at the Standing Committee on Access to Information, Privacy and Ethics

Media relations contact:
1 (204) 560-0723

About the Canadian Centre for Child Protection: C3P is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Canada’s national tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of CSAM on the clear and dark web and issue removal notices to industry.

About Project Arachnid: Processing tens of thousands of images per second, Project Arachnid detects content at a pace that far exceeds that of traditional methods of identifying and addressing this harmful material. Crawling the open web, the system determines that a particular link contains CSAM by comparing the media to a database of unique fingerprints of CSAM or abusive imagery that has been previously categorized by trained analysts employed by C3P or equivalent organizations in other countries. Domestic and international law enforcement partners also contribute to this central repository. When a fingerprint match is detected, Project Arachnid automatically triggers the delivery of a takedown notice to one or more ESPs involved in enabling the image to exist online. Since its release in 2017, Project Arachnid has detected more than 33 million suspect images of CSAM on the clear and dark web, and sent close to 6.7 million removal notices to content providers globally.

