
Press & Media: Past blogs

Finding of TikTok child privacy violations by federal, provincial privacy authorities an important building block toward tech accountability in Canada

Last week, a joint investigation published by the Privacy Commissioner of Canada and three provincial privacy authorities found that TikTok, a popular social media platform with over 14 million Canadian users, violated federal and provincial privacy laws, notably in relation to children who use its service.

CSAM distribution on Tor is not inevitable: the network’s creators have the power to act

Why has the Canadian Centre for Child Protection been notifying the organization behind the “dark web” about the mass distribution of child sexual abuse material on its network?

Legislated age assurance requirement needed to ensure regulated services fulfil their child-specific duties under proposed Online Harms Act

This briefing note sets out the Canadian Centre for Child Protection’s (C3P’s) view that the absence of any provision related to age assurance requirements in Canada’s proposed Online Harms Act (Bill C-63) casts doubt on the ability of regulated online service providers to fulfil their child-user-specific duties, creating a significant gap in the effectiveness of the proposed online safety regime. C3P recommends amending the draft Online Harms Act to include age assurance provisions, or that the government clarify how it expects such measures to be included in regulation.

Exclusion of private messaging features from proposed Online Harms Act leaves a substantial threat to children unaddressed

This briefing note sets out the Canadian Centre for Child Protection’s view that Canada’s proposed Online Harms Act (Bill C-63) ought to be amended to ensure that private messaging services, and certain aspects of private messaging features, are subject to regulation, given that a significant amount of the harm experienced by children occurs in these exact digital environments.

Rinse and repeat: Ongoing failure of social media giants to prioritize child safety documented by latest Australian tech regulator report

From the vantage point of a child protection agency, it is impossible to grasp the full scope of safety failures on the social media platforms that our kids and teens use every day.

Claims by academic group calling for EU to abandon CSAM-blocking policies don’t stand up to real-world scrutiny

Last week, an international group of academics called for the EU to abandon its pursuit of regulatory measures that would require tech companies to make efforts to detect the distribution of child sexual abuse material (CSAM) and attempts by offenders to sexually groom children on their platforms.

Sunlight is the best disinfectant: Australia’s eSafety Commissioner report names names in tech safety report

As we continue to peel back the layers of harm to children happening online, the chorus of calls for more transparency and accountability from the technology industry continues to grow louder. But there’s a problem.

What we’ve learned operating Cybertip.ca for two decades

When the tipline first launched in September 2002, little was known about the scale or nature of the online victimization of children. We could never have imagined the degree, methods and speed by which children would be accessed and injured through the use of technology. Nor did we forecast the wraparound harms that resulted from an unregulated internet, where those looking to harm children faced no accountability or consequence.

Amanda Todd’s story foreshadowed the harms that today’s kids face. Why did we fail to act?

By the spring of 2013, the tragic deaths of four young Canadian girls over the previous three years – Amanda Todd, Rehtaeh Parsons, Jenna Bowers‑Bryanton, and Kimberly Proctor – had thrust the realities of online harm and violence into our consciousness. Their lives were all cut short, in part due to a common threat: technology and social media.

Blog: Dehumanizing tech response to New York Times investigation on suicide-themed website shows industry’s true colours

As the end of 2021 nears, U.S. lawmakers are yet again seeking answers from technology companies over another form of online harm festering on the web.

Blog: New Twitter policy elevating right to privacy a pivotal shift toward online safety

It was an unexpected shift in the content moderation doctrine of one of the largest social media platforms in the world.

Blog: Apple must stand by its image detection safety plan, ignore privacy zealot push for online status quo

When your life’s work is devoted to improving the safety of children online, you can’t help but greet technology companies’ grand announcements and promises with a great deal of skepticism.

Blog: “Nirvana Baby” lawsuit underscores broader issue of adults robbing children of their privacy rights

Spencer Elden, the man whose nude infanthood photograph was controversially used as Nirvana’s 1991 Nevermind album cover, is now suing the band over the image. This should serve as a wake-up call for us all.
