Tell Big Tech Companies to Stop the Spread of Child Sexual Abuse Materials

Images of child sexual abuse have exploded with the increased use of the internet. The National Center for Missing & Exploited Children (NCMEC) in the U.S. reviewed 450,000 files in 2004, 45 million in 2018, and 70 million in 2019: a hundredfold increase over 14 years, followed by more than 50% growth in a single year.

Thorn (thorn.org) reports that "The statistical breakdown of the kind of abuse involved in the images submitted to NCMEC’s Child Victim Identification Program shows that 76% of the series collected contained images depicting penetration, while 44% of the series contained images depicting bondage and/or sado-masochism." A 2016 report from the Canadian Centre for Child Protection found that:

- 78.29% of the images and videos assessed depicted very young, prepubescent children under 12 years old, and nearly two-thirds of those children appeared to be under 8 years of age.
- 6.65% of those children under 8 years old appeared to be babies or toddlers.
- 77.05% of the children’s faces were visible in the images and videos.
- 53.84% of the abuse acts against children under 12 years old involved explicit sexual activity/assaults and extreme sexual assaults.
- 59.72% of the abuse acts against babies and toddlers involved explicit sexual activity/assaults and extreme sexual assaults.
- 68.68% of the images and videos appeared to be in a home setting, of which 69.91% captured explicit sexual activity/assaults and extreme sexual assaults.
- 83.35% of the adults visible in the images and videos were males.
- 97.25% of the content involved explicit sexual activity/assaults and extreme sexual assaults when adult males were visible with the children in the images and videos.

Big Tech companies are not required by law to proactively search for child sexual abuse material (CSAM), and because doing so is expensive, most don't. Facebook has been nearly alone among the major platforms in scanning pictures and files for CSAM, reporting 18 million files in 2018. But even that will stop with the rollout of end-to-end encrypted messaging, introduced to address people's privacy concerns.

These files exist on and are spread through Windows, iOS, Android, and other operating systems. Images can be shared and viewed in perpetuity, re-traumatizing the victims with every view.

Tell Big Tech companies to invest in stopping the spread of CSAM. Stopping it is far more important than protecting people's privacy. Tell governments to regulate Big Tech firms if that is what it takes to stop this horror.