Image “facebook testify zuckerberg”, Author <a href="http://www.thoughtcatalog.com" rel="noreferrer nofollow">www.thoughtcatalog.com</a> (CC BY-SA 2.0 Generic)
“Underlying every sexually explicit image or video of a child is abuse, rape, molestation, and/or exploitation. The production of CSAM creates a permanent record of the child’s victimization.”
-US Dept. of Justice [1A]
The US Dept. of Justice defines Child Sexual Abuse Material (CSAM) as any visual depiction of sexually explicit conduct involving a minor under the age of 18; in other words, child pornography [1B]. Visual depictions include photos, videos, live streaming, and digital or computer-generated images (including AI-generated content) indistinguishable from an actual minor [2].
Scope
Due to rapid technological change, online child sexual exploitation is increasing daily in scale and complexity [1C]. The violence and sadistic content depicted in CSAM have increased as well [1D].
Child pornography is readily available through virtually every internet technology, including social networking platforms, file-sharing sites, gaming devices, and mobile apps [1E].
On the Dark Web, where anonymity and encryption make it more difficult to trace perpetrators, a single active website dedicated to the sexual abuse of children had over 2.5 million registered users as of 2021 [1F]. The same year the National Center for Missing & Exploited Children (NCMEC) received nearly 30 million reports of suspected online child exploitation [1G].
Smartphones
Modern smartphones are the ideal child exploitation tool for offenders [1H]. Smartphones can be used to photograph, record, or watch live child sexual abuse. They can store child pornography locally and access such material stored remotely. And through an ever-expanding variety of apps, they can connect to other offenders worldwide to receive or distribute child pornography.