CSAM Apps – Monetizing the Sexual Exploitation of Children Online, Part 2

The sale of Child Sexual Abuse Material (CSAM) has spawned a highly lucrative industry, and social media platforms have overlooked this in the interest of profit. An internal survey conducted at Meta indicated that 13% of users between the ages of 13 and 15 receive at least one sexual solicitation per week [1A].

In 2023, NHK News partnered with Tansa to investigate Album Collection, a photo-sharing app available in Japan that once topped Apple's App Store in sales [1B]. Album Collection described itself as a tool for sharing family memories and graduation trip photos. In fact, it offered over a thousand items of child pornography for sale.

Identifying the true owner/operator of Album Collection proved extremely difficult [2]. But the money trail leading to tech platforms like Apple and Google was clear [1C].

An item of CSAM was posted to the app and locked behind a password. Viewers could then purchase that password, or "key", for $1.10. The original poster received $0.10 of this amount, and the app owner/operator received the remaining $1.00. However, fully $0.30 of that $1.00 was routed to the platform hosting the app, consistent with the standard 30% commission app stores charge on in-app purchases.

Other CSAM apps operate in similar fashion. Assuming the financial arrangement is comparable, some 30% of the billions in profits from child pornography goes directly to tech platforms like Apple and Google. Meanwhile, when one CSAM app is taken down, others rapidly take its place.

CSAM Apps – Monetizing the Sexual Exploitation of Children Online, Part 1

Image "facebook testify zuckerberg", Author: www.thoughtcatalog.com (CC BY-SA 2.0 Generic)

“Underlying every sexually explicit image or video of a child is abuse, rape, molestation, and/or exploitation. The production of CSAM creates a permanent record of the child’s victimization.”

-US Dept. of Justice [1A]

The US Dept. of Justice defines Child Sexual Abuse Material (CSAM) as any visual depiction of sexually explicit conduct involving a minor under the age of 18; in other words, child pornography [1B]. Visual depictions include photos, videos, live streaming, and digital or computer-generated images (including AI-generated content) indistinguishable from an actual minor [2].

Scope 

Due to rapid technological change, online child sexual exploitation is increasing daily in scale and complexity [1C]. The violence and sadism depicted in CSAM have increased as well [1D].

Child pornography is readily available through virtually every internet technology, including social networking platforms, file-sharing sites, gaming devices, and mobile apps [1E]. 

On the Dark Web, where anonymity and encryption make it more difficult to trace perpetrators, a single active website dedicated to the sexual abuse of children had over 2.5 million registered users as of 2021 [1F]. The same year, the National Center for Missing & Exploited Children (NCMEC) received nearly 30 million reports of suspected online child exploitation [1G].

Smartphones

Modern smartphones are the ideal child exploitation tool for offenders [1H]. Smartphones can be used to photograph, record, or watch live child sexual abuse. They can both store child pornography and access such stored material remotely. And they can connect offenders worldwide, enabling them to receive or distribute child pornography through an ever-expanding variety of apps.
