CSAM Apps – Monetizing the Sexual Exploitation of Children Online, Part 2

The sale of Child Sexual Abuse Material (CSAM) has spawned a highly lucrative industry.  Social media platforms have overlooked this in the interest of profit.  An internal survey conducted at Meta indicated that 13% of users between the ages of 13 and 15 receive at least one sexual solicitation per week [1A].

In 2023, NHK News partnered with Tansa to investigate Album Collection, a photo-sharing app available in Japan that once topped Apple’s App Store in sales [1B].  Album Collection described itself as a tool for sharing family memories and graduation trip photos.  In fact, it offered over a thousand items of child pornography for sale.

Identifying the true owner/operator of Album Collection proved extremely difficult [2].  But the money trail leading to tech platforms like Apple and Google was clear [1C].

An item of CSAM was posted to the app and locked behind a password, which could be obtained for a price.  Viewers were then sold the password for $1.10.  The original poster received $0.10 of this amount; the app’s owner/operator received the remaining $1.00.  However, fully $0.30 of that $1.00 was routed onward to the platform hosting the app.
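
To make the arithmetic concrete, here is a minimal sketch in Python of the per-sale split described above (the dollar figures are those reported by NHK; the variable names are illustrative):

    # Per-sale split reported for Album Collection (illustrative sketch).
    sale_price = 1.10                              # viewer pays this for the password
    poster_share = 0.10                            # kept by the original poster
    operator_gross = sale_price - poster_share     # $1.00 to the app operator
    platform_cut = 0.30                            # routed onward to the platform
    operator_net = operator_gross - platform_cut   # $0.70 kept by the operator

    # The platform's take is 30% of the operator's $1.00,
    # or roughly 27% of what the viewer actually pays.
    print(f"{platform_cut / operator_gross:.0%} of operator revenue")  # 30%
    print(f"{platform_cut / sale_price:.0%} of the sale price")        # 27%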

Other CSAM apps operate in similar fashion.  Assuming the financial arrangement is comparable, some 30% of the billions in profits from child pornography go directly to tech platforms like Apple and Google.  Meanwhile, when one CSAM app is taken down, others rapidly take its place.

Regulation

The Wikimedia Foundation (which hosts Wikipedia and Wikimedia Commons) has a policy for combating online child sexual exploitation which has come under fire for potentially infringing on free speech [3][4].  Google likewise has a policy [5].  But many social media platforms either do not have such policies, or do little or nothing to enforce them.  Self-regulation seems, therefore, to have failed.

In January 2024, top executives of social media platforms including Meta, TikTok, and X testified about child sexual exploitation before the US Senate, at a hearing prompted in part by child suicides linked to online abuse [6].  They disclaimed any responsibility, and declined to set up a victims’ compensation fund.

Both Britain and the EU now require social media platforms to take steps to delete CSAM, imposing severe penalties for violations.  Thus far, the United States has enacted no comparable requirement, leaving children unprotected from exploitation by these media giants.

Profits are obviously more important than our children.

[1A, 1B, and 1C]  NHK News Web, “Sexual Exploitation of Minors, Part 2:  The International CSAM Trade – Documentary 360/Innovative Investigations”, 0/28/24, https://www3.nhk.or.jp/nhkworld/en/shows/4001451/.

[2]  Tansa, “Pursuing Album Collection’s operators:  ‘He probably refunds the transactions he finds to be illegal’ (24)”, 9/22/24, https://en.tansajp.org/investigativejournal/8447/.

[3]  Wikimedia Foundation, “Wikimedia Foundation Combating Online Child Exploitation Policy”, https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_Combating_Online_Child_Exploitation_Policy.

[4]  BBC, “Wikimedia pornography row deepens as Wales cedes rights”, 5/10/10, https://www.bbc.com/news/10104946.

[5]  Google Blog, “How we detect, remove and report child sexual abuse material”, 10/28/22, https://blog.google/technology/safety-security/how-we-detect-remove-and-report-child-sexual-abuse-material.

[6]  Associated Press (AP), “Meta, TikTok, and social media CEOs testify in heated Senate hearing on child exploitation” by Barbara Ortutay and Haleluya Hadero, 1/31/24, https://apnews.com/article/meta-tiktok-snap-discord-zuckerberg-testify-senate-00754a6bea92aaad62585ed55f219932.

Part 1 in this series was posted last week.

FOR MORE OF MY ARTICLES ON POVERTY, POLITICS, AND MATTERS OF CONSCIENCE CHECK OUT MY BLOG A LAWYER’S PRAYERS AT: https://alawyersprayers.com



3 responses to “CSAM Apps – Monetizing the Sexual Exploitation of Children Online, Part 2”

  1. The idea that a platform would willingly profit from this kind of activity is as vile as it gets. If the average person can be arrested for similar distribution of this content–and rightfully so–then that should apply as well to those that seem OK with proliferating that same type of content. Goodness… this is a long way from controversial. It’s disheartening to hear how far behind we are. Thank you for the education, Anna.

    –Scott

  2. “For the love of money is the root of all evil.”

    I am quite certain that our founders did not have child pornographers in mind when discussing the merits of free speech. That the heads of the major social media platforms absolve themselves of responsibility is appalling!
