The sale of Child Sexual Abuse Material (CSAM) has spawned a highly lucrative industry, and social media platforms have overlooked the problem in the interest of profit. An internal survey conducted at Meta indicated that 13% of users between the ages of 13 and 15 receive at least one sexual solicitation per week [1A].
In 2023, NHK News partnered with Tansa to investigate Album Collection, a photo-sharing app available in Japan that once topped Apple’s App Store in sales [1B]. Album Collection described itself as a tool for sharing family memories and graduation trip photos. In fact, it offered more than a thousand items of child pornography for sale.
Identifying the true owner/operator of Album Collection proved extremely difficult [2]. But the money trail leading to platform companies like Apple and Google was clear [1C].
An item of CSAM posted to the app was locked behind a password. Viewers could purchase the unlock key for $1.10: the original poster received $0.10 of that amount, and the app owner/operator received the remaining $1.00. Of that $1.00, fully $0.30 was routed to the platform.
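The split described above works out as simple arithmetic. A minimal sketch, using the per-transaction figures from the NHK/Tansa reporting (the function and variable names here are illustrative, not from the reporting itself):

```python
# Breakdown of a single $1.10 key purchase, per the figures above.
# Amounts are in US cents to avoid floating-point rounding issues.

def split_purchase(price_cents: int = 110) -> dict:
    """Illustrative per-transaction revenue split (names are hypothetical)."""
    poster_share = 10                             # original poster's cut
    operator_gross = price_cents - poster_share   # 100 cents to the app operator
    platform_fee = operator_gross * 30 // 100     # 30 cents routed to the platform
    operator_net = operator_gross - platform_fee  # 70 cents kept by the operator
    return {
        "poster": poster_share,
        "platform": platform_fee,
        "operator": operator_net,
    }

shares = split_purchase()
# The platform's $0.30 is 30% of the operator's $1.00,
# or roughly 27% of the $1.10 gross transaction.
platform_pct_of_gross = shares["platform"] / 110 * 100
```

Note that the $0.30 is 30% of the operator's dollar, which comes to about 27% of the gross $1.10 transaction; the two percentages are easy to conflate.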
Other CSAM apps operate in similar fashion. Assuming the financial arrangement is comparable, some 30% of the billions these apps generate flows directly to platforms like Apple and Google. Meanwhile, when one CSAM app is taken down, others rapidly take its place.

