An Investigation into Self-Generated Child Sexual Abuse Material Networks on Social Media
Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory (SIO) Research Manager Renée DiResta and Chief Technologist David Thiel to discuss a new report on a months-long investigation into the distribution of illicit sexual content by minors online.
Large Networks of Minors Appear to be Selling Illicit Sexual Content Online
The Stanford Internet Observatory (SIO) published a report last week with findings from a months-long investigation into the distribution of illicit sexual content by minors online. The SIO research team identified a large network of accounts claiming to be minors, likely teenagers, who are producing, marketing and selling their own explicit content on social media.
The investigation was informed by a tip from The Wall Street Journal, which provided a list of common terms and hashtags indicating the sale of “self-generated child sexual abuse material” (SG-CSAM). SIO identified a network of more than 500 accounts advertising SG-CSAM, with tens of thousands of likely buyers.
Using only public data, this research uncovered and helped resolve basic safety failings in Instagram’s reporting system for accounts suspected of child exploitation, and in Twitter’s system for automatically detecting and removing known CSAM.
Most of the work to address CSAM has focused on adult offenders, who create the majority of content. These findings highlight the need for new countermeasures, developed by industry, law enforcement and policymakers, to address sextortion and the sale of illicit content that minors create themselves.
Front-Page Wall Street Journal Coverage
Bipartisan Concern and Calls for Social Media Regulation
The investigation sparked outrage across the aisle in the U.S. and grabbed the attention of the European Commission as the European Union prepares to enforce the Digital Services Act for the largest online platforms later this summer.
In Congress, House Energy and Commerce Committee Democrats and Republican senators were the most outspoken about taking action to address the concerning findings.
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!