Police agencies face an unprecedented challenge in combating child exploitation, halting the dissemination of child sexual abuse material (CSAM), and finding the predators responsible. The rapid advancement of technology, combined with children’s increasing access to the internet, allows perpetrators to operate with greater anonymity and reach a wider audience than ever before.
Internet Crimes Against Children (ICAC) task forces work diligently to curb these crimes, yet they are severely underfunded. Raven, a nonprofit that advocates for more resources to support ICAC investigations, is pushing the U.S. Congress to—at a minimum—appropriate the amount of money it has already authorized.
According to John Pizzuro, Raven’s CEO and a former New Jersey ICAC commander, the ICAC task forces investigating these cases have received only $32.9 million in appropriations, despite lawmakers authorizing $60 million in annual funding through the Protect Our Children Act of 2008. In 2008, the average number of internet-capable devices per household was 1; today, the average sits at 21.5. Yet the funding model simply isn’t keeping pace with the increasing volume of cyber tips and investigations.1
The landscape of online exploitation continues to evolve primarily through predators’ use of encrypted apps, presenting more hurdles for investigators. In this complex and dynamic environment, traditional methods of detection and intervention have proven insufficient, underscoring the urgent need for innovative digital investigative solutions and systemic reforms.
Amid these pressing concerns, the May 2024 passage of the REPORT Act offers a glimmer of hope. This landmark bipartisan legislation aims to revolutionize the fight against CSAM by empowering victims and police, streamlining reporting processes, and equipping investigators with the tools needed to dismantle exploitation networks.2
The REPORT Act: A Beacon of Hope
The REPORT Act not only improves the reporting process, one of the biggest hurdles in CSAM investigations, but also pushes for the modernization and enhancement of police investigative capabilities. With the act’s implementation, transformative shifts will take place.
Required Reporting
Online platforms are now required to report internet crimes against children, including child sex trafficking, online enticement, and grooming, to the National Center for Missing and Exploited Children (NCMEC). Platforms that fail to report face stiff penalties. Companies must also retain the content for one year, giving police sufficient time to review a case; previously, platforms had to retain the content for only 90 days.
Enhanced Support for Victims
By enabling child victims and their representatives to report instances of CSAM directly to NCMEC, the REPORT Act ensures victims receive timely support and assistance. Previously, without accessible and confidential avenues for reporting, victims and their representatives were left without recourse, deprived of the support and resources needed to aid in their recovery and healing. That failure to provide comprehensive support not only perpetuated the cycle of victimization but also undermined efforts to identify and intervene in cases of child exploitation. The REPORT Act will enable quicker interventions, access to necessary services, and a greater sense of agency for victims and their families navigating the aftermath of exploitation.
Improved Police Response
Police agencies can now submit and manage CSAM evidence in cloud environments instead of relying on physical mail, allowing for swifter victim identification. By streamlining reporting processes, leveraging technology for more efficient investigations, and holding online platforms accountable for reporting these crimes, the REPORT Act equips police agencies with the tools and resources needed to pursue offenders and bring them to justice.
Advancements in Technology and Collaboration
The implementation of the REPORT Act is likely to spur advancements in technology and foster greater collaboration among police agencies, technology providers, and advocacy organizations. As the policing profession adapts to the new reporting and investigative landscape, stakeholders must innovate and develop new tools and strategies to stay ahead of evolving threats. This collaborative approach is likely to lead to ongoing improvements in the detection, prevention, and prosecution of online child sexual exploitation cases.
Deterrence and Prevention
The increased accountability and stronger investigative capabilities afforded by the REPORT Act have the potential to act as deterrents to would-be offenders. By creating a more hostile environment for those seeking to exploit children online, the legislation may discourage potential predators from engaging in illegal activities related to exploitation and CSAM. Additionally, the improved identification and reporting of CSAM can facilitate proactive measures to prevent further harm to vulnerable children and disrupt networks of exploitation before they can escalate.
The Role of Industry Partnerships in Serving the Cause
While advancements like the REPORT Act offer hope, exploitation extends beyond digital platforms. The Exodus Road, a nonprofit organization, works tirelessly to dismantle human trafficking networks globally through prevention, intervention, and aftercare programs. In the first three months of 2024 alone, The Exodus Road, in collaboration with the police, made significant strides in rescuing trafficking survivors and bringing perpetrators to justice.
In the Philippines, Thailand, and Brazil, The Exodus Road utilized Cellebrite’s cutting-edge digital forensics solutions to identify and locate victims, leading to the successful freeing of 32 survivors—most of them minors—and the apprehension of 11 perpetrators.3 These operations not only highlighted the indispensable role of technology in police efforts but also underscored the importance of collaboration between organizations and across borders. The impact of these operations extends far beyond the numbers, offering hope to survivors and sending a clear message to perpetrators.
To help accelerate the investigation of these crimes, Cellebrite, NCMEC, The Exodus Road, and Raven launched a groundbreaking public commitment campaign called Operation Find Them All (OFTA). This campaign seeks to equip and empower the police with the most advanced digital investigative solutions, innovative tools, and comprehensive training to confront this crisis head-on.
A Looming Shadow
The REPORT Act and OFTA both encourage advancements in the ongoing battle against child exploitation and the dissemination of CSAM. However, despite this momentum, a new challenge looms on the horizon: the emergence of AI-generated CSAM. Recent findings underscore the gravity of this threat, with a report by the Internet Watch Foundation finding more than 20,000 AI-generated CSAM images on a single dark web forum in just one month, while NCMEC received nearly 5,000 reports related to CSAM involving generative AI technology in 2023.4
Tackling the AI Challenges
AI-generated content poses unique challenges for detection and intervention efforts. With AI algorithms capable of generating highly realistic and convincing imagery, distinguishing between genuine and manipulated content is increasingly difficult, if not impossible, for traditional detection methods. Legislation always chases technology, and, in this instance, the victims and those seeking to protect them cannot wait.
The U.S. federal government has already sent a strong signal with the arrest of a 42-year-old software engineer in Wisconsin who is accused of using Stable Diffusion, a generative AI tool, to create thousands of realistic images of minors. He then allegedly distributed these images across various social media platforms and chat apps. Investigators allege the perpetrator entered extremely specific prompts designed to keep the generative AI from producing adult imagery, evidence that bolsters the case against him. He is now in federal custody, charged with production, distribution, and possession of AI-generated CSAM, as well as sending obscene material to a minor.5 With this case, the U.S. Department of Justice is sending a clear message: CSAM, even if produced with AI, is illegal.
The use of AI technology enables perpetrators to create and disseminate CSAM at scale and speed, magnifying the challenges faced by police agencies and online platforms in identifying and removing illicit content. As AI continues to evolve, so too must the approaches to combating CSAM, with an emphasis on leveraging technological advancements in investigations and pushing for legislation to hold bad actors accountable.
The Evolving Landscape of CSAM Detection
The emergence of AI-generated CSAM highlights the need for a multipronged approach to detection. Traditional methods, often relying on identifying known victims or patterns of abuse, may not work with AI-manipulated content; however, technology and collaboration provide opportunities to combat this issue.
- AI-Powered Detection: Ironically, AI itself can combat AI-generated CSAM. Researchers and technologists can train AI systems to identify specific markers and patterns associated with AI-manipulated content. By analyzing subtle discrepancies and anomalies in digital media, AI algorithms can flag potentially illicit material for further investigation, augmenting the capabilities of human moderators and police agencies.
- Research on Advanced Tools: The industry is in the early stages of forensic research into AI-generated content, yet those leading the effort are confident that files will contain markers and characteristics, such as file paths and the apps used prior to content creation, that show the files were created by AI. Eventually, forensic investigators will be able to leverage AI-powered algorithms to uncover hidden layers of information within digital media, enabling more effective attribution and, therefore, the prosecution of offenders.
- International Collaboration: Global cooperation among police agencies, technology companies, and civil society organizations is crucial for addressing transnational CSAM dissemination. By sharing information, resources, and best practices, stakeholders can develop unified detection strategies and coordinate efforts to combat AI-generated CSAM across borders. International collaboration also plays a vital role in tracking perpetrators, dismantling distribution networks, and ensuring swift and effective enforcement actions against offenders.
Lean On Education, Awareness, and Training
While legislative measures and technological advancements play a crucial role in addressing CSAM, education and specialized training initiatives are equally essential in preventing its proliferation and protecting children from exploitation.
- Identification Skills: Comprehensive training can help professionals recognize the nuanced signs of child exploitation and assess the risk factors associated with CSAM, ensuring early intervention and prevention of further harm.
- Sensitive Response: Police personnel must be trained to respond to disclosures of abuse with empathy and understanding, creating a supportive environment where survivors feel safe to share their experiences. This includes following established protocols for reporting suspected cases of abuse and ensuring victims receive immediate protection and support.
- Trauma-Informed Care: Frontline professionals should be equipped with the knowledge and skills needed to provide trauma-informed care to survivors, recognizing the impact of trauma on their mental, emotional, and physical well-being. This involves employing techniques to minimize re-traumatization during interviews and examinations and connecting survivors with appropriate support services and resources to aid in their healing and recovery.
- Interdisciplinary Collaboration: Collaboration and coordination among various disciplines, including police, social services, health care, education, and child protection agencies, are important to the success of these investigations. By working together, frontline professionals can leverage their respective expertise and resources to ensure that survivors receive holistic care and support, addressing their diverse needs and facilitating their journey toward healing and resilience.
A First Step
The REPORT Act emerges as a pivotal milestone in the ongoing battle against child exploitation and the dissemination of CSAM. By addressing systemic shortcomings and empowering police agencies with enhanced reporting mechanisms and investigative tools, it’s a step in the right direction for safeguarding vulnerable children and holding perpetrators accountable.
Recent advancements and collaborative efforts, including those of organizations like The Exodus Road and Raven, programs like OFTA, and technology providers like Cellebrite, give reason for optimism. By leveraging AI for detection, advancing forensic tools, and fostering global cooperation, stakeholders can adapt to the evolving landscape of CSAM and strive toward a safer future for all children.
Notes:
1Cellebrite, “Addressing the Challenges of the ICAC Task Forces: Funding, Education and Legislative Reforms,” Operation Find Them All Blog, May 30, 2024.
2REPORT Act, Pub. L. No. 118-59, 138 Stat. 1014 (2024).
3Cellebrite, “The Exodus Road Assists Law Enforcement in Freeing 32 Human Trafficking Survivors, Mostly Minors, with Cellebrite’s Support,” Operation Find Them All Blog, May 23, 2024.
4Internet Watch Foundation, “Artificial Intelligence (AI) and the Production of Child Sexual Abuse Imagery”; National Center for Missing and Exploited Children (NCMEC), “Generative AI CSAM Is CSAM,” NCMEC Blog, March 11, 2024.
5U.S. Department of Justice, “Man Arrested for Producing, Distributing, and Possessing AI-Generated Images of Minors Engaged in Sexually Explicit Content,” press release, May 20, 2024.
Please cite as
Heather Barnhart, “A Brighter Future in the Fight Against Child Sexual Abuse Material,” Police Chief Online, December 11, 2024.