IACP@Work: Law Enforcement Cyber Center Gears Up for Enhanced AI Resources

Technology has rapidly become part of most people’s daily lives, influencing everything from air travel to vehicles, homes, and workplaces and introducing automation that shapes daily routines.

The advent of autonomy further underscores the pervasive and often subtle nature of these technological shifts, compelling police professionals to adapt and navigate this complex landscape. As these technologies progress, the education and training of police professionals must advance to keep pace with new developments.

Recognizing the critical need for awareness, education, and information on how to handle these emerging technology trends, the Law Enforcement Cyber Center (LECC) serves as a key resource. Funded by the Bureau of Justice Assistance (BJA) and managed by the National White Collar Crime Center (NW3C), the International Association of Chiefs of Police (IACP), and the Police Executive Research Forum (PERF), this online platform is dedicated to providing comprehensive support for police officers, prosecutors, judges, and other criminal justice professionals.

The LECC is a free resource, offering a wealth of knowledge, educational tools, and information tailored to equip police personnel with the skills needed to navigate the complexities of cybercrime investigations, prosecutions, and modern technology. The LECC also acts as a pointer system, enabling police, prosecutors, judges, and other interested criminal justice professionals to quickly locate training programs, conferences, and resources focused on cyber-related topics. During the 2023 IACP Annual Conference and Exposition in San Diego, California, thought leaders from across the United States convened a roundtable meeting to deliberate on ways to improve the LECC and identify pertinent topics to incorporate moving forward.

One of the group’s key recommendations was to add resources related to artificial intelligence (AI) and to raise awareness of how individuals are increasingly leveraging this technology to enhance the effectiveness and sophistication of their illicit activities. With powerful algorithms and vast processing capacity, AI can analyze amounts of user data that were previously unmanageable and produce customized, synthetic content. Furthermore, the advancement of AI-driven “deepfake” technology allows for the rapid creation of incredibly realistic fake videos or audio recordings. This enables malicious individuals to impersonate victims’ family members, friends, and public figures, facilitating the dissemination of credible-looking yet false information used not only to fool people but also to scam funds from unsuspecting individuals who believe what they see or hear. The recent incident involving Taylor Swift and the false depiction of her endorsing specific politicians is just one instance of criminals exploiting this technology in an attempt to sway political opinions and policies.1 Additionally, AI-powered chatbots engage users in conversations that closely mimic human interaction and can contribute to the spread of propaganda and disinformation. As these techniques become more advanced, traditional methods of fraud detection may become less effective. Investigating such cases requires a combination of traditional investigative techniques, strategic partnerships with private industry, and specialized knowledge of digital forensics related to AI.

Now, more than ever, police must actively embrace this rapidly changing technology, be open to trying new approaches, and stay updated on the latest AI-generated fraud techniques to effectively combat emerging challenges. Another critical component of this effort is raising public awareness about the risks associated with AI-generated fraud and encouraging individuals to report suspicious activity to the police. AI-generated fraud is dynamic, and police strategies continue to evolve to address emerging threats.

The roundtable participants also suggested that it is essential for police leaders to gain a more comprehensive understanding of this emerging technology as they develop policies and guidelines. AI has the potential to revolutionize how the police prevent, investigate, solve, and prosecute crimes. Nonetheless, despite AI’s numerous potential advantages, there is public apprehension regarding its potential misuse by police.2 To foster and uphold trust between the police and the public they serve, departments need to be thoughtful about policies governing both how they use this technology and how they restrict its use. What quality management guardrails are in place to constrain AI use? How long are images and other data retained? Who can look up a car’s location? How will the public learn about the technology’s use?

Once policies are created and instituted, the technology can help reduce crime by increasing the likelihood and speed of apprehending criminals.3 It is imperative to thoroughly understand and alleviate these concerns during the development and deployment of any AI capability to avoid the community misperception that AI is a frightening, unbridled “black box” technology.

Based on the recommendations from the aforementioned roundtable, the LECC is undergoing a significant transformation in 2024. The website will be updated, and new topic areas and sections will be introduced. In line with these changes, a dedicated section on AI will be added to provide resources, education, and training options specifically tailored for police personnel, prosecutors, and criminal justice professionals. These resources will also include information on public awareness and responsible use of AI.

As the 2024 IACP Technology Conference in Charlotte, North Carolina, draws near, participants from the roundtable meeting will have the opportunity to review the updated LECC resource. This offers them a chance to provide any final feedback prior to the changes being implemented and the site being updated.

With AI advancing at a rapid pace, it is crucial for police agencies, prosecutors, and all criminal justice stakeholders to have the tools and resources to address this technology. This ensures not only the proactive utilization of AI to solve crimes and enhance community safety but also the capability to react to and investigate offenses involving AI. The LECC is committed to furnishing the necessary resources and tools to navigate and progress with future technologies to build safer communities by reducing crime.

Notes:

1Kat Tenbarge, “Taylor Swift Deepfakes on X Falsely Depict Her Supporting Trump,” NBC News, February 7, 2024.

2Niraj Bhargava, Joe Oliver, and Mardi Witzel, “AI Tools and Public Trust: You Can Have Both,” Tech Talk, Police Chief 90, no. 1 (January 2023): 52–53.

3Frank Chen, “Exploring AI for Law Enforcement: Insight from an Emerging Tech Expert,” interviewed by Chris Hsiung, Police Chief Online, September 20, 2023.


Please cite as

Jeff Lybarger, Jim Emerson, and Mike Fergus, “Law Enforcement Cyber Center Gears Up for Enhanced AI Resources,” IACP@Work, Police Chief 91, no. 4 (April 2024): 104–105.