
Abstract
Artificial intelligence (AI) is increasingly used across the criminal justice system, in areas including crime prediction, sentencing, and victim services. While AI tools offer benefits, such as helping to identify intimate partner violence and child abuse, scholars caution that bias, lack of transparency, and victim-blaming remain concerns, particularly in apps and risk assessment models. Building on these insights, a new pedagogical model was developed to train future victim advocates using AI simulations. Through platforms like Boodlebox, students engage in structured, trauma-informed dialogues with AI chatbots such as ChatGPT and “Ruth” from the National Domestic Violence Hotline. The curriculum emphasizes experiential learning, client-centered practice, and ethical engagement with technology. By combining evidence-based critique with hands-on learning, this model prepares students to navigate AI-integrated victim services with care, ensuring that advocacy remains trauma-informed, responsive, and grounded in human dignity.
Department/Program
Criminal Justice
Submission Type
online only poster
Date
3-21-2025
Rights
Copyright the Author(s)
Recommended Citation
Qi, Ziwei (2025) "AI MEETS ADVOCACY: A PILOT STUDY ON TEACHING VICTIM ADVOCACY USING BOODLEBOX CHATBOT," SACAD: Scholarly Activities: Vol. 2025, Article 103. Available at: https://scholars.fhsu.edu/sacad/vol2025/iss2025/103