“In another life, we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
If you wanted to design the perfect magnanimous and sympathetic victim of a crime, you could do a lot worse than to build one around a quote like this. However, this statement was not made by an actual person, but by an AI simulacrum of a manslaughter victim. In a first for a U.S. legal proceeding, an AI-generated victim impact statement from a deceased victim was recently introduced in court—at the sentencing hearing of the person who killed him.
The case, which unfolded in the courtroom of Arizona state trial court Judge Todd Lang, started back in November 2021, when 50-year-old Gabriel Horcasitas honked at a car at a red light after the driver cut in front of him. The driver, 37-year-old Chris Pelkey, exited his vehicle, walked toward Horcasitas, and threw up his hands. Horcasitas then shot Pelkey in the chest, and Pelkey was pronounced dead at the hospital.
Horcasitas was tried and found guilty of reckless manslaughter. At Horcasitas’s sentencing hearing, Judge Lang—without providing notice to defense counsel—allowed Pelkey’s family to play an AI-generated video scripted and produced by Pelkey’s sister and brother-in-law, tech entrepreneurs who have invested in and helped develop AI companies. Afterwards, Lang said he “loved” the presentation and thanked Pelkey’s family for providing it. “As angry as you are, and as justifiably angry as the family is, I heard the forgiveness,” Lang said. “I know Mr. Horcasitas could appreciate it, but so did I.”
Clip via YouTube
The use of AI in the law has already caused problems in courtrooms. Law firms have been sanctioned for submitting AI-generated legal briefs full of fictitious citations to made-up law. Startups have proposed AI-powered legal representation and earned rebukes from judges for its unsanctioned use in court. AI has also caused headaches in the licensing of lawyers after the State Bar of California administered a bar exam that was partially written by AI and contained significant errors.
The use of AI-generated victim statements at sentencing hearings, though, takes things to a new level, reducing grief to spectacle. It allows families to preserve and project a version of their loved one that might play well in court but may not reflect the person’s wishes, or even reality. In the process, these statements hand families a powerful tool to influence sentencing, at the risk of violating defendants’ rights. By allowing family members to script idealized presentations of victims and portray them as if they accurately reflect the dead person’s thoughts and feelings, courts risk injecting significant bias against defendants into sentencing.
Judge Lang’s statement to Pelkey’s family highlights how easily an AI-generated victim impact statement (VIS) can be used manipulatively. Members of Pelkey’s family, including the sister who helped create the AI statement, offered their own statements calling for the maximum sentence; then the AI version of Pelkey called for forgiveness. This contrived display of magnanimity helped Pelkey’s family get their way: Judge Lang handed down the maximum sentence of ten and a half years, even though the prosecution requested only nine. “You demanded the maximum sentence. And even though that’s what you wanted, you allowed Chris to speak from his heart as you saw it,” Lang said. “I didn’t hear him asking for the maximum sentence.”
Clip via YouTube
Victim impact statements, which debuted in California in the 1970s, are a way for victims or their families to tell the court how a crime affected them. The movement behind these statements was driven by an unusual coalition of women’s rights advocates and civil rights activists, who sought to allow the victims of race- and gender-based crimes to have agency in the resolution of cases affecting them, and tough-on-crime conservatives, who were all too happy to capitalize on this movement to put people away for longer.
In Payne v. Tennessee in 1991, the Supreme Court upheld the constitutionality of VISs, overturning a decision from just four years earlier. This shift was less about a change in legal reasoning and more about the composition of the Court, after Justices Anthony Kennedy and David Souter replaced Lewis Powell and William Brennan, who had both voted against allowing VISs in previous cases.
Since then, VISs have become regular features of criminal sentencing, but for the social justice advocates who championed them, the gains have been mixed. Crime victims are now updated about the status of relevant criminal cases and have the chance to be heard at sentencing, but research has shown that these interventions address only a small fraction of survivors’ needs. For most survivors, the overarching reason to participate in criminal proceedings is to have their pain acknowledged publicly and to be supported by the community. Participation in criminal trials can also be retraumatizing and can deepen a sense of lost agency. Aside from a small number of courts that incorporate restorative justice practices like victim-offender mediation or restorative conferencing, very few courts offer trauma-informed interventions that can address victims’ serious, non-retributive needs.
But for law-and-order types, VISs have succeeded in pushing for harsher penalties. Studies have shown that judges are often deeply influenced by emotional narratives and their personal feelings about litigants appearing before them. Researchers have also demonstrated that powerful VISs can lead to longer sentences, particularly when the speaker is perceived as sympathetic.
And now, AI can simulate those sympathetic voices with uncanny precision, as Judge Lang acknowledged after hearing from the AI version of Chris Pelkey. “I feel like calling him Christopher as we’ve gotten to know him today,” Lang said. “I feel that that was genuine.”
Many victims don’t seek vengeance. What they want is to repair the harm, to be heard, and to find a way forward. But our current legal system offers them only one pathway: the punitive route. If victims want to be acknowledged, they must seek that acknowledgment within a framework that treats longer prison terms as a form of respect.
We must be wary of letting emotional performances, especially technologically enhanced ones, dictate the fate of those who stand trial. As AI continues to evolve, so too must our approach to justice. The answer is not more punishment disguised as empathy, but a system that centers the needs of victims through practices rooted in humanity, not holograms.