In what’s believed to be a world first, artificial intelligence (AI) has allowed a slain man to address his killer at the sentencing hearing.
Christopher Pelkey was shot dead in a road rage incident in Chandler, Arizona, four years ago. Recently, AI was used to recreate a digital likeness of the victim, which was allowed to deliver a statement during court proceedings, a local news site reported.
The video presentation also included real clips of Pelkey to give those in court a clearer understanding of his personality. Some of these clips were also used to create the AI-generated likeness of Pelkey.
In the video played in court, the AI version of Pelkey says: “To Gabriel Horcasitas, the man who shot me — it is a shame we encountered each other that day in those circumstances.”
He continues: “I’m a version of Chris Pelkey recreated through AI that uses my picture and my voice profile. In another life, we probably could’ve been friends. I believe in forgiveness and in God who forgives. I always have and I still do.”
After watching the video, Judge Todd Lang said: “I love that AI. Thank you for that. I felt like that was genuine, that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about today.”
The judge then sentenced Horcasitas to ten and a half years in prison for Pelkey’s manslaughter.
It was Chris Pelkey’s sister, Stacey, who came up with the idea to use AI to create a likeness of her brother for use in court. She said it was important “not to make Chris say what I was feeling, and to detach and let him speak because he said things that would never come out of my mouth, but that I know would come out of his.”
Ann A. Scott Timmer, Chief Justice of the Arizona Supreme Court, commented that AI has the potential “to create great efficiencies in the justice system and may assist those unschooled in the law to better present their positions. For that reason, we are excited about AI’s potential.”
Timmer added: “But AI can also hinder or even upend justice if inappropriately used. A measured approach is best. Along those lines, the court has formed an AI committee to examine AI use and make recommendations for how best to use it … Those who use AI — including courts — are responsible for its accuracy.”
Indeed, while the use of AI in this way brings a powerful and deeply personal element to court proceedings, it also raises ethical and legal concerns about authenticity, emotional influence, and appropriate application. As a result, other courts that choose to allow AI-generated victim statements will likely need to develop guidelines for future cases.