Former CNN White House correspondent Jim Acosta has ignited debate after conducting an interview with an AI-generated avatar of Joaquin Oliver, a victim of the 2018 Parkland school shooting. In the video, Acosta speaks to a digital version of Joaquin—animated from a real photo and powered by generative AI—asking him, "What happened to you?" The avatar delivers a brief, robotic account of his death due to gun violence, with jerky movements and a flat, computerized voice, per the Guardian.
"I was taken from this world too soon due to gun violence while at school," the AI teen says. "It's important to talk about these issues so we can create a safer future for everyone." Joaquin, who was 17 when he died, would have turned 25 this week. The AI version was created by Joaquin's parents, who invited Acosta—now an independent journalist producing content on Substack—to be the first reporter to interview the avatar. Acosta described the experience as "a beautiful thing," while Joaquin's father said that hearing his son's voice again, even through AI, felt like a blessing and gave him hope for future uses of the technology.
The segment quickly drew criticism online, per HuffPost. Detractors argued that Acosta could have spoken with living survivors rather than a fabricated digital re-creation, raising concerns about authenticity and ethics. "Hey Jim. Quick question. What the f--- is wrong with you," one disgusted commenter wrote on Bluesky. Another take: "This is unconscionable, ghoulish, and manipulative. How dehumanized do you have to be to think this was a good idea?"
AI has been used to simulate deceased Parkland victims before: Last year, families launched an AI-powered robocall campaign called The Shotline, using re-created voices of the slain to urge lawmakers toward gun reform. The rise of AI avatars capable of simulating real people—especially those who've died—has generated controversy. Critics warn that such technology blurs the line between truth and fiction, opening doors to scams, misinformation, and deepfakes.