On Feb. 14, 2018, Joaquin Oliver started another day as a senior at Marjory Stoneman Douglas High School in Parkland, Fla. By the end, he was one of 17 people murdered at the school in a mass shooting that sparked a worldwide, youth-led movement on gun violence.
Now, people can hear his voice again.
“It’s been six years, and you’ve done nothing,” says a voice that resembles Oliver’s. “Not a thing to stop all the shootings that have continued to happen since.”
Oliver’s audio is one of six messages generated by artificial intelligence to resemble the voices of individuals killed by guns in different incidents over the past decade. It’s part of an initiative led by March For Our Lives, the gun control organization born out of the Parkland shooting, and Change The Ref, a group started by Oliver’s parents, vocal advocates Manny and Patricia Oliver.
The messages will appear on the Shotline, an online platform that the groups created, where users can individually send the AI-generated audio directly to the offices of members of Congress, demanding further action on gun violence prevention.
“I’m back today because my parents used AI to recreate my voice to call you,” Oliver’s message continues. “Other victims like me will be calling too, again and again, to demand action. How many calls will it take for you to care? How many dead voices will you hear before you finally listen? Every day your inaction creates more voices. If you fail to act now, we’ll find somebody who will.”
To some AI experts watching how the controversial technology is being used in political spaces, the decision by Change The Ref and March For Our Lives to use AI is a striking one. The campaign also marks a notable shift in tone from how March For Our Lives has commemorated the anniversary over the past half-decade.
“We have to interrupt people’s regularly scheduled programming as a movement to get their attention,” said David Hogg, the co-founder of March For Our Lives and a survivor of the Parkland shooting.
“And we have to use all the tools that we can at our disposal in an ethical way, of course, to get their attention in the first place. And if that means using AI to simulate the voices of people that have been stolen by gun violence, then so be it,” he said.
Hogg explained that typically, on the anniversary of the shooting, March For Our Lives tries to respect the wishes of parents from Parkland, and working with Change The Ref is part of that.
Change The Ref has used AI to replicate Oliver’s voice in the past – in 2020, his parents worked to create a video of their late son speaking about the importance of voting. They’ve also recently launched a campaign that uses AI-generated images to depict Republican leaders as young children in school shootings.
But the launch of the Shotline comes as questions over the ethical use of AI continue to pop up in politics.
A new frontier in politics
Last week, the Federal Communications Commission announced that robocalls using AI-generated voices violate telecommunications law. The ruling came after an AI-generated robocall sounding like President Biden circulated to New Hampshire voters ahead of the primary election, telling them to stay home.
Some AI experts who have assessed the Shotline cautiously see it as ethically above board, given that the messages aren’t attempting to mislead anyone.
“I’m not saying this [initiative] isn’t complicated and we should talk and have a serious conversation about the ethics of it. But I would say this is not a negative use case,” said Hany Farid, a professor at the University of California, Berkeley who specializes in digital forensics and detecting disinformation.
The organizers behind Change The Ref worked with the victims’ families on the project, and each family consented to their child’s voice being used. And each message sent to congressional offices states that it is AI-generated.
“I think as long as there is disclosure about it, as long as they’re not trying to be deceptive, which they clearly are not,” he added, “I think it’s both powerful and I think it shows… an effective and non-nefarious use case of generative AI.”
Irene Solaiman, the head of global policy at the AI company Hugging Face, was moved by the voices highlighted in the Shotline. She told NPR that AI can be a powerful advocacy tool when used respectfully by the individuals affected. But as she continues to wrestle with what appropriate use of AI should look like, questions remain.
“There is a danger to generating representations of people who have lost their lives where the authority to control that representation may not only rest among the loved ones,” Solaiman said. “There’s no real delineation of who those loved ones are, who are the appropriate people to control the representation, and whether that control should lie in the developer, a company or the people who are distributing the voice or generated content?”
Gun control advocates look beyond
The complicated nature of the issue is not lost on the gun control advocates as they enter the AI space. But to the organizers behind March For Our Lives and Change The Ref, the campaign centers on the need to hear from victims themselves.
“We talk too much about statistics and not enough about people a lot of the time. And it’s not for a bad reason. It’s just because we care a lot about how we can end this,” Hogg said.
According to the Shotline’s page, 656 mass shootings occurred in 2023, a year in which more than 43,000 people died from gun violence; both figures align with data from the Gun Violence Archive.
“But unfortunately, statistics don’t change people’s minds,” Hogg added. “Stories do, and people do.”