LOS ANGELES (AP) — For hours, motion-capture sensors strapped to Noshir Dalal's body tracked his movements as he unleashed aerial attacks, overhead strikes and, later, one-handed attacks that would appear in video games. He swung a sledgehammer so many times that he eventually ruptured a tendon in his forearm. By the end of the day, he couldn't even pull the door handle of his car open.
The physical demands of this kind of motion work, and the time it takes, are part of the reason he believes all video game performers should be equally protected from the unregulated use of artificial intelligence.
Video game performers worry that AI technology could be used to replicate their performances in other projects without their consent, reducing or eliminating their job opportunities — concerns that prompted the Screen Actors Guild-American Federation of Television and Radio Artists to strike in late July.
“If motion capture actors, and video game actors in general, can only make what they make that day, that's a really dangerous path to walk,” said Dalal, who played Bode Akuna in “Star Wars Jedi: Survivor.” “Instead of saying, 'Hey, I'm going to bring you back to life,' they're not going to bring me back to life at all, and they're not even telling me that they're doing it. That's why transparency and compensation are so important to us when it comes to AI protections.”
Hollywood video game performers have announced their second strike in a decade after more than 18 months of negotiations with the gaming industry's biggest companies over a new interactive media contract broke down over protections for artificial intelligence. Union members say they are not against AI, but performers worry the technology could give studios a way to take away their jobs.
Dalal said he took it personally when he heard that video game companies, in negotiating a new contract with SAG-AFTRA, wanted to treat some movement work as “data” rather than performance.
If gamers add up all the cutscenes they watch in a game and compare that to the time they spend controlling characters and interacting with non-player characters, they'll find they're “exposed to the work of movers and stuntmen far more than they are to my work,” Dalal said.
“They're the ones selling the world: doing combos, doing crazy super cool moves using Force powers, playing as Master Chief, flying around the city as Spider-Man,” he said.
Some actors argue that AI could deny inexperienced actors the chance to land small background roles, such as non-player characters, in which they typically gain experience before moving on to bigger parts. Unrestricted use of AI could also raise ethical problems if performers' voices or likenesses are used to create content they morally object to, actors say. That kind of dilemma has already surfaced with game “mods,” in which fans modify a game's content to create something new. Last year, voice actors spoke out against mods for the role-playing game Skyrim that used AI to replicate their voices and generate performances for pornographic content.
In video game motion capture, actors wear special Lycra or neoprene suits fitted with markers. They perform basic movements like walking, running and grabbing objects, in addition to more complex interactions. Animators then take these motion capture recordings and stitch them together so that characters respond to the actions of the person playing the game.
“AI is enabling game developers and game studios to automatically generate large amounts of animation from past recordings,” says Brian Smith, an assistant professor in Columbia University's Computer Science Department. “Studios no longer need to collect new recordings for each type of game or animation they want to create. They can also use their archives of past animation.”
If a studio has accumulated motion capture from previous games and wants to create a new character, he said, animators can use those archived recordings as training data.
“With generative AI, you can generate new data based on patterns in past data,” he said.
Audrey Cooling, a spokeswoman for the video game companies, said the studios had presented “meaningful” AI protections, but SAG-AFTRA's negotiating committee said the studios' definition of who constitutes a “performer” will be key to understanding who will be protected.
“We have worked hard to offer reasonable terms that protect the rights of performers while allowing us to continue using cutting-edge technology and provide a great gaming experience for fans,” Cooling said. “We have proposed terms that provide consent and fair compensation to everyone employed under the (contract) if an AI facsimile or digital replica of their performance is used in the game.”
She said the companies have offered wage increases of an initial 7% in scale rates followed by an additional 7.64% increase starting in November, which amounts to a 14.5% increase over the life of the contract. The companies have also agreed to higher per diems, accommodation and travel allowances, as well as increased overtime pay and bonuses, she added.
“Our goal is to reach an agreement with the union and end this strike,” Cooling said.
A 2023 report on the global games market from industry research firm Newzoo predicts that video games will include more AI-generated voice acting, such as the voices in Squanch Games' “High on Life.” Developers will increasingly use AI to generate voices, reducing the need to cast voice actors, according to the Amsterdam-based firm.
“Opportunities for voice acting may decrease in the future, especially as game developers use AI to reduce development costs and time,” the report said, noting that “classic AAA games like The Last of Us and God of War use motion capture and voice acting, similar to Hollywood.”
Other games, such as “Cyberpunk 2077,” feature performances by well-known actors.
Actor Ben Prendergast said the data points collected through motion capture don't capture the “essence” of an actor's performance. The same goes for AI-generated voices, which can't convey the subtle choices an actor makes in a big scene or in smaller, more painstaking work, like screaming for 20 seconds as a character dies in a fire.
“The big problem is that someone somewhere has this wealth of data and I have no control over it,” said Prendergast, who voices Fuse in the “Apex Legends” game. “Someone, malicious or otherwise, could get hold of that data right now and say we need a character who's nine feet tall and looks like Ben Prendergast and can fight in this fight scene. And I have no idea that that's happening until the game launches.”
Unless SAG-AFTRA can secure the AI protections it seeks, studios “can get away with it,” he said.
“It reminds me a lot of sampling in the '80s, '90s and 2000s, when a lot of people were sampling classic songs,” he said. “It's an art form. If we don't protect people's right to their image, their voice, their body and their walk, we can't protect them from other things.”