Will generative AI be used in VFX? We ask artists working in the industry
The spectre of AI in the VFX industry is growing, but is it really viable? CG tools and software, like Cinema 4D and Houdini, have always aimed to enable artists to do more in less time. With Autodesk, the maker of 3ds Max, researching its own AI tools, it feels unavoidable that generative AI will be used in the VFX industry.
But how, when and why will AI be used for creating visual effects and 3D animation? As part of our AI Week I’ve put together a snapshot of what the industry thinks, by asking those working in it for their views. To be clear, the experts and artists I’ve asked aren’t using AI in their daily work and the studios they work for aren’t using AI commercially. None of the images, shows or films in this article, or worked on by these artists, were created using AI.
Below you can read thoughts from a mix of professionals working today in VFX, film and TV. You can also read my interview with Chaos Group’s chief product officer Kam Star on plans to integrate AI into apps such as V-Ray, which suggests AI in VFX is inevitable; how it’s used is the important topic of discussion.
AI and VFX: Dean Koonjul, Union VFX
DFX supervisor Dean Koonjul from Union VFX has just wrapped a talk on creating the incredible VFX and SFX for the Oscar-winning movie Poor Things when we get chatting about AI and filmmaking behind the scenes at the recent VFX Festival.
“I’m not really sure,” he says thoughtfully as I ask how he thinks AI could impact the film industry. “I think AI is on the cusp of being the most… of having the biggest impact,” says Dean, before adding how the combination of AI with Unreal Engine-powered virtual production could become “more accessible and more widespread within the visual effects industry”.
He jokes, “I wouldn’t be surprised if there are brand new technologies that started emerging within the next five years, the curve seems to be exponential”.
AI and VFX: AJ Sciutto, Magnopus
Magnopus created the virtual and physical sets and VFX for Amazon’s Fallout TV series, crafting huge LED Volume Stage scenery using Unreal Engine 5. Does director of virtual production AJ Sciutto see a place for AI in filmmaking? When it comes to the drudgery, possibly.
“If you click on something 20 times to accomplish a task, an AI tool that comes in to help solve that may require only five clicks instead of 20. Over time, over the development of an asset, that can help be a huge gain in efficiency,” explains AJ. “If using AI to help with creative tasks, what I’ve found is that it helps the artist work at a much higher level more efficiently and faster by removing the tedious start up process and allowing the artist to focus on creative output.”
AJ explains his team has been looking at, and experimenting with, AI, but it’s not been used in production. That use case is framed by the need to make artists’ lives easier. He explains: “Some of the AI tools that we’ve been playing with in testing, to see if it’s worthy to add to a pipeline, has essentially been for the sole purpose of making the artist’s job more efficient, but nothing I’ve seen has replaced an artist yet in any of this pipeline.”
AJ offers an example of when AI could be useful: when you have people in a room who need to visualise a scene, a moment in a film, and everyone needs to get on board early and fast. He suggests the production team can look over a script and scenes, and use AI to quickly visualise ideas; decisions are easier when there’s a visual to rally around and agree on.
He says: “AI can help to create at least a baseline where it’s like, ‘okay, we’re talking in the same ballpark’. Let’s now take that to the concept artist; the concept artist is still very much in the game, they still have to design the very specific elements that are required for that script that an AI just won’t pick up on. But they’re now going to have less iterations to do because everyone was already on the same page when that got kicked off to them.”
According to AJ there are limitations to AI that mean it can’t match a human artist, such as the subtlety of ideas in concepts and designs. He tells me: “It won’t be able to take the nuance from three pages of the script earlier, apply that detail to make that one reference, to that one object that should be in the corner, because it was referenced in the script two pages later; all that needs to still be done by humans.”
AI and VFX: Elliot Stammers, DNEG Animation
Animation is another sector where AI is making a big impression. At the recent VFX Festival, I met up with Elliot Stammers, lead character TD at DNEG Animation for Paramount’s Under the Boardwalk (again, to clarify, not made with AI). He says, “I find it quite an exciting technology”, but clarifies he’s “not an expert”, before going on to describe how, just as 3D animation has – “sadly” – risen to dominate traditional animation, “[AI] is just a new tool, we just have to adapt to harness it, to use it effectively”.
He tells me, “The impression I get is a good job can be done pretty quickly with AI, but often our clients are interested in getting a very specific look and that is something that is very difficult to achieve with AI”.
Elliot tells me he’s not too concerned about AI use because the level of control animators and directors need will “always require a human element”, adding, “and I don’t see that changing any time soon”.
Specifically in animation, Elliot explains how he was impressed with the use of AI in Spider-Man: Across the Spider-Verse to clean up the in-betweens on the traditional animation – “and that makes a lot of sense in this kind of guided application,” he says (note, this is not ‘generative’ AI).
AI and VFX: James Pollock, Lux Aeterna
Charged with researching how VFX studio Lux Aeterna can make use of AI and machine learning in its workflows, creative technologist and VFX artist James Pollock has been looking into what the future holds. But he says that before any AI tools can be used commercially, the industry needs to “tackle some of these bigger questions of ethics, the legal standards, and around using some of these tools”. (Read our article on the issues with AI.)
Where James sees a future for AI in VFX is where it can help “take a lot of the tedious and boring tasks out of the workflow”, and the tools that let “artists focus on art” will be the ones to succeed. On my visit to Lux Aeterna I discovered how some specific and ethical uses of AI, such as de-noising renders, have enabled the team to speed up its workflow.
The tricky aspect is the “balancing act” that comes as more generative AI is released, says James. This is “more complex” because it leads to questions around how to “maintain artistic intent and how [gen AI] affects peoples’ abilities to do a job”.
James tells me how the best uses of AI are the ones that integrate into an artist’s workflow but don’t replace an artist (a common theme is emerging). He explains: “So they’re opening up the same software they always open up to do those things, and now there’s something that has been powered-up by AI and is making the act of doing something, like rotoscoping, it’s making that a lot quicker”. (We can see this integrated use with AI tools now in Photoshop.)
The team has experimented with using Stable Diffusion, again not for commercial or client projects, and did so by running it locally on their own machines, building in different extensions and investigating ways to use it. “In a way, that felt like the prototype for what AI needs to be,” comments James. “It should be something where you can customise it and build around it in a way that makes sense for the projects that you’re working on.”
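To give a sense of what this kind of local experimentation involves, here’s a minimal sketch of running Stable Diffusion on your own machine with the open-source Hugging Face diffusers library. The model checkpoint, prompt and settings are illustrative assumptions on my part, not details of Lux Aeterna’s own setup, which the studio hasn’t shared.

```python
# Minimal local Stable Diffusion sketch using the Hugging Face diffusers library.
# The model ID, prompt and output filename are illustrative only; this is not
# Lux Aeterna's actual pipeline or configuration.
import torch
from diffusers import StableDiffusionPipeline

# Download (or load from the local cache) a Stable Diffusion checkpoint
# and move it onto the GPU for faster inference.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Generate a single rough concept image from a text prompt.
image = pipe(
    "rough concept art of a derelict lighthouse at dusk, moody lighting",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]

image.save("concept_v001.png")
```

Because everything runs locally, a studio experimenting this way can swap in its own checkpoints and extensions and keep imagery off third-party servers, which is part of what makes it a useful prototype rather than a production tool.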
For now, James tells me, this kind of experimentation is where AI stays, because until it “matures in an ethical and legal sense” it isn’t usable commercially for a VFX studio like Lux Aeterna. There are concerns around copyright and ownership, and James reveals many streamers and studios won’t sign off on projects made using broad generative AI, so for now it’s not being used. “And of course,” says James, reflecting on Disney’s AI-created intro to Secret Invasion, “audiences just don’t like it”.
AI and VFX: Evan Biswanger, Vancouver Film School
While those working in VFX and filmmaking are grappling with the impact of AI, balancing the pros and cons of this new technology, those training the artists of the future also need to confront it.
Evan Biswanger is director of creative, marketing and communications at the esteemed Vancouver Film School, and he describes how the school is tackling AI rather than ignoring it. He explains: “We approach the issue of AI the same way creators have always approached new tools that pop up in the creative ring – including versions of AI that have been already used for years.”
He adds: “Art in all its forms is a delicate balance of a human’s soulful expression. AI cannot replace that, only create a cheaper version of it. A cheaper version of art is nothing we at VFS are interested in. AI is a new tool with possibility in the exploring / research process only.”
Many programs at VFS cover AI and its use if there’s a benefit to the student and the creative vision, says Evan. But he adds context: “Although the developments in AI in the past two years have been substantial, let’s not forget that versions of AI usage have been present for years”.
Evan explains: “At VFS we stay true to industry standards. Cutting corners or lessening the understanding of important knowledge and process, because there is an easier way, is not the answer. If the use of a new creative tech becomes industry-standard, we will be sure to continue to best prepare our students for the industry of their choosing.”