The 30-second movie trailer has all the hallmarks of a Hollywood epic: a lonely hero emerging from a space station and exploring a barren planet, stumbling across a UFO. The work of hundreds of specialist visual effects artists and crew? In reality, not one frame was made by a human.
The trailer was created by Sora, a new artificial intelligence (AI) model that can create realistic and complex videos from just a text prompt. Launched in February 2024, it is the latest release from OpenAI, the owners of heavyweight chatbot ChatGPT.
This new technology is causing shockwaves across the film industry. In the United States, prolific producer Tyler Perry (the Madea film franchise, A Fall from Grace) paused his $800 million studio expansion in reaction to the software. Peter James, co-chair of the Australian Cinematographers Society’s (ACS) AI Committee, says the exponential growth of AI technology could threaten film jobs locally and internationally.
James has worked as director of photography on over 30 films including 2018’s Ladies in Black, which employed over 250 Australian cast and crew. He says experienced camera workers are worried their work will no longer be able to support their families.
“I’ve got people asking me, ‘Should I get a lawnmowing franchise?’,” he says.
According to 2021–2022 employment statistics from the Australian Bureau of Statistics, Australia’s film industry employed over 30,000 workers – 7761 in Victoria – in the production and post-production sectors and added nearly $2 billion to the Australian economy.
However, the economic success comes at a high price for some who work in Australia’s largest creative industry. A 2022 report commissioned by the ACS revealed cinematographers and camera workers face chronic employment and income insecurity. Contracts for camera workers may be as short as one day, according to the report, a problem compounded by informal hiring practices that those in the industry say normalise gender, age and racial discrimination.
The latest AI innovation adds another layer of precarity for these workers. A 2023 report from Goldman Sachs forecast that 26 per cent of entertainment industry jobs could be exposed to AI replacement.
Last year’s extended Hollywood strikes by writers and actors made the use of generative AI a key issue. That, however, has not stopped Hollywood inching towards AI adoption. Marvel Studios used generative AI to create the title sequence for its 2023 series, Secret Invasion. In late March, Sora was used to make a short film, Air Head, about a man with a balloon for a head. According to Bloomberg, OpenAI has already scheduled meetings with Hollywood executives to pitch the software.
It is not just the camera crew on large Hollywood productions that will be affected by video AI. Adam Camporeale, owner of two small production companies in Melbourne and Adelaide – Drive-Thru Pictures and Passel Media – says Sora will impact workers in the film industry and people will lose jobs.
While the software is currently only available for selected testers, Camporeale predicts, “we’re underestimating how quickly this will become something we can use and pay for.”
Cinematographer Peter James fears technology like Sora will stop camera workers from learning their craft and honing their skills on a mix of small- and big-budget productions over a career.
“If you take away some of the smaller jobs, we’re going to end up with a skills shortage.”
Vahid Pooryousef, a PhD candidate in human computer interaction at Monash University, says video AI models are improving at extraordinary speed and Sora represents the biggest step yet in video AI.
The technology is a noteworthy leap from previous video AI models, like Meta’s Emu and Google’s Lumiere. Both were released only months ago and produced short videos with noticeable hallucination, poor physics, and low image quality. Sora improves significantly on both, generating longer (up to one minute), higher-quality videos with fewer noticeable errors.
After OpenAI released its technical report introducing Sora on 15 February 2024, CEO Sam Altman took to X (formerly Twitter) and used Sora to create videos based on user suggestions. Responding to a prompt asking for an instructional cooking session for homemade gnocchi hosted by a grandmother, Altman posted a video that showed just that. The clip had more noticeable distortion and errors than Sora’s promotional videos, such as a gravity-defying rolling pin, but remained convincing.
Camporeale says, instead of producing “the highly creative stuff which we love”, his companies will have to focus on creating what AI technology cannot, such as authentic footage of the real world. He suggests wedding videos, event videography, or tourism videos, where customers want to see real events and locations, would be a likely focus for some production companies.
Stock image companies like Adobe Stock and Shutterstock – which has a commercial partnership with OpenAI – already sell generative AI-developed photos and graphics. Camporeale says that there will be a market for acquiring generic stock footage developed by AI. This could range from historical footage of the California gold rush to realistic drone footage of Italian churches.
Huge volumes of existing content are required to “teach” AI to perform a task, causing controversy about the rights of the original creators. OpenAI has remained tight-lipped about what data Sora was trained on, but James says plagiarism and copyright are major concerns of his.
“Everything that is put into the AI programs is the work of someone else,” he says.
Video AI will have broad impacts beyond video workers, James predicts. Like generative AI imagery and deepfakes before it, video AI has troubling implications, including for citizens’ ability to distinguish facts from health or political misinformation. “It’s going to be a very distressing time for everybody,” James says.
He says that the Australian Government is lagging on AI regulation, and even though “the genie is out of the bottle”, AI needs to be managed to ensure voters’, consumers’, workers’ and artists’ rights are protected.
After consulting on “safe and responsible AI in Australia”, the Federal Government released an interim response in January that admitted:
“existing laws likely do not adequately prevent AI-facilitated harms before they occur, and more work is needed to ensure there is an adequate response to harms.”
On 26 March, the Senate established a Select Committee on Adopting Artificial Intelligence to “inquire into and report on the opportunities and impacts for Australia arising out of the uptake of AI technologies in Australia”. Chaired by Labor Senator Tony Sheldon, with Greens Senator David Shoebridge as deputy chair, the committee is due to report to Parliament by 19 September 2024.