Technology across the world progresses at an exponential rate. Moore's law, which observes that the number of transistors in an integrated circuit doubles roughly every two years, has come to stand for this rapid pace of innovation, which affects everything from computer processing speeds to surgical procedures to the way cities are run. These innovations can change everything about how we interact with the world. One only needs to look at the rise of the smartphone, a product that barely existed 15 years ago, to see how quickly new technologies can change the status quo of how life is conventionally lived.
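The doubling claim is easy to make concrete. Here is a minimal sketch, taking the Intel 4004's roughly 2,300 transistors (1971) as an assumed starting point and compounding a fixed two-year doubling period; the function name is ours:

```python
def transistor_count(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count under Moore's-law-style doubling."""
    return start_count * 2 ** (years / doubling_period)

# Roughly 2,300 transistors on the Intel 4004 in 1971;
# projecting forward 50 years at a two-year doubling period
# lands around 77 billion, in the ballpark of today's largest chips.
projected = transistor_count(2_300, years=50)
print(f"{projected:,.0f}")
```

The point is not the precise figure but the shape of the curve: 25 doublings turn thousands into tens of billions.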
The same principle applies to film. Whether it's the rise of de-aging technology or streaming services, more sophisticated tracking rigs, more realistic CGI, more powerful cameras or the rise of franchise storytelling, what counts as a normal part of the cinema-going experience changes rapidly from year to year. Below are five innovations that could make cinema-going a radically different experience by 2030.
Convincing De-Aging Technology
So far, de-aging technology has been a little hit and miss. While Robert De Niro puts in a stellar performance as a younger man in The Irishman and Samuel L. Jackson has a lot of fun revisiting his '90s era in Captain Marvel, convincingly making actors appear more than thirty years younger remains difficult to master. Nonetheless, the technology is evolving quickly, moving from one-off moments, such as the brief flash of a younger Princess Leia in Rogue One: A Star Wars Story, to sustaining entire movies, as Captain Marvel demonstrates.
Additionally, due to its exorbitant cost, de-aging technology is currently the sole province of big-studio efforts. Yet, like computer-generated imagery, which indie productions can now experiment with, the technology will become progressively cheaper and more available to independent studios. We can therefore expect de-aging to become more commonplace while becoming even more convincing in the process.
More Advanced 3D Technology
To see the future of cinema, it often helps to look at what James Cameron is working on. Avatar, released all the way back in 2009, was a breakthrough in 3D projection, truly immersing viewers in the unique world of Pandora. The famed auteur is still working on a sequel, taking his time to make sure that the technology is sophisticated enough to carry the story he wants to tell.
One innovation we can expect here is even more advanced 3D technology. He told a crowd at the Vivid Light Festival in Sydney in 2018: "That includes collaborating with the people at Dolby Cinema, who have developed high dynamic range projection that could put 16 foot-lamberts of light on a 3D screen through the glasses, which is revolutionary. Normally, you're looking at about three foot-lamberts." He has even teased the possibility that later sequels will do away with pesky 3D glasses entirely. For people like myself, who have to wear 3D glasses over their own glasses at the cinema, this could be as convenient as it is immersive.
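Cameron's figures are easy to put in perspective with a quick conversion sketch. The foot-lambert-to-nits factor is a standard one (1 fL ≈ 3.426 cd/m²); the function name is ours:

```python
FL_TO_NITS = 3.426  # 1 foot-lambert ≈ 3.426 candelas per square metre (nits)

def foot_lamberts_to_nits(fl: float) -> float:
    """Convert screen luminance from foot-lamberts to nits."""
    return fl * FL_TO_NITS

conventional_3d = foot_lamberts_to_nits(3)   # ≈ 10.3 nits through the glasses
dolby_hdr_3d = foot_lamberts_to_nits(16)     # ≈ 54.8 nits through the glasses
print(f"{dolby_hdr_3d / conventional_3d:.1f}x brighter")  # ratio is simply 16/3 ≈ 5.3
```

In other words, the Dolby figure Cameron cites is more than five times the brightness 3D audiences usually get.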
Holographic Performances
While holographic performances have been explored in science fiction for some time, most hauntingly in Blade Runner 2049, they will soon become a reality. Just last year none other than James Dean, who died all the way back in 1955, was cast as the lead of Finding Jack, with the film's directors telling The Hollywood Reporter that after searching "high and low for the perfect character… we decided on James Dean".
Despite widespread horror and objections, the iconic actor will be resurrected through archival footage and photos while another actor voices him, setting a scary precedent for the films to come. It means that while actors may die in body, their image can live on in perpetuity. Expect the likes of Marilyn Monroe, Audrey Hepburn, Humphrey Bogart and Steve McQueen, among many others, to potentially follow.
More Accessible Motion Control Cameras and Rigs
It used to be that complex tracking shots, or having the same actor appear in multiple places within the same frame, could only really happen in a big-budget production. Yet with cameras becoming exponentially cheaper and more powerful, indie filmmakers are now able to set up and execute these complex camera shots themselves.
Motion-control rigs move a camera along a programmed, precisely repeatable path, executing sophisticated shots without the need for a human operator. The first such systems were built in the late '70s for Star Wars, so the technology has been around for a while. What will change in the '20s is its accessibility: inventor Howard Matthews recently managed to build his own DIY motion-control rig. Expect indie and low-budget productions to be even slicker and more impressive in the years to come.
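As a rough, hypothetical illustration of what a motion-control system does under the hood, the sketch below linearly interpolates a camera's pan angle between programmed keyframes, which is the basis of a precisely repeatable move (all names and values here are invented for illustration):

```python
from bisect import bisect_right

def interpolate_pan(keyframes, t):
    """Linearly interpolate a pan angle (degrees) at time t (seconds)
    from a sorted list of (time, angle) keyframes."""
    times = [k[0] for k in keyframes]
    if t <= times[0]:
        return keyframes[0][1]
    if t >= times[-1]:
        return keyframes[-1][1]
    i = bisect_right(times, t)            # index of the keyframe after t
    (t0, a0), (t1, a1) = keyframes[i - 1], keyframes[i]
    return a0 + (a1 - a0) * (t - t0) / (t1 - t0)

# A programmed move: sweep to 45° over two seconds, hold, then sweep to 90°.
move = [(0.0, 0.0), (2.0, 45.0), (5.0, 45.0), (8.0, 90.0)]
print(interpolate_pan(move, 1.0))  # 22.5 — halfway through the first sweep
```

Because the same keyframes always yield the same positions, the rig can repeat the move take after take, which is exactly what makes effects like one actor appearing twice in frame possible.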
Data-Driven Decision-Making
Data is everything now: by some estimates, the amount of data in the world doubles every two years. As a result, business decisions are increasingly based on crunching the numbers and feeding them into an algorithm. The film world is no exception: Warner Bros. signed a deal with AI firm Cinelytic to guide everything from whether to hire a certain star to how much money a movie could make at the box office.
This can be particularly helpful in the fast-paced world of film festivals, with distributors able to analyse the potential of a movie quickly through the platform, potentially saving them from a costly decision. According to Cinelytic founder Tobias Queisser, "The system can calculate in seconds what used to take days to assess by a human when it comes to general film package evaluation or a star's worth". Nonetheless, such tools will likely remain an aid to creativity rather than a replacement for it: after all, there is no one formula for creating the perfect film.
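Cinelytic's actual models are proprietary, but the general idea of scoring a film package can be sketched as a toy weighted sum; the features, weights, and function name below are entirely invented for illustration:

```python
def package_score(star_power: float, genre_demand: float, budget_fit: float,
                  weights=(0.5, 0.3, 0.2)) -> float:
    """Toy score for a film package; each input is normalised to the 0-1 range.
    Purely illustrative: a real system would learn its weights from
    historical box-office data rather than hard-coding them."""
    return sum(w * x for w, x in zip(weights, (star_power, genre_demand, budget_fit)))

# A hypothetical package: strong lead, middling genre demand, sensible budget.
print(round(package_score(0.8, 0.6, 0.9), 2))  # 0.76
```

Even this trivial version shows why such tools are aids rather than oracles: the output is only as meaningful as the features and weights someone chose to put in.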