4 ways games are influencing the world of cinema

Cinema

In 1982, the movie Tron was the pinnacle of digital storytelling.

It introduced audiences to an amazing new style of cinematic effects, unlike anything else shown in theaters or on television.

But perhaps even more importantly for modern filmmakers, the creators of Tron pioneered computer animation technology, which paved the way for many innovative visual effects tools.

The spectacular visual style of the movie "Tron" was actually influenced by one of the earliest and simplest commercial video games - "Pong".

For the audience and the filmmakers alike, "Tron" was similar to a video game, but in a completely new context. It was projected on a much larger screen, the graphics were breathtaking, and the film told a sophisticated story.

At the time, computer game makers could only envy the advanced visual effects and storytelling tools available to their film counterparts. Therefore, the use of computer game-style graphics and technology in a feature film was innovative and exciting.

Fast forward almost 40 years, and the tables have turned.

Video games have become a narrative medium unto themselves. Modern games boast incredibly realistic graphics, complex open worlds and deep and nuanced narratives that can compete with the best Hollywood productions.

While "Tron" once presented new ways of imagining a computer game, computer games today are the ones that drive technical and creative innovation in the field of filmmaking.

So, let's take a look at 4 tools and trends from the video game industry that have a significant impact on how filmmakers work and stay creative.

The digital cinema is here!

Learn how to create amazing scenes in Unreal Engine, the leading tool for visual productions

A film set

The gaming industry is also changing the game in cinema

Before we dive in, let's put things in perspective.

As of 2020, the video game industry is worth about $180 billion, while the global film industry totals roughly $100 billion. This means that the gaming industry is not only bigger than film, it also invests heavily in the research and development of innovative technologies: tools built for games that end up influencing the wider world of content, including television and film.

Why is it even important?

Because this massive investment allows the gaming industry to research and develop innovative storytelling tools on a scale the world of cinema cannot match. Take, for example, Fortnite, which runs on Unreal Engine, the powerful game engine developed by Epic Games.

With it, Fortnite presents huge 3D environments full of action in real time.

But Fortnite is not just a video game - it is also a virtual photo studio. In replay mode, the game offers precise control of virtual cameras that can dive deep into the action or watch it from above, capturing every victory or defeat in incredible detail.

And another interesting point about Fortnite: this game is a real money machine. Even though it's completely free, it brought in $9 billion in its first two years just through in-game purchases. And that's actually good news for filmmakers.

Why? Because the enormous financial success of Fortnite allows Epic Games to pour enormous resources into the development of Unreal Engine.

And that's great news for the film industry, because Unreal has evolved into a very powerful tool for modern video production workflows.

So how does the economic and technological power of the gaming industry affect cinema?

Stay with us to find out…

Always dreamed of making your own movie? Now it is possible!

Join an Unreal Engine development course and learn how to combine cinema and technology

1. A new stage for amazing effects

In recent years, many pioneers in high-end animation and visual effects have moved away from traditional 3D software such as Autodesk Maya and 3ds Max and switched to working with Unreal Engine.

The Mandalorian took this a step further and applied Unreal Engine to "in-camera VFX": Industrial Light & Magic (ILM) used the engine to create computer graphics (CG) background environments in real time on huge LED walls that surrounded the studio.

This allows shots to have stunning CG environments with almost zero post-production processing.

Epic Games does not stop here - the company is actively working to encourage the film industry to use its game engine. It is now adding new features, developed specifically for filmmakers, to make Unreal Engine an even more attractive tool for the film industry.

For example, the Virtual Camera plugin lets you use an iPad Pro as the viewfinder of a virtual camera. The device's position and orientation follow the user's movements in space and are transferred directly into the 3D scene.
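To make the idea concrete, here is a minimal sketch (in plain Python, not Epic's actual API) of the loop such a tool has to run: read the handheld device's pose every frame and map it onto the virtual camera's transform. DeviceTracker, VirtualScene, and the 1 m = 100 units scale are hypothetical placeholders, not real Unreal Engine classes.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in metres, in tracking space
    rotation: tuple  # (pitch, yaw, roll) in degrees

class DeviceTracker:
    """Hypothetical stand-in for the tablet's tracking data (an ARKit-style pose)."""
    def read_pose(self) -> Pose:
        # In a real tool this would come from the device's sensors every frame.
        return Pose(position=(1.2, 0.0, 1.6), rotation=(0.0, 90.0, 0.0))

class VirtualScene:
    """Hypothetical stand-in for the game engine's camera API."""
    def set_camera_transform(self, location, rotation):
        print(f"camera -> location={location}, rotation={rotation}")

def update_virtual_camera(scene, tracker, world_scale=100.0):
    """Map the tracked device pose into the 3D scene once per frame.

    world_scale converts metres to scene units (assumed here: 1 m = 100 units).
    """
    pose = tracker.read_pose()
    x, y, z = pose.position
    scene.set_camera_transform(
        location=(x * world_scale, y * world_scale, z * world_scale),
        rotation=pose.rotation,
    )

update_virtual_camera(VirtualScene(), DeviceTracker())
```

In a real workflow this update would run continuously, so the virtual camera moves exactly as the operator walks around with the device.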

"Epic is taking the success of Fortnite and pouring the money back into the game engine",

says Jim Geduldick, a cinematographer and technology expert who serves as senior vice president of virtual production at Dimension North America, an arm of London-based Dimension Studio.

"Filmmakers are looking at these toolkits and thinking, ``I used to need a crew to do this, but now I can do my blocking directly in the game engine.''

Of course, Unreal isn't the only player in the field. The Unity engine offers many of the same capabilities and, like Unreal, is used in a wide variety of productions.

Other game-like development environments, such as Notch and Derivative's TouchDesigner, also help bridge the real and digital worlds. These tools enable real-time set mapping for live, engaging productions, such as the nine-part behind-the-scenes mini-series for Amazon's The Boys.

The team was able to track multiple live cameras with virtual 3D backdrops, and later added elements of augmented reality in real time. The final result, directly in the camera, saved resources and time compared to traditional visual effects post-production tools.

And this technology is spreading like wildfire in scripted series as well, including Station 19, 1899, the new season of Star Trek: Discovery, and Taika Waititi's comedy Our Flag Means Death.

Almost all of it without the need for post-production processing.

To put it another way, game engines are quickly becoming one of the most important tools in filmmaking, enabling creative innovation and improving workflow efficiency.

2. Graphics processors are changing the world of cinema

So why are filmmakers only now discovering the benefits of game engines?

The Unreal engine was first released in 1998, and filmmakers have been experimenting with various kinds of real-time virtual production since at least 2006, when Weta Digital developed the virtual camera technology used in James Cameron's "Avatar".

So why is it only in recent years that virtual production techniques have really gained momentum in Hollywood?

The answer is the GPU (graphics processing unit).

It is true that Weta managed to produce enough polygons for James Cameron to block scenes in a simple virtual environment. But what ILM did in "The Mandalorian" in real time is a completely different story, and it requires much more advanced software and hardware.

Fortunately, the days of producing legendary visual effects in a dark basement are over. Today, thanks to the growing demand for video games, high-end graphics rendering is more available than ever before!

With advanced cloud solutions, filmmakers can run powerful GPU servers in the cloud as needed, all without investing in physical infrastructure.

Alternatively, for creators who need to edit video in 4K (and beyond) or handle visual effects work in-house, there is a wide variety of powerful graphics processors that cost only a fraction of the cameras we shoot with.

To understand the power of modern graphics cards, let's take a look at the workflow behind Pixar's first movie, Toy Story.

Pixar used a cluster of 117 Sun workstations (87 of them dual-processor, 30 of them quad-processor) to create the film. But with mid-1990s technology, more than 800,000 machine hours were required to render the final version of "Toy Story".
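To get a feel for what that number means, here is a quick back-of-the-envelope calculation. It assumes the 800,000 figure counts hours per machine (or per CPU) and that the farm rendered around the clock, so treat it as an illustration rather than production history.

```python
# Back-of-the-envelope look at the "800,000 machine hours" figure, assuming
# the farm ran around the clock with perfect parallelism (optimistic).
machine_hours = 800_000
workstations = 117              # 87 dual-processor + 30 quad-processor Sun machines
processors = 87 * 2 + 30 * 4    # = 294 CPUs in total

days_if_counted_per_machine = machine_hours / workstations / 24
days_if_counted_per_cpu = machine_hours / processors / 24

print(f"~{days_if_counted_per_machine:.0f} days of round-the-clock rendering (per machine)")
print(f"~{days_if_counted_per_cpu:.0f} days of round-the-clock rendering (per CPU)")
# ~285 days (per machine) or ~113 days (per CPU)
```

Either way you slice it, rendering a single 81-minute film tied up an entire room of hardware for months, something a single modern GPU handles in a very different league of time.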

Marc Van de Watering, who was a software engineer at Pixar when the film was made, says that today's video cards would make child's play of "Toy Story". "It's quite clear that modern video games have long surpassed what we could achieve in 1995," he says.

Obviously, since Toy Story, we've come a long, long way when it comes to computer graphics and animation. Today's standards are significantly higher than what was possible in the 1990s.

However, the truly amazing thing is that today's video cards are more powerful than all the computing systems that worked on Hollywood's biggest 3D animated films at the time.

If those teams had access to modern graphics processors that allow them to render and modify ideas quickly, just think how many more animated masterpieces could have been created with the same resources and talent.

The good news is that you don't have to imagine it anymore. We live in an age where film crews command almost unimaginable computational resources. Performance levels that were once the pinnacle of supercomputer technology are now found in every game console, at a ridiculously low price.

Graphics processing units (GPUs) have become an essential component in many industries, but especially in cinema.

From the early vision to the action on the screen, graphics processors are breaking boundaries and allowing us to tell stories with levels of efficiency and creativity we have never known before.

Forget the way we used to make movies!
With this technology, creators have to rethink everything: a shot is no longer just an actor on a set, but a breathtaking graphic experience.

Cinematic productions in Unreal Engine

Find out how to turn an idea into a movie with the most advanced tools

3. Worlds merge

The technologies that revolutionized the world of video games, volumetric capture and photogrammetry, are now leading to a revolution in the world of cinema as well, completely changing the rules of the game.

Unlike regular photos or videos that only capture a 2D image, these technologies use cameras with depth sensors to create 3D models of objects or scenes. These models can then be used in a virtual environment or in post-production.
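As a rough illustration of what "capturing depth" means in practice, here is a minimal sketch that back-projects a depth image into a 3D point cloud using the standard pinhole camera model. The camera intrinsics in the example are made-up values, not those of any particular depth camera.

```python
# A minimal sketch of the core idea behind depth-based capture: back-projecting
# a depth image into a 3D point cloud with the standard pinhole camera model.
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """depth: HxW array of distances in metres; returns an (N, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx            # back-project pixel columns
    y = (v - cy) * z / fy            # back-project pixel rows
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Example with a synthetic 480x640 depth map (everything 2 m from the camera).
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)                   # (307200, 3)
```

Real volumetric capture rigs do this from dozens or hundreds of viewpoints at once and fuse the results into a single model, but the basic geometry is the same.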

If this sounds like science fiction, you might be surprised to learn that volumetric capture and 3D photography (photogrammetry) are already changing real-world production workflows.

In the age of VFX, simply taking beautiful pictures is no longer enough!
For virtual actors to live alongside the real actors on set, effects teams need much more accurate information.

Forget the classic way of making movies!
In the new world, much of the work begins before a single camera starts rolling. This means the process is no longer linear, and technologies such as volumetric capture and photogrammetry help us meet these new challenges.

You don't only see it in movies!
The world of sports broadcasting was one of the first to adopt these advanced technologies. In recent years, stadiums have been equipped with special camera arrays that can capture the entire field in 3D, as if photographing it "from the inside," and produce accurate 3D imagery.

The game footage is transmitted live over fiber optic cables to powerful servers that analyze the images and convert them into volumetric data. Production teams can use this data to produce 360-degree replays, virtual camera angles, or even a player's-eye view of the game itself.

It's not quite like the games, though! Unlike game engines that render 3D in real time, volumetric capture takes a bit longer. Even with powerful computers, Intel states that creating a single 3D frame takes about 30 seconds.
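To see why that matters, here is a small worked example. The 30 fps playback rate and 10-second replay length are our own assumptions, used only to show how quickly the processing time adds up.

```python
# Why volumetric replays aren't instant: at roughly 30 seconds of processing
# per frame, even a short highlight adds up.
seconds_per_frame = 30       # Intel's figure quoted above
fps = 30                     # assumed playback frame rate (our assumption)
replay_length_s = 10         # a 10-second highlight

frames = fps * replay_length_s
processing_hours = frames * seconds_per_frame / 3600
print(f"{frames} frames -> about {processing_hours:.1f} hours of processing")
# 300 frames -> about 2.5 hours of processing
```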

Although volumetric capture does not always produce photorealistic results, its unique, slightly glitchy look is exactly what attracts some filmmakers.

In 2016, the New York production studio Scatter, together with director Alex Gibney, created something innovative!
In "Zero Days VR" they used volumetric video to portray an NSA whistleblower as a floating head that shifts between a holographic figure and a completely realistic face.

For filmmakers interested in experimenting with volumetric capture on a smaller scale, Scatter sells Depthkit, software that works with devices like Microsoft's Azure Kinect and Intel's RealSense series of depth cameras.

Director and screenwriter Neill Blomkamp ("District 9," "Elysium") has embraced volumetric capture on a larger scale. His film "Demonic" includes over 15 minutes of volumetrically captured footage.

For the scenes in which the main character of "Demonic" explores a brain simulation of her comatose mother, the actors were filmed inside a volumetric capture facility with 260 cameras. These volumetric shots were integrated into 3D environments using the Unity engine and a new technology called Project Inplay, designed for real-time playback, rendering, and even dynamic lighting of large volumetric point clouds.

Have you seen the trailer? A peek into the brain is not for the faint of heart!

Instead of perfectly photorealistic images, the film uses effects that look more like a nightmare, just what you need for a scary horror movie!

By the way, Blomkamp borrowed one more crucial element from video games: he commissioned the soundtrack from Ola Strandh, a composer best known for his electronic scores for the Tom Clancy's The Division video game series.

High-profile projects that use volumetric capture and photogrammetry are intensifying demand for these tools and attracting the attention of major companies.

This means greater investment in these technologies, with many technology companies trying to capitalize on the demand for volumetric capture and profit from the interest in virtual production.

One example is Dimension Studio, which partnered with Avatar Studios and Sabey Data Centers to establish Avatar Dimension, a new volumetric capture studio in Ashburn, Virginia. The studio plans to specialize in "corporate virtual experiences".

It is just one of five Microsoft-certified volumetric capture studios currently under construction.

Although volumetric capture and photogrammetry are sometimes associated with special, expensive projects for large companies, these technologies are not limited to that tier. Productions of any size can already start enjoying these new tools.

A great example is Object Capture, which Apple introduced at WWDC 2021.
This new API allows developers to turn virtually any Mac or iPhone into a 3D scanning tool using photogrammetry.

As the price-performance of graphics processors (GPUs) continues to improve, we can expect development trends like these to continue.

And this will bring about a significant change in the way we approach production and post-production.

4. A new way of playing

One of the effects of these new technologies is evident in the way pre-production teams work.

Many people think that virtual production is still limited to LED stages and complex volumetric capture facilities.
But as a tool for effective early-stage preparation, previs, and techvis, it can work on almost any budget.

For example, meet CineTracer, a real-time cinematography simulator.
This $90 app uses Unreal Engine to let you work on scene lighting, shot blocking, and storyboards, all within what is essentially a video game. And you can buy it on the Steam store.

CineTracer founder Matt Workman believes these types of tools will be the bridge for many to virtual production. Workflows will evolve to include these tools, and down the road, as more affordable LED stages open up in major photography centers, we'll start to see ready-to-use services offered to filmmakers. This will enable an easier transition to larger-scale virtual production tools for many teams.

"You wouldn't work on a big-budget TV series like that," he says. "But if it's a situation where you have to be on the moon, for example, you can get computer graphic backgrounds at an affordable price to get to one of these stages and shoot there. It's not like you build your own photo studio to make a commercial. You rent it. Virtual production will be…”

"If you're a small team and you want to make a movie using these technologies, you can do it without going on an LED stage. Use it for all your VFX, shot blocking, storyboards and previs, then go out and shoot traditionally.”

You don't even have to be sitting at your workstation to use Ghostwheel's Previs Pro, an iOS app that creates storyboards with virtual cameras, lights, characters, and scenery in 3D environments. It even has an augmented reality mode that helps you visualize your scene in real space.

Certainly, these powerful and affordable tools open up a whole new range of possibilities for film and video professionals. However, some of the innovations that have come from the gaming world are less visible and work below the surface.

Upgrade your productions with innovative technology!

Join the Unreal Engine course and learn how to combine cinematic visualization with game development

The next step in the world of cinema

In the Gothenburg Film Festival's 2021 "Nostradamus" report, media analyst Johanna Koljonen predicts that virtual production will be the industry norm by 2026.

The thought of producing a fully 3D animated series without a post-production component was once just a fantasy. Today, it seems almost possible, thanks in part to the booming world of gaming. The demand for games is only increasing, which will require more investment, new tools and more creativity.

So, moviegoers, you better pay attention.

One major development is Epic's Unreal Engine 5. It brings new features that will make real-time virtual production workflows better, faster and cheaper.
And this is just one of several exciting tools to come.

As we mentioned earlier, Unreal Engine is gaining increasing momentum in the world of cinema. More and more creators use this graphic engine.

Recently, Disney invested a huge sum, one and a half billion dollars, in Epic Games. This significant investment signals Disney's belief in the potential of Unreal Engine for the film industry.

Disney's investment is expected to strengthen the relationship between the film industry and the game industry, and lead to closer cooperation between the two companies. We may see more widespread use of the Unreal Engine in Disney films, as well as collaboration in the development of new technologies.

And the best news: although this technology takes time and practice to master, accessibility is not an issue. Unlike most professional 3D software, Unreal Engine and Unity are free to download for individual users.

If you want to enter the world of computer graphics (CG) and virtual production technologies, there is no need to wait. Just download the software and start exploring!

There is no doubt: the rate of innovation is only increasing. With so much powerful technology at their fingertips, directors have every incentive to raise the standard of their productions.

And it's exciting for the future of digital storytelling, whether it's a movie, series, or game.

Ready to join the game?

The influence of games on the world of cinema is growing!

Technologies developed for games find their way into film productions. Unreal Engine is a popular game engine that enables the creation of realistic visual effects, virtual environments and the development of new technologies.

It has been used in many Hollywood movies, such as "The Matrix Resurrections", "Star Wars: The Rise of Skywalker", "Avengers: Endgame" and more.

Want to learn more about Unreal Engine 5 and integrate into the world of innovative cinema?

Join our curriculum!

In our courses you will learn:

  • Unreal Engine 5 Basics
  • Creating visual effects
  • Building virtual environments
  • Game programming and development
  • And more!

Contact Us

0585700410 - Osher Dvir


Leave details and we will get back to you