The Magic Behind the Screen: Celebrating the 96th Academy Awards Nominees for Best Visual Effects



The 96th Academy Awards nominees for Best Visual Effects are a testament to the incredible technological advancements pushing the boundaries of what’s possible in film.

Whether showcasing colossal destruction scenes, heart-pumping action sequences or interstellar adventures, each nominee demonstrates a unique contribution to visual effects, or VFX — and all of them used cutting-edge NVIDIA technologies in their workflows to bring their magic to the screen.

This year’s nominees include:

  • The Creator (20th Century Studios) — Jay Cooper, Ian Comley, Andrew Roberts and Neil Corbould
  • Godzilla Minus One (Toho) — Takashi Yamazaki, Kiyoko Shibuya, Masaki Takahashi and Tatsuji Nojima
  • Guardians of the Galaxy Vol. 3 (Marvel Studios) — Stephane Ceretti, Alexis Wajsbrot, Guy Williams and Theo Bialek
  • Napoleon (Apple Original Films/Sony Pictures) — Charley Henley, Luc-Ewen Martin-Fenouillet, Simone Coco and Neil Corbould
  • Mission: Impossible – Dead Reckoning Part One (Paramount Pictures) — Alex Wuttke, Simone Coco, Jeff Sutherland and Neil Corbould

Reinventing the Monster Movie

Godzilla Minus One presented a unique challenge: making a well-known giant monster, or kaijū, feel terrifying anew.

With a budget under $15 million, small by today’s standards, the film’s VFX team relied on rapid iterations with the director to eliminate long review cycles, along with a heavily detailed computer-generated imagery (CGI) model to bring Godzilla to life.

Godzilla was ready for its close-up: the monster’s head alone contained over 200 million polygons. The animators injected nuanced, lifelike behaviors into the creature to round out its performance.

In addition, the film’s destruction scenes used a sophisticated, memory-intensive physics engine, allowing buildings and landscapes to crumble realistically and further immersing audiences in the chaos.

A Cosmic Spectacle

Guardians of the Galaxy Vol. 3 continued the series’ tradition of blending humor with breathtaking cosmic visuals. This installment pushed the envelope with real-time rendering, enabling artists to visualize complex space environments and characters on set.

The film brought together Wētā FX, Framestore and Sony Pictures Imageworks, among others, to create a whopping 3,000-plus VFX shots. The dense, immersive 3D environments allowed for seamless integration of live-action and CGI elements and characters, resulting in a visually stunning space opera that maintained the series’ signature style while exploring new visual territory.

One of Guardians’ greatest achievements is the hallway fight scene, filmed at 120 frames per second and delivered as a single continuous shot with variable speed ramps and nonstop action.

Epic Storytelling Through Detailed VFX

The historical epic Napoleon was brought to life with meticulous attention to detail and scale. The film used various set extensions and practical effects to recreate the vast battlefields and period-specific architecture of early 19th-century Europe.

Advanced crowd simulation was used to depict the massive armies of Napoleon’s time, each soldier animated with individual behaviors to enhance the battle scenes’ realism. These touches, combined with high-resolution textures and dynamic lighting, created a visually compelling narrative grounded in reality.

Exploring AI’s Boundaries

The Creator explored the themes of AI and virtual reality, requiring VFX that could realistically depict advanced technology and digital worlds.

The film made significant use of CG animation and visual effects to create environments both futuristic and plausible. Director Gareth Edwards, also known for Rogue One and Godzilla (2014), has been widely applauded for delivering a film with the look of an expensive summer blockbuster using a fraction of the typical budget.

The portrayal of AI entities combined motion capture and procedural animation to create characters that moved and interacted with human-level complexity and fluidity. The VFX team developed custom software to simulate the intricate patterns of digital consciousness, blurring the lines between the virtual and the real.

High-Octane Action Meets Precision VFX

For Mission: Impossible – Dead Reckoning Part One, the visual effects team faced the challenge of enhancing the film’s signature action sequences without detracting from the series’ reputation for practical stunts. To achieve this, they took a hybrid approach, using CGI to seamlessly augment practical effects.

High-speed drone footage integrated with CG elements created breathtaking chase scenes, while advanced compositing techniques added layers of detail and depth to explosions and hand-to-hand combat scenes, elevating the film’s action to new heights.

NVIDIANs at the SciTech Awards

NVIDIA’s Christopher Jon Horvath, joined by Steve LaVietes and Joe Ardent, on stage to accept their award.

The Academy Awards for Scientific and Technical Achievements highlight technical contributions that have significantly affected the way movies are made, as well as the brilliant inventors behind them.

OpenUSD was honored in the science and engineering subcategory for its importance as the first open-source scene description framework that streamlines the entire production workflow. Its innovative layering system and efficient crate file format have established it as the de facto standard for 3D scene interchange, facilitating unparalleled collaboration across the industry. 
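As a rough illustration of what that layering and crate file format look like in practice, the short Python sketch below uses the open-source pxr bindings that ship with OpenUSD. It is not taken from the article, and the file names, prim path and attribute values are purely hypothetical.

    from pxr import Usd, Sdf, UsdGeom

    # A department-specific layer (e.g. layout), authored and saved independently.
    layout = Sdf.Layer.CreateNew("layout.usda")
    layout.Save()

    # The shot's root layer composes the weaker layout layer beneath it via
    # sublayering; stronger layers override weaker ones non-destructively.
    stage = Usd.Stage.CreateNew("shot.usda")
    stage.GetRootLayer().subLayerPaths.append("layout.usda")

    # Opinions authored on the stage land in the root (strongest) layer.
    ball = UsdGeom.Sphere.Define(stage, "/World/Ball")
    ball.GetRadiusAttr().Set(2.0)
    stage.GetRootLayer().Save()

    # Exporting with a .usdc extension writes the binary crate format,
    # which supports lazy, memory-mapped reads of large scenes.
    stage.GetRootLayer().Export("shot.usdc")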

The science and engineering subcategory also celebrated other remarkable technologies, including OpenVDB, an open-source library for sparse 3D volumes that has become an industry standard for visual effects simulations and renderings of water, fire, smoke and clouds.

Initially created in 2009 by Ken Museth, senior director of physics research at NVIDIA, OpenVDB has been further developed by Museth, Peter Cucka and Mihai Aldén. Learn more about the latest advancements in OpenVDB, including NanoVDB and NeuralVDB.
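To give a concrete sense of OpenVDB’s sparse volume model, here is a minimal sketch using the library’s optional pyopenvdb Python bindings. It is not drawn from the article; the grid name, voxel ranges and values are illustrative only.

    import pyopenvdb as vdb

    # A FloatGrid stores values only at "active" voxels, so empty space costs
    # almost nothing; this is what makes film-scale smoke, fire and water
    # volumes practical to simulate and render.
    density = vdb.FloatGrid()
    density.name = 'density'

    # Fill a small box of voxels with a constant density; everything outside
    # the box stays at the inactive background value.
    density.fill(min=(0, 0, 0), max=(63, 63, 63), value=1.0)
    print(density.activeVoxelCount())  # only the filled voxels are stored

    # Write the grid to a .vdb file, the interchange format read by most
    # DCC tools and renderers.
    vdb.write('smoke.vdb', grids=[density])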

In addition, the Alembic Caching and Interchange system, developed by Lucas Miller, NVIDIA’s Christopher Jon Horvath, Steve LaVietes and Joe Ardent, received recognition for its efficient algorithms in storing and retrieving baked, time-sampled data, facilitating high-efficiency caching and scene sharing across the digital production pipeline.

OpenVDB and Alembic are both interoperable with OpenUSD, enhancing their utility and integration within the industry’s production workflows.

See How Oscar-Nominated VFX Are Created at GTC

Learn more about visual effects, AI, virtual production and animation at NVIDIA GTC, a global AI conference taking place March 18-21 at the San Jose Convention Center and online. Register to hear from industry luminaries creating stunning visuals in film and TV.

Academy Award-winner Ken Museth will present a session, Open-Source Software for Visual Effects: OpenUSD and OpenVDB, on Monday, March 18, at 9 a.m. PT.

And join us for OpenUSD Day to learn how to build generative AI-enabled 3D pipelines and tools using Universal Scene Description. Browse the full list of media and entertainment sessions at GTC.

Featured image courtesy of Toho Co., Ltd.
