JAKARTA - Epic Games recently released Unreal Engine 5.1, an update to Unreal Engine 5, the engine widely expected to help build the metaverse.

The update aims to make 3D content creation in Unreal Engine easier and faster.

The company has added many new and refined features, making the engine more powerful, efficient, and versatile for content creators across all industries.

What features does Epic Games bring to Unreal Engine 5.1? Check out the explanation below, reported on Monday, November 28.

Games

Epic Games says that more than half of all announced next-gen games are being built with Unreal Engine. Developers can now take advantage of updates to Lumen, the dynamic global illumination and reflection system; Nanite, the virtualized micropolygon geometry system; and Virtual Shadow Maps (VSM). Together, these lay the foundation for games and experiences that run at 60 frames per second (fps) on capable next-gen consoles and PCs.

The enhancements will enable fast-paced competitive gameplay and detailed simulations without latency. Additionally, Nanite now includes a programmable rasterizer that enables material-driven animation and deformation via World Position Offset, as well as opacity masks.

This paves the way for artists to use Nanite to program the behavior of specific objects, such as Nanite-based foliage.
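The idea behind World Position Offset is that vertices are displaced per frame by the material, not by editing the mesh data. A minimal, engine-agnostic sketch of the concept in plain Python (this is not Unreal's material system; the sine "wind" formula and parameter names are illustrative assumptions):

```python
import math

def world_position_offset(position, time, amplitude=0.1, frequency=2.0):
    """Return a per-vertex offset, e.g. a simple sine 'wind' for foliage.

    position: (x, y, z) world-space vertex position
    time: current time in seconds
    """
    x, y, z = position
    # Phase varies with world position so neighboring plants sway out of sync.
    phase = 0.05 * (x + y)
    sway = amplitude * math.sin(frequency * time + phase)
    # Weight by height so the roots (z near 0) stay planted.
    weight = max(z, 0.0)
    return (sway * weight, sway * weight * 0.5, 0.0)

def apply_wpo(vertices, time):
    """Displace every vertex by its offset, as a rasterizer would per frame."""
    return [
        (x + dx, y + dy, z + dz)
        for (x, y, z) in vertices
        for (dx, dy, dz) in [world_position_offset((x, y, z), time)]
    ]
```

In the engine this runs on the GPU inside the material graph; the sketch only shows why the deformation can be driven entirely by the material without touching the source geometry.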

Unreal Engine 5.1 also adds several features to improve efficiency for developers of games and other large-scale interactive projects, helping teams be more productive. For example, virtual assets separate metadata from object data, allowing developers to sync only what they need from source control systems such as Perforce, resulting in smaller workspaces and faster syncs for developers who don't need access to the full object data.

Pipeline State Object (PSO) caching for DX12 simplifies the process of preparing a game to ship on DX12. And finally, on-demand shader compilation compiles only the shaders needed to render what is visible on screen while working in the Unreal Editor, which can save time and increase interactivity.

For developers building massive open worlds, Unreal Engine 5.1 also brings additional functionality and an improved workflow.

World Partition now supports Large World Coordinates, enabling the creation of vast open worlds without loss of precision. World Partition also gains an accelerated source-control workflow, with tools for managing, filtering, searching, and viewing files and changelists.
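The precision problem that Large World Coordinates solves is easy to demonstrate: 32-bit floats lose millimeter accuracy far from the world origin, while doubles keep it. A quick stdlib-only illustration (the 100 km distance and 1 mm step are example figures, not Epic's numbers):

```python
import struct

def to_float32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

# An object 100 km from the origin, moved by 1 mm (units in meters).
position = 100_000.0
step = 0.001

moved64 = position + step                                      # double precision
moved32 = to_float32(to_float32(position) + to_float32(step))  # single precision

print(moved64 - position)  # ~0.001 -- the millimeter survives
print(moved32 - position)  # 0.0   -- the millimeter is lost to rounding
```

At 100 km, the gap between adjacent 32-bit floats is nearly 8 mm, so sub-millimeter movement simply vanishes; storing world coordinates in doubles avoids this.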

It's also now easier to find content in the world from within a changelist, and vice versa. Additionally, new HLOD (Hierarchical Level of Detail) support for water rendering and streaming allows users to create large bodies of water in an open world with better performance and a smaller memory footprint.

In-Camera Visual Effects

Unreal Engine is now used in more than 425 film and TV productions, and is integrated into more than 300 virtual production stages worldwide. With the enhancements in Unreal Engine 5.1 designed specifically for virtual production workflows, engineers and artists gain many benefits, including the In-Camera VFX Editor, an improved Light Card system, an improved Remote Control API, expanded color correction tools, initial support for Lumen in nDisplay, and more.

First, LED stage operators can now take advantage of a new dedicated In-Camera VFX (ICVFX) Editor that supports a variety of virtual production workflows. This largely eliminates the need for stage operators to hunt through the Outliner for specific objects and controls. Unreal Engine 5.1 also adds UI, UX, and performance improvements to the Remote Control API, letting users create powerful custom browser-based remote controls more quickly and easily.

The ICVFX Editor also hosts an interface to an enhanced Light Card system that is previewed on the nDisplay wall. As well as making it intuitive and efficient to create, move, and edit light cards and save templates, the new light cards maintain their shape on the wall, eliminating distortion.

New to the ICVFX Editor are Color Correction Windows (CCWs), which allow color adjustments to be applied exclusively to anything behind them (similar to Power Windows in color grading applications), along with the ability to apply color correction per actor, which reduces the need for complex masking.

In Unreal Engine 5.1, the new Media Plate Actor supports OpenEXR, allowing users to drag and drop footage from the Content Browser. Additionally, users can now play back mipmapped and compressed EXRs both locally and with nDisplay, given an appropriate SSD RAID, and can convert EXRs to the correct format for optimal playback.

Furthermore, the Unreal Engine virtual camera system has been overhauled with a new foundational framework that uses Epic's Pixel Streaming technology to improve responsiveness and reliability, and an updated UI with a modern, camera-focused design that will be more familiar to camera operators.

Users also now have the ability to connect hardware devices, and will be able to customize the UI in the future.

Lumen, Unreal Engine's fully dynamic global illumination and reflection system, now offers initial support for nDisplay in 5.1, provided the number of lights is modest (around five to seven in total, depending on the graphics card).

With Lumen, indirect lighting adapts on the fly to changes in the sun angle, lighting, or the position of bounce cards.

Previously, such changes required a baking step that could halt production and interrupt the creative flow. UE 5.1 also adds improvements to GPU Lightmass, including support for Sky Atmosphere, Stationary Sky Lights, and light features such as IES profiles and Rect Light textures, as well as overall quality and performance improvements.

Animations

The use of Unreal Engine in animation has grown exponentially, from 15 productions between 2015 and 2019 to more than 160 productions from 2020 to 2022.

For professionals working with animated content, especially characters, Unreal Engine 5.1 offers several important advances to the built-in animation engine and rigging tools, as well as the Sequencer.

Now in Beta, the Machine Learning (ML) Deformer generates high-fidelity approximations of nonlinear deformers, complex proprietary rigs, or any arbitrary deformation, using a custom Maya plugin to train a machine learning model that in turn runs in real time in Unreal Engine.

This allows users to simulate film-quality deformations, such as flexing muscles, bulging veins, and shifting skin. Other character deformation improvements include an enhanced Deformer Graph Editor for easier graph creation and editing.

In addition, Control Rig continues to progress toward fully procedural rigging, increasing the impact and scalability of rigging teams. Updates to the core framework include new construction events that let users create rig hierarchies via graphs, and custom user events to create and trigger rig events such as "Snap FK to IK".

With this update, artists can create a single control rig asset that builds itself to suit characters with different proportions and skeletal traits. For example, the same control rig can adapt itself to a three-toed monster or a five-toed human without any changes to the rig asset.

Unreal Engine 5.1 also adds support for constraints in Sequencer, the engine's nonlinear multi-track animation editor, including position, rotation, and look-at constraints. Users can take advantage of these on the fly to animate relationships between control rigs or any actors.

For example, users can have the camera always follow a character, keep a character's hands on a steering wheel, animate a clown juggling balls, or constrain a cowboy's hips so that he sits naturally in the saddle as the horse moves, with his hands on the reins.
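At its core, a look-at constraint of the "camera always follows the character" kind just re-aims one transform at another every frame. A hypothetical, engine-agnostic sketch in plain Python (the function and variable names are illustrative, not Sequencer's API):

```python
import math

def look_at_yaw_pitch(camera_pos, target_pos):
    """Return the yaw and pitch (in degrees) that aim a camera at a target,
    the core computation a look-at constraint evaluates each frame."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

# Each frame, the constraint re-aims the camera at the character's position,
# so the animator never has to keyframe the camera rotation by hand.
character_path = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 10.0, 0.0)]
camera_pos = (0.0, -20.0, 5.0)
for frame, char_pos in enumerate(character_path):
    yaw, pitch = look_at_yaw_pitch(camera_pos, char_pos)
    print(f"frame {frame}: yaw={yaw:.1f}, pitch={pitch:.1f}")
```

The same per-frame evaluation idea underlies position and rotation constraints; only the quantity being copied or solved differs.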

Sequencer also gains additional functionality exposed via Blueprints and Python scripting, a refactored UI/UX for improved stability and extensibility, and improved workflows for creating and editing animations.
