Unreal Engine 4.22 released

Sound waves can now be pre-analyzed for envelope and spectral energy to drive Blueprints during playback. This allows sound designers to create compelling audio-driven systems, while offloading the spectral analysis work to improve runtime performance. In addition, analysis data from a proxy sound wave can be substituted for a sound wave’s analysis data, allowing designers to spoof isolated sound events when trying to drive gameplay. 

We have made significant improvements to sound asset importing. In addition to our current support for multi-channel WAV files, the Unreal Audio Engine now supports importing a wide variety of sound file formats, including AIFF, FLAC, and Ogg Vorbis.

The MIDI Device Plugin now allows users to send MIDI messages to external MIDI devices in addition to processing MIDI Device input. In addition, improvements to the Plugin’s Blueprint API simplify parsing MIDI messages. These changes allow sound designers to more effectively integrate MIDI I/O into their project.

The industry-leading Sequencer linear animation toolset once again received significant updates with new tools and enhancements that benefit virtual production projects along with a host of quality of life improvements.

Take Recorder enables fast iteration when recording performances and quick review of previous takes for virtual production workflows! Building on the foundations of Sequence Recorder, we improved what you can record and how the data is handled, and made the system extensible to fit the needs of different use cases. You can easily record animations from motion capture linked to characters in the scene, as well as actual Live Link data for future playback. By recording Actors into subsequences and organizing them by Take metadata, productions of all sizes and any number of takes can easily be accommodated.

The new Composure track enables you to easily export a Sequence as layers defined in Composure. Multiple tracks can be added to export more than one layer at a time. You can drag layers straight from Composure into Sequencer to generate tracks for those layers.

You can now create layered animations using multiple weighted sections in a single track. Layered animations are supported in Transform tracks as well as several other property tracks.

You can now record incoming Live Link data onto Sequencer tracks and play it back. Live Link data that arrives with timecode and multiple samples per engine tick can be saved at a resolution higher than the Sequencer playback rate.

You can now change the Object – Static Mesh, Skeletal Mesh, Material, etc – assigned to a property on an Actor in Sequencer. Once the Actor is added to Sequencer, these Object reference properties are available like other properties in the Track menu for that Actor.

Level Sequence Actors that are set to replicate using the Replicate Playback property now synchronize their playback time between server and client.

A new animation budgeting tool enables you to set a fixed budget per platform (milliseconds of work to perform on the game thread), and it works out whether it can do all of the requested work or whether it needs to cut the work down. It works by measuring the total cost of animation updates and calculating the cost of a single work unit. If work needs to be cut to meet the budget, the tool does so based on significance and targets several areas: stop ticking and use a Master Pose Component, update at a lower rate, interpolate (or not) between updates, and so on. The goal is to dynamically adjust load to fit within a fixed game-thread budget.
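
As a rough illustration of that budgeting logic (plain C++ with hypothetical names, not the engine's actual implementation), the idea is to serve the most significant components first and flag the rest for reduced work once the budget is exhausted:

#include <algorithm>
#include <vector>

// Hypothetical per-component record: how significant it is to the player and
// how much game-thread time (ms) its full-rate animation update costs.
struct FAnimWorkItem
{
    float Significance = 1.0f;   // e.g. distance/visibility based
    float CostMs       = 0.1f;   // measured cost of one full-rate update
    bool  bReducedWork = false;  // tick less often, use master pose, skip interpolation, ...
};

// Decide which components keep full-rate updates under a fixed budget.
void ApplyAnimationBudget(std::vector<FAnimWorkItem>& Items, float BudgetMs)
{
    // Most significant components are served first.
    std::sort(Items.begin(), Items.end(),
              [](const FAnimWorkItem& A, const FAnimWorkItem& B)
              { return A.Significance > B.Significance; });

    float SpentMs = 0.0f;
    for (FAnimWorkItem& Item : Items)
    {
        if (SpentMs + Item.CostMs <= BudgetMs)
        {
            SpentMs += Item.CostMs;     // fits: keep the full-rate update
            Item.bReducedWork = false;
        }
        else
        {
            Item.bReducedWork = true;   // over budget: cut this component's work
        }
    }
}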

The new Animation Sharing Plugin reduces the overall amount of animation work required for a crowd of actors. It is based on the Master Pose Component system, adding blending and additive Animation States. Animation States are buckets for which animation instances are evaluated; the resulting poses are then transferred to all child components that are part of the bucket. See the diagram below for a breakdown of the system.

The video below shows Skeletal Mesh Components representing the individual states, plus additional instances used for blending between states. It also shows the crowd, which is made up of individual actors for which an Animation State is determined. Based on that Animation State, each actor's Skeletal Mesh Components are connected to the master components that drive their animation. In the video, only the poses for the Animation States are evaluated; the crowd actors simply copy the resulting bone transforms.
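
Conceptually, the sharing boils down to the sketch below (hypothetical types; the actual plugin is configured through the Master Pose Component system and its own assets): one master component per Animation State evaluates a pose, and every crowd member in that state simply copies the resulting bone transforms.

#include <map>
#include <vector>

// Hypothetical stand-ins for engine types.
struct FTransform3 { float Pos[3], Rot[4], Scale[3]; };
using FPose = std::vector<FTransform3>;

struct FMasterComponent            // one per Animation State (the "bucket")
{
    FPose EvaluatedPose;           // evaluated once per frame by the animation system
};

struct FCrowdMember
{
    int   AnimationState = 0;      // which bucket this actor currently belongs to
    FPose BoneTransforms;          // copied, never evaluated per actor
};

void UpdateCrowd(std::map<int, FMasterComponent>& Masters, std::vector<FCrowdMember>& Crowd)
{
    // Only the masters run animation evaluation (one per state), so the crowd
    // members just copy the pose of their state's master component.
    for (FCrowdMember& Member : Crowd)
    {
        Member.BoneTransforms = Masters[Member.AnimationState].EvaluatedPose;
    }
}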

New: Support for long filenames (Experimental)

We added support for long file paths for users on the Windows 10 Anniversary Update! Historically, paths in Windows have been limited to 260 characters, which can cause problems for projects with complex naming conventions and deep hierarchies of assets. The Windows 10 Anniversary Update adds support for much longer paths, on the condition that both the user and each application opt in. To enable long file paths in Windows 10, enable the Enable Win32 long paths policy under Computer Configuration > Administrative Templates > System > Filesystem in the Group Policy Editor (gpedit.msc). Note: Support for long paths is not universal, and third-party tools – even parts of the operating system, like Windows Explorer – may not support them correctly.

New: Blueprint Indexing Optimizations

Changes to how we index Blueprint search data have significantly improved Editor and Play-In-Editor startup times. We now defer search data updates until a Find-in-Blueprint tab is opened, perform updates asynchronously, and separate Blueprint re-indexing from the Asset loading process.

New: Improved Steamworks support

Using UE4 with Steam has never been easier! We’ve made several improvements to usability and quality of life for developers of multiplayer games on Steam.

  • Dedicated Servers on Steam can now receive custom names (up to 63 characters) with the new “-SteamServerName” launch argument.
  • Projects can now override the Steam network layer by deactivating the “bUseSteamNetworking” configuration value and setting their NetDriver configurations to the preferred underlying network layer.
  • We have greatly improved the usability of Steam NetDrivers with UE4 Beacons in addition to standard game networking.
  • You can now set certain required Steam values, such as dedicated server names, or the application ID, in your project’s Target.cs file. Making changes to these values will no longer require recompiling the engine.

New: Preview Scene Settings Improvements

We added functionality to Preview Scene Settings that enables you to hide the Environment Cubemap (Show Environment) without disabling lighting. See the Use Sky Lighting property in the Preview Scene Settings panel.

New: Skeletal Mesh LOD Reduction

Use the Skeletal Mesh Reduction Tool to generate versions of a Skeletal Mesh with reduced complexity for use as levels of detail (LODs) all within Unreal Editor! You no longer need to rely on external Digital Content Creation (DCC) programs or third party tools that can be very time consuming and error prone. Create accurate, high-quality levels of detail and see the results immediately in the viewport.

For additional information, see Skeletal Mesh Reduction Tool.

New: Per Platform Properties Improvements

Per Platform Properties have been extended to allow for setting values based on Target Platform in addition to the Platform Groups.

New: Gauntlet Automation Framework Improvements

The Gauntlet automation framework received several improvements focusing on usability, documentation, and examples to learn from.

Expanded documentation & samples

  • Documentation about Gauntlet architecture and getting started
  • Additional ActionRPG and Profile Guided Optimization examples
  • Example of tracking editor load and PIE times

iOS Support

Gauntlet now supports installing and running IPA files on iOS (requires Mac host). This takes our device support to PC, Mac, PS4, XB1, Switch, Android, and iOS.

Profile Guided Optimization

We added an example script that automates Profile Guided Optimization (PGO) file generation on PS4, Xbox One, and Switch for your project.

Report Creation

Added HTML and Markdown builders for creating custom reports as part of automation scripts.

New: Visual Studio 2019 Support

Support for Visual Studio 2019 has been added. To use Visual Studio 2019 by default, select “Visual Studio 2019” as the IDE in the editor’s source code settings (Editor Preferences > General > Source Code).

We’ve also added support for switching to newer C++ standard versions. To change the version of the C++ standard that your project supports, set the CppStandard property to one of the following values from your .target.cs file.

 

  • C++14: CppStandardVersion.Cpp14
  • C++17: CppStandardVersion.Cpp17
  • Latest: CppStandardVersion.Latest

At the same time, we’ve deprecated support for Visual Studio 2015. If you want to force your project to compile with the Visual Studio 2015 compiler, you can set WindowsPlatform.Compiler = WindowsCompiler.VisualStudio2015 from your project’s .target.cs file. Note that the version of the engine downloaded from the Epic Games Launcher does not support Visual Studio 2015, and we no longer test it internally. 

New: Subsystems

Subsystems are automatically instanced classes with managed lifetimes which provide easy to use extension points without the complexity of modifying or overriding engine classes, while simultaneously getting Blueprint and Python exposure out of the box.

Currently Supported Subsystem Lifetimes

Engine

class UMyEngineSubsystem : public UEngineSubsystem { ... };

When the Engine Subsystem’s module loads, the subsystem will Initialize() after the module’s Startup() function has returned. The subsystem will Deinitialize() after the module’s Shutdown() function has returned.
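
A minimal sketch of what such a subsystem body might contain, hooking the lifecycle just described (the class body shown here is illustrative, not required boilerplate):

#include "CoreMinimal.h"
#include "Subsystems/EngineSubsystem.h"
#include "MyEngineSubsystem.generated.h"

UCLASS()
class UMyEngineSubsystem : public UEngineSubsystem
{
    GENERATED_BODY()

public:
    // Called when the subsystem is created, once its owning module is up.
    virtual void Initialize(FSubsystemCollectionBase& Collection) override
    {
        Super::Initialize(Collection);
        // Acquire resources, register delegates, etc.
    }

    // Called when the subsystem is torn down.
    virtual void Deinitialize() override
    {
        // Release anything acquired in Initialize().
        Super::Deinitialize();
    }
};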

These subsystems are accessed through GEngine:

UMyEngineSubsystem* MySubsystem = GEngine->GetEngineSubsystem<UMyEngineSubsystem>();

Editor

class UMyEditorSubsystem : public UEditorSubsystem { ... };

When the Editor Subsystem’s module loads, the subsystem will Initialize() after the module’s Startup() function has returned. The subsystem will Deinitialize() after the module’s Shutdown() function has returned.

These subsystems are accessed through GEditor:

UMyEditorSubsystem* MySubsystem = GEditor->GetEditorSubsystem<UMyEditorSubsystem>();

Note: These Editor-only subsystems are not accessible to regular Blueprints; they are only accessible to Editor Utility Widgets and Blutility Classes.

GameInstance

class UMyGameSubsystem : public UGameInstanceSubsystem { ... };

This can be accessed through UGameInstance:

UGameInstance* GameInstance = ...;
UMyGameSubsystem* MySubsystem = GameInstance->GetSubsystem<UMyGameSubsystem>();

LocalPlayer

class UMyPlayerSubsystem : public ULocalPlayerSubsystem { ... };

This can be accessed through ULocalPlayer:

ULocalPlayer* LocalPlayer = ...;
UMyPlayerSubsystem* MySubsystem = LocalPlayer->GetSubsystem<UMyPlayerSubsystem>();

Accessing Subsystems from Blueprints
Subsystems are automatically exposed to Blueprints, with smart nodes that understand context and don’t require casting.

You’re in control of what API is available to Blueprints with the standard UFUNCTION() markup and rules.
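
For example, a hypothetical GameInstance subsystem exposing a small score API to Blueprints might look like this (class, function, and property names are illustrative):

#include "CoreMinimal.h"
#include "Subsystems/GameInstanceSubsystem.h"
#include "MyScoreSubsystem.generated.h"

UCLASS()
class UMyScoreSubsystem : public UGameInstanceSubsystem
{
    GENERATED_BODY()

public:
    // BlueprintCallable functions appear on the subsystem's context-aware node.
    UFUNCTION(BlueprintCallable, Category = "Score")
    void AddScore(int32 Amount) { TotalScore += Amount; }

    UFUNCTION(BlueprintPure, Category = "Score")
    int32 GetScore() const { return TotalScore; }

private:
    // Plain members stay internal unless you choose to expose them.
    int32 TotalScore = 0;
};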

Subsystems from Python
If you are using Python to script the editor, you can use built-in accessors to get at subsystems:

my_engine_subsystem = unreal.get_engine_subsystem(unreal.MyEngineSubsystem)
my_editor_subsystem = unreal.get_editor_subsystem(unreal.MyEditorSubsystem)

Note: Python is currently an experimental feature.

New: Editor Utility Widgets

Editor Utility Widgets enable you to extend the functionality of Unreal Editor with new user interfaces created entirely using the UMG UI Editor and Blueprint Visual Scripting logic! These are Editor-only UI panels that can be selected from the Windows menu like other Unreal Editor panels.

To create an Editor Utility Widget, right-click in the Content Browser and select Editor Utilities > Editor Widget.

To edit the Blueprint, double-click on the Editor Widget Asset. Once you’ve edited the Blueprint for your Editor Widget Asset, right-click the Editor Widget and select Run Editor Utility Widget to open the UI in a tab. The tab is only dockable with Level Editor tabs. It appears in the Level Editor’s Windows dropdown, under the Editor Utility Widgets category. This is an experimental feature. 

New: Material Analyzer

The Material Analyzer enables you to get a holistic view of parameter usage in Materials and Material Instances so you can quickly find opportunities to consolidate and optimize your Material Instances to minimize switching render state and save memory. The Material Analyzer can be found under Window > Developer Tools.

Materials are listed in a tree, along with suggestions that show groups of Material Instances sharing the same set of static overrides, so you can make optimizations. You can also place all the related instances into a local collection, so you can easily find and update them.

New: Select Child and Descendant Actors

You can now extend your selection to all the immediate children or all the descendants of your selected Actor using the context menu in the World Outliner and the Level Viewport, making it easier to work with large, complex scene hierarchies.

New: Scaled Camera Zoom and Pan

When you have one or more objects selected in the Level Viewport, the sensitivity of camera zoom and pan operations now scales automatically with the distance between the objects and the camera. This makes your camera movements feel more natural, especially when you’re working with objects at extreme sizes, such as tiny mechanical parts or large landscapes.

You can return to the previous behavior by disabling the new Use distance-scaled camera speed setting in the Level Editor > Viewports section of the Editor Preferences window. 

New: Orbit Around Selection

You can now make the camera orbit around the pivot of the selected objects – as opposed to orbiting around the center of the screen – when one or more objects are selected in the Level Viewport.

To activate this mode, enable the new Orbit camera around selection setting in the Level Editor > Viewports section of the Editor Preferences window. 

New: Toggle Multiple Layer Visibility

You can now toggle the visibility of multiple Layers at the same time. Hold CTRL and click each Layer to build your selection. Then click the eye icon next to any of those selected Layers to toggle visibility of all selected Layers.

New: Multi-User Editing (Early Access)

Multiple level designers and artists can now connect multiple instances of Unreal Editor together to work collaboratively in a shared editing session, building the same virtual world together in real time.

  • A dedicated server keeps track of all the modifications made by all users, and synchronizes the state of the Editor between all connected computers.
  • When you make changes in your Levels and Sequences on one computer, the changes are automatically mirrored, live, to all other computers that are part of the same session.
  • When you make changes to other types of Assets, like Materials, the changes are replicated to all other computers when you save those Assets.
  • Before leaving an editing session, each user can choose whether they want to apply the changes made during that session to their local copy of the Project.

New: Preview Rendering Level Improvements

The Mobile Previewer workflow has been improved when working with different devices’ and platforms’ shading models: the same shading model is now used consistently across all Editor viewports, and you can instantly switch between the default Shader Model 5 (SM5) and a selected Preview Rendering Level.

Start by selecting a Preview Rendering Level from the main toolbar Settings drop-down to compile shaders for a platform. Once compiled, use the added Preview Mode button in the main toolbar to toggle the view mode.

For additional details, see Mobile Previewer.  

New: Dynamic Spotlight Support on Mobile

We now support non-shadow casting Dynamic Spot Lights on high-end mobile devices. 

You can enable dynamic Spot Lights from the Project Settings > Rendering > Mobile Shader Permutations by setting Support Movable Spotlights to true. 

For additional information, see Lighting for Mobile Platforms.

New: SaveGame System iCloud Support

UE4 now supports saving games to iCloud using the ISaveGameSystem interface, on both iOS and tvOS. You can enable saving games to iCloud by going to Project Settings > Platforms > iOS > Online and enabling the Enable Cloud Kit Support option. Then from the iCloud save files sync strategy option, you can select the sync strategy that works best for your project. The currently available iCloud sync options are as follows:

  • Never (Do not use iCloud for Load/Save Game)
  • At game start only (iOS)
  • Always (Whenever LoadGame is called)
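
Game code does not need to change for this. Saves still go through the standard SaveGame flow, sketched below with a hypothetical USaveGame subclass and slot name; once Enable Cloud Kit Support is on, the platform’s ISaveGameSystem handles the iCloud synchronization according to the strategy selected above.

#include "CoreMinimal.h"
#include "GameFramework/SaveGame.h"
#include "Kismet/GameplayStatics.h"
#include "MySaveGame.generated.h"

// Hypothetical save object holding whatever your game needs to persist.
UCLASS()
class UMySaveGame : public USaveGame
{
    GENERATED_BODY()

public:
    UPROPERTY()
    int32 PlayerLevel = 0;
};

void SaveAndReloadExample()
{
    // Create and fill a save object, then write it to a named slot.
    if (UMySaveGame* Save = Cast<UMySaveGame>(
            UGameplayStatics::CreateSaveGameObject(UMySaveGame::StaticClass())))
    {
        Save->PlayerLevel = 7;
        UGameplayStatics::SaveGameToSlot(Save, TEXT("MainSlot"), /*UserIndex=*/0);
    }

    // Later, load it back; with CloudKit enabled the data may be synced via iCloud.
    if (UMySaveGame* Loaded = Cast<UMySaveGame>(
            UGameplayStatics::LoadGameFromSlot(TEXT("MainSlot"), /*UserIndex=*/0)))
    {
        // Use Loaded->PlayerLevel ...
    }
}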

New: Device Output Window Improvements

Major improvements have been made to the Device Output Log window, bringing it out of the Experimental state. You can use the Device Output Log window to send console commands to iOS devices from the PC. To access the Device Output Log, from the main menu click Window > Developer Tools > Device Output Log.

New: HTML5 Platform Improvements (Experimental)

We have added experimental multithreading support for HTML5 projects. Please note that you need access to the Unreal Engine 4 source code to enable this functionality.

Some browsers will need special flags enabled in order to run in multithreaded mode. See https://github.com/emscripten-core/emscripten/wiki/Pthreads-with-WebAssembly for more information.

  • For Chrome: run chrome with the following flags: 

--js-flags=--experimental-wasm-threads --enable-features=WebAssembly,SharedArrayBuffer

These can alternatively be enabled/disabled in chrome://flags/#enable-webassembly-threads as “WebAssembly threads support”
 

  • In Firefox nightly, SharedArrayBuffer can be enabled in about:config by setting the javascript.options.shared_memory preference to true.

New: iOS Preferred Orientation

You can now set the preferred orientation to be used as the initial orientation at launch for iOS devices when both Landscape Left and Landscape Right orientations are supported. 

New: Niagara Vector Field Data Interface

The Vector Field Data Interface now works the same for both CPU and GPU particles! You can use the Sample Vector Field module to sample vector fields. It exposes three primary inputs:

  • VectorField: This is the Vector Field Data Interface instance, containing the static vector field object itself, and per-axis tiling flags.
  • SamplePoint: This is the point where the vector field is sampled. This defaults to Particles.Position, but this can be customized.
  • Intensity: This scales the sampled vector.

There are also multiple optional inputs:

  • ApplyFalloff: Check this to apply a falloff function to the sampled vector, so the influence of the vector field approaches zero toward the edges of the vector field’s bounding box (a sketch of this falloff appears just after this list).
  • UseExponentialFalloff: Check this to make the falloff function be exponential instead of linear.
  • FalloffDistance: When applying a falloff function, this parameter determines how far from the bounding box edges the falloff applies.
  • FieldCoordinates: This makes it possible to override the Emitter’s Localspace parameter. It has three options:
    • Simulation: Uses the Emitter.Localspace parameter.
    • World: This overrides the position and transform of the vector field so that it is always relative to the world origin, regardless of the Emitter.Localspace parameter.
    • Local: This overrides the position and transform of the vector field so that it is always relative to the System itself, regardless of the Emitter.Localspace parameter.
  • FieldTranslate: This offsets the vector field relative to the origin as defined by FieldCoordinates.
  • FieldRotate: This reorients the vector field relative to the origin as defined by FieldCoordinates.
  • FieldScale: This rescales the vector field.
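
The release notes do not spell out the exact falloff curve, but conceptually it behaves like the sketch below (illustrative only; it assumes the input is the sample point’s distance to the nearest face of the field’s bounding box): within FalloffDistance of the edge, the field’s influence ramps down to zero, either linearly or exponentially.

#include <cmath>

// Illustrative falloff. DistanceToEdge is the sample point's distance to the
// nearest face of the vector field's bounding box (0 exactly at the edge).
float FieldFalloff(float DistanceToEdge, float FalloffDistance, bool bExponential)
{
    // Normalize to 0..1: 0 at the edge, 1 once we are FalloffDistance inside.
    float T = DistanceToEdge / FalloffDistance;
    T = T < 0.0f ? 0.0f : (T > 1.0f ? 1.0f : T);

    // Linear ramp, or an exponential curve that still reaches 0 at the edge.
    return bExponential ? (std::exp(T) - 1.0f) / (std::exp(1.0f) - 1.0f) : T;
}

// The sampled vector would then be scaled by Intensity * FieldFalloff(...).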

The SampleVectorField module provides a lot of utility functionality, and therefore it might include some overhead. You can use specialized vector field sampling by placing the SampleField node on a Vector Field Data Interface object in the script editor.

Note: The input expected here will be relative to the volume of the vector field itself, as no transformations are applied for you. 

An example System for easily visualizing and using a vector field, called VectorFieldVisualizationSystem, is included.

New: Niagara Curl Noise Data Interface

The Curl Noise Data Interface now generates procedural curl noise based on an underlying simplex noise function and the results are identical for both CPU and GPU emitters. It is recommended to use the SampleCurlNoiseField module to generate curl noise for your particles. This module has two primary inputs exposed:

  • Strength: This scales the output vector generated by the module.
  • Length Scale: This describes the approximate size of the vortices generated by the curl noise.

And three optional inputs:

  • Offset: This is used to pan the noise field.
  • Noise Field: This is the Data Interface object itself, primarily used for adjusting seeds.
  • Sample Point: This specifies where to sample from. Defaults to Particles.Position, but other values can also be used.

Note: The curl noise field does not inherently tile, and does not suddenly end due to its procedural nature. To get a tiling curl noise field, consider using the Vector Field Data Interface instead, with a periodic volume texture curl noise as a vector field. 
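
For readers unfamiliar with the technique, curl noise takes the curl of a noise-based vector potential, which yields a divergence-free, swirling velocity field. The sketch below is purely illustrative (a simple trigonometric field stands in for the real simplex noise, and this is not Niagara’s implementation):

#include <cmath>

struct FVec3 { float X, Y, Z; };

// Stand-in for the underlying simplex noise: any smooth scalar field will do
// for illustration. Each Seed gives an independent component of the potential.
static float Noise(FVec3 P, int Seed)
{
    return std::sin(P.X * 1.7f + Seed * 12.9f) *
           std::cos(P.Y * 2.3f - Seed * 4.1f) *
           std::sin(P.Z * 1.1f + Seed * 7.7f);
}

// Curl of the noise-based potential (Noise(...,0), Noise(...,1), Noise(...,2)),
// estimated with central differences. The result is divergence-free, which is
// what gives curl noise its vortex-like look.
FVec3 CurlNoise(FVec3 P, float LengthScale)
{
    const float S = 1.0f / LengthScale;                 // larger scale -> larger vortices
    const float E = 0.01f;                              // finite-difference step
    const FVec3 Q = { P.X * S, P.Y * S, P.Z * S };

    auto dN = [&](int Seed, float dx, float dy, float dz) {
        const FVec3 A = { Q.X + dx, Q.Y + dy, Q.Z + dz };
        const FVec3 B = { Q.X - dx, Q.Y - dy, Q.Z - dz };
        return (Noise(A, Seed) - Noise(B, Seed)) / (2.0f * E);
    };

    return { dN(2, 0, E, 0) - dN(1, 0, 0, E),           // dPz/dy - dPy/dz
             dN(0, 0, 0, E) - dN(2, E, 0, 0),           // dPx/dz - dPz/dx
             dN(1, E, 0, 0) - dN(0, 0, E, 0) };         // dPy/dx - dPx/dy
}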

New: Deterministic Random Number Generation in Niagara

We added support for deterministic random number generation for both CPU and GPU Niagara emitters. The behavior of the random number generator can be controlled globally from the Emitter Properties module, with the following options:

  • Determinism: A flag to toggle between deterministic or non-deterministic random numbers for the entire emitter.
  • Random Seed: A global seed used by the deterministic random number generator.

The RandomRange Niagara function is the recommended way to generate random numbers inside scripts. It now accepts the following:

  • Min: This defines the lower bound of the random numbers generated. It can be any integer or float type.
  • Max: This defines the upper bound of the random numbers generated. It can be any integer or float type.
  • RandomnessMode: This is an enum controlling the determinism mode of the random number generator, and it can be:
    • Simulation Defaults: This is the default behavior; it inherits the value of Emitter.Determinism.
    • Deterministic: Uses the deterministic random number generator.
    • Non-deterministic: Uses the non-deterministic random number generator.
  • OverrideSeed: This determines whether or not to override the seed specified by Emitter.GlobalSeed.
  • Seed: This value is used to override Emitter.GlobalSeed if OverrideSeed is enabled.

The last three, RandomnessMode, OverrideSeed, and Seed are initially hidden, but they can be revealed by clicking the arrow at the bottom of the node.

An alternative way to generate deterministic random numbers is to explicitly pass seeds to the Seeded Random node; for example, you could build a specialized version of the Random Range functionality on top of it.

The deterministic random numbers are slightly more expensive than the non-deterministic counterparts: somewhere between 1.5 and 2.0 times, depending on the type generated. In general, if you need more than one random number in a script, we recommend that you generate them with a single call, and split the result if you need to handle them separately.
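
To illustrate what deterministic means here (this is only a conceptual sketch using a generic integer hash, not Niagara’s actual generator): the result depends solely on the seeds fed in, so the same particle with the same seed produces the same value on every run, whether the emitter runs on the CPU or the GPU.

#include <cstdint>

// Simple integer hash (Wang hash), used purely for illustration.
static uint32_t HashSeed(uint32_t Seed)
{
    Seed = (Seed ^ 61u) ^ (Seed >> 16);
    Seed *= 9u;
    Seed = Seed ^ (Seed >> 4);
    Seed *= 0x27d4eb2du;
    Seed = Seed ^ (Seed >> 15);
    return Seed;
}

// Deterministic "random range": identical inputs always give identical output,
// which is what makes repeated simulation runs reproducible.
float DeterministicRandomRange(float Min, float Max,
                               uint32_t EmitterSeed, uint32_t ParticleId, uint32_t CallIndex)
{
    const uint32_t H = HashSeed(EmitterSeed ^ HashSeed(ParticleId) ^ HashSeed(CallIndex * 747796405u));
    const float T = (H & 0x00FFFFFFu) / 16777215.0f;   // map to [0, 1]
    return Min + T * (Max - Min);
}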

New: Additional Inputs for Niagara Math Operations

Many of the script math operations now support an arbitrary number of input pins which can be added by clicking the Add (+) button or by connecting to the pin next to the Add button.

New: Support for Deprecating Niagara Scripts

Scripts for modules, dynamic inputs, and functions can now be marked as deprecated in the script editor.  Emitters and systems using deprecated scripts will now display errors in the UI, and deprecated scripts will not show up in the menus used to add them.

New: Niagara Initialization Modules

New modules have been added which expose the most common attributes used when initializing particles.

New: Select by Simulation Target Node for Niagara

The new Select by Simulation Target node enables you to execute different logic depending on whether an emitter is running in the CPU vector machine or in a GPU compute script. In general, most scripts should run identically on both simulation targets. However, this is not always possible, especially when making data interface calls. In cases where exact parity isn’t available, this new node gives the module author more tools to build consistent behavior. For an example of how this is used, see the new collision response module. 

New: Collision System for Niagara

Niagara collisions have been completely rewritten to support ray-trace-based CPU collisions, CPU+GPU analytical plane collisions, GPU scene depth, and distance field collisions. 

Additional features include:

  • Stability has been vastly improved across the board, in comparison to previous Niagara and Cascade implementations.
  • CPU collisions support the incorporation of the scene’s physical material characteristics, such as restitution and friction coefficients, and offers several integration schemes.
  • The system has been written as a single module to improve usability.
  • Collisions now work in combination with all renderers.
  • A configurable “rest” state allows particles to remain stable in particularly challenging situations.
  • The equations are physically based/inspired, and work with mass and other system properties.
  • A number of advanced options have been exposed, including static, sliding and rolling friction.
  • Collision radii are automatically calculated for sprites and meshes. Optionally, you can specify this parameter directly.
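
For reference, the physically based part of a plane-collision response splits velocity into normal and tangential components, bounces the normal part by the restitution coefficient, and damps the tangential part by friction. The sketch below is illustrative math only, not the module’s actual implementation:

// Minimal particle-vs-plane response: reflect the normal velocity component
// scaled by restitution, damp the tangential component by friction.
struct FVec3
{
    float X, Y, Z;
    FVec3 operator+(FVec3 B) const { return { X + B.X, Y + B.Y, Z + B.Z }; }
    FVec3 operator-(FVec3 B) const { return { X - B.X, Y - B.Y, Z - B.Z }; }
    FVec3 operator*(float S) const { return { X * S, Y * S, Z * S }; }
};

static float Dot(FVec3 A, FVec3 B) { return A.X * B.X + A.Y * B.Y + A.Z * B.Z; }

// PlaneNormal must be normalized. Restitution and Friction are in [0, 1].
FVec3 CollisionResponse(FVec3 Velocity, FVec3 PlaneNormal, float Restitution, float Friction)
{
    const float VDotN = Dot(Velocity, PlaneNormal);
    if (VDotN >= 0.0f)
    {
        return Velocity;                            // already moving away from the plane
    }

    const FVec3 Normal     = PlaneNormal * VDotN;   // velocity into the plane
    const FVec3 Tangential = Velocity - Normal;     // velocity along the plane

    // Bounce the normal part, slow the tangential part.
    return Normal * -Restitution + Tangential * (1.0f - Friction);
}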

New: Platform SDK Upgrades

In every release, we update the Engine to support the latest SDK releases from platform partners.

  • IDE Version the Build farm compiles against
    • Visual Studio – Visual Studio 2017 v15.6.3 toolchain (14.13.26128) and Windows 10 SDK (10.0.16299.0)
      • Minimum Supported versions
        • Visual Studio 2017 v15.6
      • Requires the .NET 4.6.2 Targeting Pack
    • Xcode – Xcode 10.1
  • Android
    • Android NDK r14b (New CodeWorks for Android 1r7u1 installer will replace previous CodeWorks on Windows and Mac; Linux will use 1r6u1 plus modifications)
    • Note: Permission requests are now required on a per-feature basis. (for example: RECORD_AUDIO, CAMERA). For more information, see Android Upgrade Notes.
  • HTML5
    • Emscripten 1.37.19
  • Linux “SDK” (cross-toolchain)
  • Lumin
  • Steam
  • SteamVR
  • Oculus Runtime
  • Switch
    • SDK 7.3.0 + optional NEX 4.4.2 (Firmware 7.x.x-x.x)
    • SDK 6.4.0 + optional NEX 4.6.2 (Firmware 6.x.x-x.x)
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • PS4
    • 6.008.061
    • Firmware Version 6.008.021
    • Supported IDE: Visual Studio 2017, Visual Studio 2015
  • XboxOne
    • XDK: July 2018 QFE-4
    • Firmware Version: December 2018 (version 10.0.17763.3066)
    • Supported IDE: Visual Studio 2017
  • macOS
  • iOS
  • tvOS

https://www.unrealengine.com/blog/unreal-engine-4-22-released