Unity has unveiled a string of new features and updates for 2018 as the company stretches its reach to a variety of platforms and emerging technologies.
During the Unity keynote at GDC, the company outlined its plans to “democratise” machine learning with the release of new tech available for developers to integrate into their games.
It has open-sourced its AI toolkit, Unity ML-Agents, which can be used to train ‘agents’ (NPCs) in realistic and complex scenarios.
The release is free and features imitation learning, meaning agents can learn from real people playing the game and can be trained by developers to adapt to players.
Speaking at Unity’s GDC keynote, VP of AI and machine learning Danny Lange said that instead of building an NPC through conventional code, developers can use this tech to create NPCs through imitation learning, which he claimed makes them react to the game environment in a much more organic manner. Such training occurs in real time.
“It won’t play perfectly like a robot, but imperfectly like a player,” he said.
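In its simplest form, imitation learning of this kind amounts to behavioural cloning: record what a human player observed and did, then have the NPC copy the action taken in the most similar recorded situation. The sketch below is purely illustrative and does not use Unity's actual ML-Agents API; all names and the toy observation format are hypothetical.

```python
# Illustrative behavioural-cloning sketch (hypothetical, not the ML-Agents API):
# record (observation, action) pairs from a human playthrough, then have the
# NPC pick the action taken in the most similar recorded situation.
import math

def record_demo():
    # Each pair: (distance_to_player, own_health) -> action the human took.
    return [
        ((10.0, 1.0), "chase"),
        ((2.0, 1.0), "attack"),
        ((2.0, 0.2), "flee"),
        ((15.0, 0.5), "patrol"),
    ]

def npc_policy(observation, demonstrations):
    """1-nearest-neighbour lookup over recorded human demonstrations."""
    _, action = min(
        demonstrations,
        key=lambda pair: math.dist(pair[0], observation),
    )
    return action

demos = record_demo()
print(npc_policy((1.5, 0.9), demos))  # close to the player, healthy -> "attack"
print(npc_policy((1.8, 0.1), demos))  # close to the player, low health -> "flee"
```

Because the policy mirrors recorded human behaviour rather than a hand-coded rule tree, it inherits human quirks, which is the "imperfectly like a player" quality Lange described; real systems replace the nearest-neighbour lookup with a trained neural network.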
During the keynote Unity also unveiled a new tool called IAP Promo. This will deliver personalised in-app promotions to players based on their behaviour.
This offering could take the form of an ad, a virtual item to purchase or a promotion for another game entirely.
Unity’s pipeline for the year includes three new versions: 2018.1, 2018.2 and 2018.3.
Key new features coming to 2018.1 include new artist tools and rendering pipelines.
For 2018.2 meanwhile, Unity aims to release a real-time ray tracing GPU lightmapper, a new vector graphics importer, 2D character animation tools and new asset bundle tools.
In 2018.3, Unity will introduce nested prefabs, an addition that proved popular with developers during the keynote.
As well as these new versions, Unity is introducing long-term support releases of the game engine. Starting with 2017.3, developers can access a version of the engine that receives only bug fixes, with no new features added, for 24 months.
Unity CEO John Riccitiello was bullish on the company’s forays into virtual reality, augmented reality and mixed reality technology.
He claimed that 69 per cent of projects created for the Oculus Rift, 74 per cent of those made for HTC Vive, 87 per cent for Samsung Gear VR and 91 per cent for HoloLens were made with Unity.
Continuing its support for these devices, Unity will now support development on the Magic Leap One, with access available in Unity Preview starting today.
Developers can now also build directly for Oculus Go with the same workflow as they use for Gear VR, while standalone Google Daydream compatibility now includes support for the newly added six degrees of freedom.
Unity has revamped its core rendering architecture in what director of global graphics Natalya Tatarchuk called a “major conceptual shift for the way Unity graphics have been working”.
It is introducing the new Scriptable Render Pipeline in two forms: a High Definition Render Pipeline (HD RP) and a Lightweight Render Pipeline (LW RP).
HD RP is designed for high-fidelity graphics in top-end games on PC and console, while LW RP is optimised for platforms such as XR and mobile.
You can see an example of HD RP in action in Unity’s Book of the Dead demo below. Following the demo, Unity showed the level running in real-time on PS4.