Interview

Unity's Marc Whitten: "We believe AI won’t replace actual human effort - it will enhance it"

Unity's president of Create talks Unity 6, AI in game creation and the road ahead for the platform


At Unity’s Unite event in Amsterdam, the company is rolling out Unity 6 and outlining the capabilities of its Sentis and Muse AI-powered features, alongside the introduction of Unity Cloud and its support for Apple’s visionOS.

Ahead of the event we caught up with Marc Whitten, president of Unity Create, to find out more about what’s new and what the road ahead looks like for Unity.

Tell us about Unity Muse - what’s inside that package and what does each feature allow devs to do?

Marc Whitten: Launching in early access at Unite, Unity Muse is a suite of AI-powered features to simplify content creation. These features include Chat, which lets you use natural language prompts to find exactly what you’re looking for. In seconds, you’ll have well-structured answers and instructions, and even project-ready code. Using the Sprite capability, you can produce and modify 2D art directly in the Editor. And with Texture, you can generate production-quality textures for 2D and 3D projects, in any style, all inside your project in the Editor.

Sprite and Texture are powered by a custom-built deep learning model trained entirely on data and images that Unity owns or has licensed. This ensures the datasets do not contain any people, logos, or recognizable artistic styles. You can learn more about our responsible AI and enhanced model training here.

At Unite, we’re also offering a glimpse of the next set of Muse capabilities, currently in pre-release and fully supported later in 2024. They are Behavior, which lets you instantly set up character interactions: simply describe the desired actions, and Muse will create behavior trees in the Editor. For both aspiring and skilled animators, Animate lets you bring humanoid characters to life with just a few text prompts. And with Sketch, you can start mocking up collaboratively on the web before importing your sample scene into the Editor.

Just like a physics engine is one of the tools that brings games to life, we also believe that you’ll need a neural engine to solve complicated tasks or create new functionality in your game
Marc Whitten

And that works alongside Unity Sentis? Tell us about the state of play for Sentis. What are the key points to push here, and how does it all fit together?

While Unity Muse helps enhance creativity and productivity, Unity Sentis enables developers to bring complex AI models into the Unity Runtime to solve complicated tasks and create new functionality in a game. Just like a physics engine is one of the tools that brings games to life, we also believe that you’ll need a neural engine to solve complicated tasks or create new functionality in your game. You can take models from places like TensorFlow, PyTorch, Meta, or OpenAI, or from hubs like Keras and Hugging Face. Sentis uses the open ONNX file standard to import and optimize them for the Unity Runtime.

Additionally, because Sentis allows trained AI models to run locally on all Unity-supported devices, you can imagine and deliver experiences like smart interactive NPCs with open-ended quests, object recognition and more – without worrying about cloud compute costs or latency.
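The pipeline Whitten describes - import an ONNX model, optimize it for the Unity Runtime, run inference locally on-device - can be sketched in a Unity C# script. This is a hedged illustration based on Unity's published Sentis API, not code from the interview: the component name, the referenced ONNX asset, and the tensor shape are hypothetical, exact type names may vary between Sentis package versions, and it runs only inside a Unity project with the Sentis package installed.

```csharp
using UnityEngine;
using Unity.Sentis; // Unity's neural inference package

public class OnDeviceClassifier : MonoBehaviour
{
    // An ONNX file dropped into the project is imported as a ModelAsset
    // (hypothetical asset; any ONNX model exported from PyTorch/TensorFlow).
    public ModelAsset modelAsset;
    IWorker worker;

    void Start()
    {
        // Load and optimize the imported ONNX model for the Unity Runtime...
        Model model = ModelLoader.Load(modelAsset);
        // ...then pick an on-device backend: GPU compute here; CPU is also available.
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    public float Infer(float[] samples)
    {
        // Shape (1, N) is a placeholder - it must match the model's input spec.
        using var input = new TensorFloat(new TensorShape(1, samples.Length), samples);
        worker.Execute(input);                       // runs locally, no cloud round-trip
        var output = worker.PeekOutput() as TensorFloat;
        output.MakeReadable();                       // copy results back from the GPU
        return output[0];
    }

    void OnDestroy() => worker?.Dispose();
}
```

Because inference happens entirely on the player's device, there is no per-call cloud cost or network latency - which is the trade-off Whitten highlights for features like smart NPCs.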

Are these AI tools about doing the brilliant work you already do, faster? Or is this about powering up smaller devs to make products that only big teams with deep pockets could make?

We believe that AI techniques will end up being important both in the creation of games but also their play. These two platforms - Muse and Sentis - are where we are building capabilities, integrating them into Unity, and learning with our creators.

Sentis, for example, is about giving game developers more options - at runtime - on how to deliver game features. At Unite, some of our early beta users will show off things like using a novel neural network to measure your breath in VR.

Our goal is to meet developers where they are, help them, and keep them in control. Our tools can help accelerate productivity and enhance and supplement existing workflows. Ultimately, we want you to be able to iterate and experiment faster so you can find the fun in your games and experiences. You can lean on our tools to help you optimize your code, or create a variation of a texture you’re playing with, and much more.

Just like our other tools, we expect to embed our AI tools - like Unity Muse and Unity Sentis - in the Unity workflow so they’re there when you need them, should a developer choose to leverage AI in their workflow.

As I use Unity’s AI tools am I training the Unity model? Is my content becoming part of what Unity could use as part of a solution for another developer as part of their project?

For features that are out of pre-release - for example Chat, Texture and Sprite - it’s up to you. We give users the option of volunteering their content data to help us train our models and improve the offering and experience of our AI capabilities. “Content data” is how we describe the input of the user or the output of the model. This can include prompts, doodles, or other information used in generation. You can also change your mind: if you’ve previously given consent to volunteer your content data, you can opt out at any point.

For features in pre-release - Animate, Behavior and Sketch - content data is essential for making product improvements at this earlier stage of production, so access requires relevant data collection. If you would rather not offer your content data for training, you always have the option to wait until features are out of pre-release to try them out.

So far gaming has got by just great on individual creativity and great minds approaching problems in different ways. With the introduction of AI into this mix, isn’t there a risk that AI systems are going to find THE definitive way to solve a problem, and that devs will end up producing the same content at the end of the day?

We see AI as a tool to enhance human effort - the humans at the other end, using the tool, play a pivotal role in injecting their own creativity and vision to the project they’re developing. The gaming industry is and has always been a breeding ground for delivering some of the most innovative and groundbreaking experiences in entertainment. This is driven by the talent in this space. AI won’t change that - but it will help accelerate the creation workflow, boosting productivity, so developers can put more focus on delivering the secret sauce that makes their games special.

We give users the option of volunteering their content data to help us train our models and improve the offering and experience of our AI capabilities
Marc Whitten

What about the worries of artists and creatives that they’re essentially getting cut out of the mix? While there’ll always be a place for the artist, writer or hardcore coder, it’s looking increasingly like we won’t be needing so many of them. What do you think?

No one produces final code or art in one shot. It’ll still take a lot of iteration, which is why we believe AI won’t replace actual human effort - it will enhance it. Many tools make it easier for creators to realize or express their intent and we expect tools like Muse to do just that - keep artists and creators at the center and give them more tools that they can use to iterate, refine, and mix together in order to realize their vision.

And Unity Cloud makes working across projects faster and easier for teams big and small? What’s the big key plus that Unity Cloud is bringing?

Modern game development requires a diverse set of tools, putting the burden on studios to navigate the changing landscape and growing complexity of making games. Unity Cloud provides new capabilities for collaboration, asset management, and team administration that support workflows for every person in your studio across every stage of development.

The key benefits include:

Get games to market faster. Optimize your production pipeline with automated integrations across development tools to help every member of your studio.

Unify content across your studio. Manage your content in a digital asset management system tailored for the rigors of game development, embedded directly in the Unity dashboard.

Team administration and centralized payment. We provide studio admins with role-based permissioning, solution provisioning, and payments to centralize control over your development teams and development pipeline.

Works with any engine. Unity Cloud is designed to be flexible and extensible enough to work with any game engine, any tech stack, and any platform.

And Apple Vision Pro support. Is that something you feel you have to offer or do you see this hardware and subsequent versions of AR/VR tech really taking off?

As it relates to XR and spatial computing, we believe that something big is happening in virtual, augmented and mixed reality. The space is evolving quickly, and we’re investing deeply to move with it. Made with Unity games make up the majority of content on the latest XR platforms, like Meta Quest 3 and PlayStation VR 2. And we’re committed to making sure that we’re ready to help developers release on emergent platforms from day one.

We saw really strong enthusiasm from our developer community for our visionOS beta program - thousands of developers, from 145 countries, registered interest to participate. At Unite, we’re announcing that the Unity visionOS Beta will be open to all Unity Pro, Enterprise, and Industry customers, enabling more developers to explore the latest tech.

Core to what we do at Unity is to help developers get their games in as many customers’ hands as possible, no matter what device they’re playing on, and we’re really excited about the energy surrounding XR devices.

How’s the roadmap ahead for Unity business looking? It's obvious that you need to get paid for the service you're providing. Do you feel you’ve now got a fair balance in the worth you’re providing and the value your users are perceiving?

When we think about the roadmap ahead, we are laser-focused on delivering value to our Unity users, and this means doubling down on the area they care about the most: the engine. At Unite, we’ll give our community a preview of the next major, supported version of Unity, Unity 6 - formerly 2023 LTS. Unity 6 will come with major performance enhancements and accelerated multiplayer game creation at scale - all to help developers ideate, iterate, and create with more AI tools, optimized WebGPU support, and innovative VR device support. We want Unity 6 to be the best version of Unity that has ever existed, and we are focused on delivering the features and improvements that matter to our creators.


Editor - PocketGamer.biz

Daniel Griffiths is a veteran journalist who has worked on some of the biggest entertainment media brands in the world. He's interviewed countless big names, and covered countless new releases in the fields of videogames, music, movies, tech, gadgets, home improvement, self build, interiors and garden design. Yup, he said garden design… He’s the ex-Editor of PSM2, PSM3, GamesMaster and Future Music, ex-Deputy Editor of The Official PlayStation Magazine and ex-Group Editor-in-Chief of Electronic Musician, Guitarist, Guitar World, Rhythm, Computer Music and more. He hates talking about himself.