With Muse, Unity aims to provide developers with generative AI that is both useful and ethical

Unity joins the rest of the gang in providing generative AI tools for its users, but is careful (unlike others) to ensure that the tools are built on a solid non-theft-based foundation. Muse, the new suite of AI-powered tools, starts with texture and sprite generation, and graduates to animation and coding as it matures.

The company announced these features along with a cloud-based platform and the next big version of its engine, Unity 6, at the Unite conference in San Francisco. After a tumultuous couple of months — a major product plan was completely derailed and the CEO was fired — they’re probably eager to get back to business as usual, if that’s possible.

Unity has previously positioned itself as the champion of small developers who lack the resources to use a more extensive development platform like rival Unreal. As such, AI tools can be seen as a helpful addition for devs who can’t, for example, afford to spend days creating 32 slightly different wood wall textures in high definition.

Although there are many tools to help create or modify such assets, being able to say “make more like this” without leaving your main development environment is always preferred. The simpler the workflow, the more one can do without worrying about details like formatting and siloed resources.

AI assets are also often used in prototyping, where things like artifacts and somewhat janky quality – usually present regardless of the model these days – are not of any real importance. But illustrating your gameplay concept with original, relevant art rather than stock sprites or free sample 3D models can make the difference in getting one’s vision across to publishers or investors.

Examples of sprites and textures created by Unity’s Muse.

Another new AI feature, Sentis, is harder to pin down – Unity’s press release says it “enables developers to bring complex AI data models into the Unity Runtime to create new experiences and gameplay features.” So it’s a BYO-model kind of thing, with some built-in functionality, and it’s currently in open beta.
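Unity hasn’t spelled out the workflow here, but the general shape of “bring your own model” is familiar: train a model elsewhere, export it to a portable format such as ONNX, then load it in the engine at runtime. A minimal, hypothetical sketch of the export half – the toy network, file names, and labels below are illustrative assumptions, not Unity’s API:

```python
# Hypothetical sketch: export a small PyTorch model to ONNX, the sort of portable
# format a bring-your-own-model runtime can consume. Not Unity's actual workflow.
import torch
import torch.nn as nn

class EnemyBehaviorNet(nn.Module):
    """Toy policy network: maps an 8-float game-state vector to 4 action scores."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(8, 32), nn.ReLU(),
            nn.Linear(32, 4),
        )

    def forward(self, x):
        return self.net(x)

model = EnemyBehaviorNet().eval()
dummy_input = torch.zeros(1, 8)  # example input used to trace the graph

# Write the traced model to disk so it can be imported into an engine runtime.
torch.onnx.export(model, dummy_input, "enemy_behavior.onnx",
                  input_names=["game_state"], output_names=["action_scores"])
```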

AI for animation and behavior is on the way, slated to be added next year. These highly specialized scripting and design processes can benefit greatly from a generative first draft or a force-multiplying helper.


A big part of this release, the Unity team emphasized, is ensuring that these tools do not live in the shadow of the IP infringement cases that are sure to come. As fun as image generators like Stable Diffusion are to play with, they’re built on the assets of artists who never agreed to have their work digested and regurgitated.

“In order to provide useful outputs that are safe, responsible, and respectful of other creators’ copyrights, we challenged ourselves to innovate our training techniques for the AI models that drive sprite creation and Muse texture,” reads a blog post on responsible AI techniques that accompanies the announcement.

The company says it uses a fully custom model trained on Unity’s proprietary or licensed images, though it did use Stable Diffusion to, in effect, create a larger synthetic dataset from the smaller curated one it gathered.


For example, a single wood wall texture can be rendered in many variations and colorways using the Stable Diffusion model, but no new content is added – or at least that’s how the company describes it as working. The resulting dataset, then, is not only based on responsibly sourced data but is a step removed from it, which reduces the possibility that a particular artist or style is imitated.
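Unity hasn’t published its augmentation pipeline, so the following is only a minimal sketch of the general technique it describes – an image-to-image diffusion pass that spins a licensed seed texture into many close variations. The checkpoint, file names, and parameters are assumptions for illustration, not Unity’s setup.

```python
# Minimal, hypothetical sketch of image-to-image augmentation: turn one licensed
# seed texture into several close variations. Not Unity's actual pipeline.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # placeholder checkpoint, not Unity's model
    torch_dtype=torch.float16,
).to("cuda")

seed_texture = Image.open("licensed_wood_wall.png").convert("RGB").resize((512, 512))

for i in range(8):
    # A low strength keeps each output close to the seed image, so the synthetic
    # set stays anchored to the licensed source rather than to a scraped style.
    result = pipe(
        prompt="seamless wood plank wall texture, high detail",
        image=seed_texture,
        strength=0.35,
        guidance_scale=7.5,
    ).images[0]
    result.save(f"synthetic_wood_{i:02d}.png")
```

The variations could then serve as training data for a smaller, purpose-built texture model – roughly the two-step arrangement Unity describes.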

This method is safer, but Unity admits that the initial models it produces are of lower quality. As mentioned above, however, the actual quality of generated assets is not always of great importance.

Unity Muse costs $30 per month as a standalone offering. We will no doubt hear from the community soon about whether the product justifies its price.
