AU Class

From In-House to Off-the-Shelf: USD in production at Animal Logic


Description

We’d like to share our journey of how we integrated Universal Scene Description (USD) in Maya into our production pipeline. We’ll set the stage by describing life in Maya prior to USD, when we used a generative scene building framework called Shotsetup. This tool had its strengths and weaknesses, and led us to adopt USD for its promised functionality. We’ll share a technical description of the choices we made, presenting the main artist-facing tools fueled by AL_USDMaya (EnvironmentStudio2, Forge2). We then made the decision to open source the plugin, and will share the reasons why and the lessons learned. Finally, we will describe the collaboration with Autodesk around the unifying goal: from 3 plugins to 1.

Key Learnings

  • Understand the history behind USD and the production challenges it solves
  • Learn about the collaboration between Pixar, Animal Logic, Luma Pictures, and Blue Sky behind the USD plugin for Maya
  • Understand how USD workflows have been integrated into the latest version of Maya
  • Get a glimpse into the future of USD workflows at Autodesk

Speaker

  • Fabrice Macagno
    I have been working in the VFX industry for almost 20 years, joining Animal Logic (Sydney) 5 years ago, where I lead the Scene Description team: our goal is to provide a fast and flexible scene building and authoring platform to artists, TDs and production staff. We love USD and all it has to offer! I am passionate about large scale/complex project issues such as highly collaborative environments, robustness and scalability.
Transcript

PRESENTER: Hi everyone, and welcome to this class where you will learn about maya-usd and how Animal Logic integrated USD inside Maya. The main takeaways that I hope you're going to get from this class are, first, to get a better understanding of the production challenges that USD helps to solve. Second, to learn more about the great collaboration which took place between Autodesk and several studios to create a new Maya plugin, a USD plugin for Maya called maya-usd. Third, to understand the kind of workflows that can be put in place when you use Maya in combination with USD, by looking at some examples of the workflows we've put in place at Animal Logic. And finally, to get a preview of what's coming next inside maya-usd.

And let's start with some production challenges. So typically a production pipeline will look a bit like this. It's a very structured process which is centered around several departments. You've got the modeling department, which delivers 3D meshes, the rigging department, which delivers controllers, skeletons and deformers, and the surfacing department, which delivers materials.

And then we switch over to the [INAUDIBLE] based departments, with the performance departments, including layout [INAUDIBLE] animation, which are responsible for bringing animated characters to life. You then have the lighting department; they are the directors of photography for the movie, setting the atmosphere and lights and also configuring the 3D renderer. Then you've got the effects department, who are the magicians creating destruction, explosions, fire, water. And the final department is the DI. Digital intermediate is responsible for the final color correction.

An important aspect of the pipeline is the asset lifecycle. Each of these departments follows more or less the same pattern, where a scene is built using incoming assets and some work is done, published, reviewed, and depending on the outcome of that review, some more iterations will be needed. And once the work is finally approved, it will be delivered to the downstream departments.

To give some context about Animal Logic: it's been around for 30 years, it's celebrating its birthday this year, and over all this time it's been involved in various types of projects, including commercials, hybrid feature films, and full CG feature films, like The LEGO Movie franchise, Happy Feet and Peter Rabbit 1 and 2. It's spread across two studios in Sydney and Vancouver and employs over 700 people, including artists, technicians, and support staff.

An important aspect of the work that Animal Logic is currently doing is that it's able to handle several full feature films in parallel, which means that we have to solve the challenges of working at scale. So I've listed here three main challenges that we want to solve. The first one is parallelisation, which means essentially trying to have each department start as early as possible, to decrease production times and increase the general throughput of the project. The second is efficiency. That's intradepartment, where you want to give artists the chance to iterate as many times as they need, which will of course increase the quality of their work. And finally, there is the reusability aspect for the assets, which will help to decrease costs.

So the aim, of course, is to move from a more waterfall approach for the pipeline to something which is more agile and more nimble, in the sense that you will be able to create more connections between departments and to add more fluidity, which will help to bring more creativity to the production process.

So now let's talk about USD. USD stands for Universal Scene Description. It was made public by Pixar in 2016. So why Universal first? I guess that was a statement from Pixar, hoping that this software would become a standard. And I think that's what's actually happening. So I've listed a few production challenges, but we also have technical challenges in terms of pipeline engineering. In particular, a pipeline from a technological perspective is constantly evolving: we've got new software, the proprietary software stack is also evolving, and you also want to try emerging technologies. So having the scene description decoupled from DCCs is one way of enabling faster integration and giving more room for innovation.

So the other part of the name, scene description, was also I think carefully chosen, because USD is much, much more than a simple file format like Alembic, for instance, in the sense that it's also meant to represent whole scenes and the assembly of several nested assets.

So I've listed here, three important features the USD brings in our pipeline. First, is to represent in a layout fashion, the composition of several contribution coming from different department. So it will help with parallelisation and reusability. USD also provide very efficient in-memory representation of the scene graph and rendering framework, and that will be extremely helpful to tackle efficiency.

So to come back to the structural aspect of USD, I've listed here three important features; there are many, many more. First there is what's called composition arcs. We would normally talk about references, but in USD a reference is a particular type of composition arc. That's how you create the connections between a high-level representation of the asset and smaller components.

So again, you will be able to represent more parallelised and more reusable assets. Then come the layers and opinions. That's also a way to help with parallelisation: not only are you able to assemble complementary parts of the same object, but you are also able to bring several opinions about the same object, and depending on the order, one opinion will win over the others. And finally, there is the variants mechanism, which allows the same scene to provide several representations. It's used at Animal Logic to represent various levels of detail, to optimize scenes.

So the other important aspect is the live authoring aspect of USD. It's centered around the stage object, and at the heart of this stage you will find a composition engine, which will leverage the scene graph and provide the various representations through the USD API. As we will see in the examples, you are able to interactively switch between the various variants that have been authored. You are also able to do what's called edit target selection, which we will also see in the examples, and to do any type of property authoring. And finally, and that's a very, very brief overview, comes the Hydra rendering framework.

So Hydra itself is decoupled from USD. It ships with USD but is decoupled from it. It's essentially a framework where you can connect a scene representation with a renderer plugin: you have to implement one delegate for your scene description and one for your renderer. USD of course provides a scene delegate for USD itself, which is called USD Imaging, and provides one renderer called HdStorm, which is an [INAUDIBLE] renderer. But there are already a few other Hydra render delegates available, one for instance for Arnold, and another one for the VP2 viewport in Maya.

So here is a brief timeline of the integration of USD at Animal. It wasn't all rainbows and unicorns; it took us several years. We took a more brownfield approach. The early prototype started in 2015 with an internal file format, which we decided to drop mid-2016 and replace with USD itself. That went through a project called Animation 2.0.

The goal was for us to switch from XSI to Maya, and we also took the opportunity to use USD as our scene description file format. Then the pipeline projects started. That's what we call the technical migration, where we wanted to keep the existing workflows but replace the underlying scene description. Peter Rabbit 2, released in 2020, was the first show resulting from that first migration phase, so it was fully run on USD. On the other hand, The LEGO Movie 2, released late 2019, was the last movie not using USD at all. In late 2020, the technical migration was complete and we started to work on new workflows.

Now let's talk more about Maya and the integration of USD inside Maya. Maya is the main DCC at Animal Logic; it's used for modeling, rigging, and all the performance departments. When USD was released, it also shipped with a plugin for Maya called pxrUSDMaya. It had two important features. One is a static importer/exporter, which is what you would normally think of with a format like Alembic. It also provided the ProxyShape custom node, which allows you to display a USD stage from within Maya, so you keep that connection. And the design the Pixar plugin was geared towards was a workflow centered around Maya scene assemblies, with multi-ProxyShape support, which wasn't exactly what we needed. So we decided to go ahead with our own plugin.

So to go back to these two important and different features, because they're not really mutually exclusive. On one hand, you've got the static import/export. That's the easiest way to integrate USD inside Maya, where you would simply replace your Alembic files with USD files, so it's not very disruptive. But on the other hand, when you do that, when you import your USD objects into Maya objects, the USD high-level concepts are lost.

So at Animal Logic we still use the static export to bake all caches, for instance. But for the rest we rely on the ProxyShape for any interactive manipulation from within Maya. With the ProxyShape, what you get is that once it's imported inside Maya, all the USD high-level concepts are preserved. And out of the box, you get access to Hydra and its render delegates. But there is a price to pay for that, because now you are working with custom nodes. And because USD represents a full scene, you want to give the same sort of native experience. That's where there needs to be a bridge between the two data models, USD and Maya.

So our first attempt at Animal Logic to bridge these two data models was to use a combination of custom transform nodes and also proprietary software to deal with stage editing. These custom transform nodes would be used to manage selection and transformation, whereas the front-facing widget would handle the stage-authoring part. Another workflow that came a bit later, which is more hybrid, is to still use the ProxyShape, but then be able to push some USD prims inside Maya. They can then be edited and pushed back to USD. So the result of our first phase of development was AL_USDMaya version 0. The development started in 2016 and it was used in production on Peter Rabbit 1, for example.

So there is a custom translator plugin framework, which is completely different from the one found in pxrUSDMaya. The render delegate had been slightly modified, and as I mentioned, there was this custom transform node to help with the interaction with the stages. In terms of results, it was slightly mixed. The integration in the animation department went pretty well. Regarding the layout department, the integration was OK.

But there was still a real need for full integration between USD objects and Maya native objects. The integration in the environment authoring department was the most stressful, mainly because the changes were, we think, too abrupt, and also the platform turned out to be quite unstable. But we also learned from that: we needed to take a more cross-functional approach, bringing artists and developers closer together.

In 2017, we went ahead and open sourced AL_USDMaya, simply because USD itself was open source. Overall, the experience was fantastic. We got great external contributions. It helped us to improve the code quality, and we also improved a lot of our infrastructure around AL_USDMaya testing, particularly by using Docker containers. What didn't go that well was our Git workflow to manage internal and external contributions. It turned out to be quite complicated, and we decided to go with something easier to manage for our next iteration. Also, the community management side of things was clearly underestimated in terms of overhead, so we decided to allocate more resources in that area.

Now that AL_USDMaya was released, there were two plugins, and that was, of course, a source of confusion. So in 2018, there was an agreement between Autodesk, Pixar, and Animal Logic to create an official open source plugin called maya-usd. It would be based on pxrUSDMaya to start with. Both legacy plugins' source code would be migrated from the original repositories to Autodesk. A new core shared library called libmayaUsd would host new features. And Autodesk would implement dedicated VP2 render delegates and a UFE integration, which took care of the heavy lifting for the migration.

So as I said, we had at Animal Logic that first attempt at bridging data models. And Autodesk's answer to that problem is UFE, which stands for Universal Front End. It's actually DCC agnostic; it's not coupled to Maya. It's a system that allows you to connect an underlying scene description with a DCC engine. In our case, it's about bridging the gap between USD and Maya, and that will enable the synchronization of hierarchy, display, selection management, transformation, even undo management. And because it's an interactive platform, there is the need for a kind of notification system, and that's where the observer design pattern has been applied in UFE.

Migrating these two legacy plugins into one unified plugin wasn't a simple task, because it involved various partners with different needs. And I think overall, it was a good example of a really great collaboration. A Technical Steering Committee was put in place to manage this transition. There were regular meetings in which priorities were discussed together. And for larger initiatives, like for instance how to represent the USD transform stack inside Maya, there was a dedicated process with white papers, lots of back and forth, and draft PRs.

I also wanted to give a shout-out to the customer charter initiative, which I think is a fantastic way to engage the community when a project is still nascent. From our perspective, the migration was extremely successful. We really appreciated the open mindedness, taking into account all the various partners' opinions, the extreme transparency of the process, and its iterative fashion, which really helped us to not disrupt our productions too much while migrating to the new repository. Of course, by handing over the project to Autodesk, we lost a bit of velocity in our interaction with our users, because now there is a loop involving other partners, but overall it was much more than worthwhile.

So the result of that migration was AL_USDMaya V1, which is currently used in production. The translator plugins haven't changed much. The render delegate has been switched to the VP2 render delegate implemented by Autodesk. And we are also leveraging UFE in our interactions with USD inside Maya. We are still looking at solving other challenges involving USD, and I've named a few here: proceduralism, having an ultra-high-quality real-time renderer, and cloud migration, which is also a really big topic at Animal Logic.

And now, let's take a look at some examples with images and videos. To start with, I wanted to introduce the ALab, which is an asset that Animal Logic recently published. I've given here a few links. Animal Logic's intention was really to share how we represent our assets, and to give a place where people can experiment with something which has a more real-life scale, because that's an important aspect when you want to experiment with things, compared with using a much simpler object.

So the first example is the push/pull example that I mentioned earlier. This workflow will be available in maya-usd, but for now, at Animal Logic, it's implemented using a combination of a tool called Environment Studio 2, a few AL_USDMaya commands and some Python code. What you can see here in this image is Environment Studio, which displays the content of the stage, and this object here has been pushed to Maya, so it's displayed here with a different icon. And we can see here the Maya native mesh that has been imported. The artist can then go ahead, edit it natively in Maya, and then push it back using that menu.

A second workflow that I wanted to show today is the set dressing workflow. It involves the same proprietary tool called Environment Studio at Animal Logic. An artist would want to move things around and to hide things, but also to optimize their scenes, which is a different set of changes; I'll explain that a bit later. An important aspect of this demo is also the edit target management, which I mentioned earlier, and that's a feature that will be available in maya-usd.

So what we will see in this demo is the manipulation of various properties of the objects in the scene. This is Environment Studio 2 that we can see here, displaying the hierarchy content of the scene, with some asset types listed here in the column. And at the bottom, we can see that the selection is synchronized with the viewport, which means that when an object is selected in one, it is selected in the other.

Here is the view of the current edit target. It's empty, and once a transformation is applied, the change will be recorded to that layer. What's going to happen next is that another type of change will be applied to the scene, and based on some Animal Logic rules, we want these changes to go to a different edit target. In this case, it's about switching the display mode of an object to optimize the scene, and we want these changes to be stored in what we call a volatile edit target. Which means that it's not persistent; it's just there for the artist to have a better interaction with the scene.

So two different display modes are applied to two different objects, and we will see now that the changes are recorded in a different edit target. Same thing here for this object: it will be hidden. And there is this difference in USD between hiding something and disabling something. Hiding, for us, is recorded in the volatile layer, whereas disabling is recorded in the persistent one.

So we can see that here in the persistent layer, only the transformation has been recorded. Now we're looking at the other edit target, in which we will find the other set of changes. First there is the switch to invisible for the first asset, and then we find the two display modes, one for the simple bounding box, and the other one for the shaded bounding box.

So the last modification is to switch the asset and disable it completely. This type of change is meant to be recorded in our persistent domain layer. So Environment Studio will go ahead, switch the edit target and record that change in that layer. We can see that the object has disappeared, and once we reload the edit target, we can see that the primitive has been disabled. So that's it for the set dressing workflow.

Here is a very brief example of the same ALab asset imported inside maya-usd in Maya 2022. It shows the native integration provided by UFE, and native transformation with undo support. We can also see the VP2 render delegate displaying the textures, transparency, you name it. It's of course possible to manage hiding objects. There is also variant display and variant selection support. And as I said, the edit target management will soon be implemented in maya-usd.

Another workflow which is extremely important at Animal Logic is the rig to animated cache variant switch. It's used by animators to optimize their scenes. Essentially, they normally work with several characters, and they want to keep their interaction as fast as possible. So we represent the animated characters with at least two variants: one is the rigged variant and the other one holds the baked cache.

The cache is much faster to display, but of course can't be animated, whereas the rig is oftentimes heavier, but that's where the animation takes place. Another point is the fact that we export our animation caches as sparse point positions instead of baking the whole mesh. And to apply these point positions over the static positions, we use the USD opinion mechanism that I described earlier. That allows us to save some space.

The rigs are imported using a custom Maya reference type. The Maya reference mechanism will also be added to maya-usd. In our case, once this Maya reference is imported, we've got a set of Python translators kicking in and building the rest of the ecosystem, in particular importing proprietary deformers, the GPU deformers. Then the front-facing tool called Forge 2 will go ahead and apply some variant selections, depending on the artist's request.

Here is a view of how a shot looks at Animal Logic. That's pretty much what you would find on a production shot. It's the list of all the contributions from all the departments. In particular, there is the contribution coming from the animation department, which brings in the animation cache. And in the assembly domain layer, that's where you will find, buried deep down, the global assets with the static representation.

And in these two images, you can see a scene which has been built using Forge 2 in Maya, where the character has been loaded and switched to its rig representation. It's displayed here in the hierarchy. We can see here, by looking at the nodes, that the Maya reference has been imported, and that other nodes have been added to the scene to manage the animation of that character. Once the animation is complete, the artist can bake that animation to a cache and switch, using the interface, from the rig variant to the cache variant.

And this concludes my presentation today. And I hope you learned more about the kind of workflow you can enable by using Maya in combination with a software like USD. Thank you for attending this class. If you have any questions, you can reach out to me directly or using our ALab forum. Thank you.