Description
Key Learnings
- Learn about the Media and Entertainment Autodesk Platform Services API strategy.
- Learn about the new Flow Graph Engine API.
- See how the Media and Entertainment Data Model will bring data more easily to customer workflows through the API.
Speaker
- Kevin Vandecar
Kevin Vandecar is a Developer Advocate Engineer and the manager for the Autodesk Platform Services Media & Entertainment and Manufacturing Workgroups. His specialty is 3ds Max customization and programming, including the Autodesk Platform Services Design Automation for 3ds Max service. Most recently he has been working with the Autodesk Platform Services data initiatives, including the Data Exchange API and the Fusion Data API.
KEVIN VANDECAR: So hello, everyone. My name is Kevin Vandecar. I'm part of the Autodesk Developer Advocacy and Support group. And today we're going to talk about how media and entertainment is joining up with the Autodesk Platform Services. So we'll start with an introduction to the Platform Services and we'll talk about the new Flow services as well.
So I want to start with the Safe Harbor Statement. We've seen this before and the gist of this is just to make sure you don't make any purchasing decisions based on software that's in beta or may not be released yet. So the agenda for today is we're going to talk about Autodesk Platform Services and give an overview, just so everyone is aware how these Flow APIs will be available. We'll talk about the M&E Flow strategy overall, and then we'll show the Flow Graph Engine Service and the Flow M&E data model vision.
So we'll start with what is APS. And the very first thing I want to point out is that we had the Platform Services branded as Autodesk Forge in the past. So whenever you see Autodesk Forge, there's some history on the internet and so forth, just always equate that with the current terminology, which is Autodesk Platform Services. Autodesk basically made the decision that because this is really going to be the core of not only our cloud presence, but how data gets authored from our desktop software, just wanted to really point out that this is the platform of the design and make industry.
So to start, APS has really enabled this digital transformation for a lot of our customers. It connects the design and make platform to the business processes. And here's just a few examples. Office is a great one, where people have connected with things like Power BI to extract data into Excel and do rudimentary reports and things like that with the data that's coming from the platform.
And then, of course, all of these other activities are really enabled by the platform. So ERP, asset management, which is going to be really big for the Flow strategy. Collaboration is also a really big one. And then CRM, even. Customer management is even better when you have the data from things that you are designing and making.
So Autodesk Platform Services has been around for a number of years. And like I said, it was called Autodesk Forge in the past. But all of the underlying APIs are the same and are very mature.
The main use case here is connecting data, workflows, and people. And in this example, it's showing the Model Derivative service, which is taking all kinds of different data inputs and processing them through the Platform Services and bringing really unique workflows. Viewing, planning, reviewing the things, asset management, maintenance, purchasing, all of those activities can be derived from this data coming from the platform.
So to take a look at Platform Services from an API perspective: on the left are the core APIs, basically Data Management and Data Visualization.
We also have the Viewer API, which we also refer to as the APS Viewer. Sometimes you'll still hear it called the Forge Viewer. And then there's the Webhooks API, which allows you to get notifications from the platform when tasks are finished, and things like that.
In the middle, you'll see the Data APIs. So we already have the manufacturing data model and the AEC data model in a production state. So AEC Data Model has even gone to general availability. And then we also have Model Derivative API, which has been around for a number of years, and that's our translation service.
If you're familiar with a feature called Shared Views within some of our desktop products, that's the Model Derivative API running underneath it. It's a great example of our desktop products using the platform directly to produce things that can be shared and collaborated on in the cloud. And then finally, I want to mention the Data Exchange API. Data Exchange is another data model that allows interop between products and data models. That one is in beta as well at this point.
And then finally on the right, cloud product APIs. So the most mature one that we've had there for a while is the Design Automation API. And this is basically running some of our desktop software in the cloud, allowing you to automate common tasks and freeing up desktop resources to be able to continue using your licenses in a desktop way while you batch process, and extract data and things that you need from the cloud instance. You let the compute take place in the cloud where it can be more efficient and more affordable, even.
And this is where the new Flow Graph Engine API will fall. It's basically also a compute service that's very similar to the design automation API in that it allows you to batch process Bifrost graphs in the cloud. And then we also have some AEC product APIs. Forma API is relatively new. It's from the new Forma product.
Tandem API is the same. It's for digital twins in the AEC space. And then we have the AEC and also the older BIM 360 APIs. And just want to point out that all of these APIs are industry standard, well-formed APIs. Most of them are REST APIs.
But as we move into data more and more, those data models are using GraphQL, which is even a better way to get access to granular data. So with a REST API, typically you would get back the entire chunk of data and it can be very large from products like Revit, or even in the M&E space, products like 3ds Max or Maya.
So with GraphQL, it allows you to drill down and query and ask for just the data that you want. And then, of course, all these other technologies. Of course, the frontend development, typical HTML, things like that. And then all the common programming environments are also supported. And because we're using REST and GraphQL, even programming environments that are not listed are easily supported, as long as they can support these environments.
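To make the REST-versus-GraphQL difference concrete, here's a minimal Python sketch of the kind of request body a GraphQL client sends, asking for only the fields it needs. The endpoint URL, query shape, and field names (`assets`, `id`, `name`, `version`) are illustrative assumptions, not the real data model schema; check the actual APS data model documentation for real field names.

```python
# Hypothetical GraphQL request against an APS data model endpoint.
# URL and schema fields are placeholders for illustration only.
APS_GRAPHQL_URL = "https://developer.api.autodesk.com/graphql"  # assumed


def build_asset_query(project_id: str, first: int = 10) -> dict:
    """Return a GraphQL request body that asks for just the fields we need,
    instead of downloading the entire model as a REST response would."""
    query = """
    query GetAssets($projectId: ID!, $first: Int) {
      assets(projectId: $projectId, first: $first) {
        results { id name version }
      }
    }
    """
    return {"query": query, "variables": {"projectId": project_id, "first": first}}


# Build the request body; a real client would POST this to APS_GRAPHQL_URL
# with an OAuth bearer token in the Authorization header.
body = build_asset_query("my-project", first=5)
```

The point is that the server returns only `id`, `name`, and `version` for each asset, rather than the whole scene graph.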
So to take a quick look at how Platform Services fits, of course, you have your front-end experience with HTML, CSS, and probably JavaScript. And then you can start adding in all these other components. So MongoDB is a common one for database storage. Our viewer is based on WebGL and Three.js. And so we're consuming some of these common technologies in our use as well.
And storage is another big one. So Dropbox, Google Drive, so forth. You can basically use their APIs to bring that data into the Autodesk Platform Services as you need and vice versa, as needed. And then, of course, all the big players in cloud services are completely supported and can be used.
And then finally, some of these other side services like Microsoft SharePoint, for example, or IBM Watson, and even the Autodesk Construction Cloud, these are supplemental services that all just work well with building complete solutions. And there's all kinds of other things that are not even listed here. So we just want to point out that Autodesk Platform Services is just another web component that you can consume to build these really robust, cloud-based experiences.
So to give you a quick look at our vision for Platform Services: you'll see in this diagram that Autodesk Platform Services sits at the bottom, and everything we do for cloud-based workflows is built on top of it. The Autodesk data model is a great example. It's using an asset-graph-style database. That's a programming term; don't confuse it with asset management.
But that graph style of database to store this data at the low level is a really powerful way of having high performance access to your data in a granular fashion, while also maintaining versions, and relationships, and things that are very important to the data itself. And also, that concept then becomes your source of truth. And then on top of that is where our industry clouds are built.
So Fusion and Forma are already building themselves out and are pretty mature already. And Autodesk Flow, for the M&E industry, is coming online now. That's the point of this presentation: to bring you up to date on what APS brings to the Flow industry group.
So let's talk just a little bit about what customers and partners build with APS. So the most common one is dashboards and reports. Basically, getting access to your data and understanding what that data means.
So for example, maybe looking at how many versions of an asset you have, looking at how many instances of that asset is used in actual scenes, or shots, and things like that. So once we have this data in place in a single location, it's managed by the Autodesk Platform Services and it makes it much easier for you as the consumer and producer of that data to get back statistical and other great information.
On the AEC and manufacturing side, digital twins have become very important. So that's another big use case. ERP, CRM we mentioned earlier. That's another big use case.
And configurators and design automation is another great one. And I think this is going to be an important one in the Flow industry group. And just talking about Flow Graph Engine as an example, there's no reason you couldn't have a standard Bifrost effect and have that configurable by your customer, or your user, or your employee to produce multiple outputs to simulate something in a specific context.
So this all results in building things on your side that would include data connectors to get access to the data, maybe asset catalogs so that you could review your assets very easily, either within your own studio, or maybe you're a marketplace and you want to sell your asset. So we've seen this already done in the manufacturing space, for example.
And then, of course, review is another big use case. Things like retopology, Bifrost simulations, those are all important aspects where you want to experiment, but also review the results before you actually put those assets into production.
So we do have very strong support for your APS journey. These are very open APIs, like I mentioned, REST and GraphQL. We also have a marketplace. So if you're a commercial developer and you need to market your tools, the marketplace allows you to market even your cloud-based workflows.
And then we have a very strong onboarding and support program and that's what my team leads. So developer advocacy and support. We run activities such as bootcamps, accelerators where you can come with an idea and spend a week with our experts to build out your proof of concept, for example, developer conferences, and, of course, things like Autodesk University and SIGGRAPH. We participate in all those types of activities.
So before we dive directly into some of the new Flow material, I want to mention that M&E has already had a presence in the APS services. These have been in general availability for a number of years, and it mostly centers around 3ds Max. And that makes sense, because Max has always been the product closest to the CAD visualization realm.
And so it makes sense that 3ds Max is already there, but we're going to see more and more M&E tools coming into the mix. Just as an example, 3ds Max is supported in Model Derivative, along with other formats like FBX, OBJ, and glTF, and Design Automation is also supported for 3ds Max. I'm going to show you examples of both Model Derivative and Design Automation.
So we'll start with the Model Derivative workflow. In this example, I want to show the Model Derivative service that is part of Autodesk Platform Services, or APS, and how it can handle M&E-style scenes. So in this particular case, bringing in a 3ds Max native file, so the .max format.
And it has the physically-based rendering materials, so PBR materials. And basically, when PBR materials are present in the scene and you translate your scene, those materials will be loaded by a standard surface extension that is part of the APS viewer.
So it gives you the ability to have pretty decent materials without having to use custom shaders and things like that to get previews, and maybe e-commerce type scenes that are full 3D in behavior that come straight from products like 3ds Max.
Model Derivative also supports FBX and so forth. And the APS viewer is actually running here and I'm actually doing all of this within Visual Studio Code. We have an extension, which I mentioned in the tools. And basically, the extension allows you to do some of the base APS activities directly in Visual Studio Code, which is a great environment for testing and kind of understanding the behaviors of the APS APIs, while getting real-time feedback and working directly with the services.
So the viewer itself is a full 3D viewer, has all the standard viewing capabilities, has a model browser, which helps you to navigate through the scene, isolate things. You can also turn things off, show all objects, turn one or two things off as you want. And so you have quite a bit of capability here, right out of the box.
Standard navigation tools as well. You can always go back to the home scene. And pretty decent online viewing capabilities. Let me just show you one more example of that before we move on.
So this is another scene from 3ds Max using physically-based materials. In this example, it's this astronaut guy. And one of the cool things here is, again, the ability to look at different components. So I could hide the geometry.
And just look at the biped, for example. So if you wanted to do some review and see whether this scene was set up with the proper biped, you could, of course, view it this way. Or we could do the opposite and just select it. Or we could basically just highlight the different parts we want to look at. So again, navigation is pretty decent and shows quite a bit of capability here.
OK. So the next example is using Design Automation for 3ds Max, and I just want to show some of the capabilities there. In this short video, I'm going to show how 3ds Max Design Automation can work in a web app.
Basically, 3ds Max is running in the design automation tech stack and that means you can pretty much automate anything in the cloud that you might normally do on the desktop. And so you can see that this is a custom app. It's been deployed to a certain location using Heroku in this example, and it has some of the parameters that you might see in Pro Optimizer, but parameters are not absolutely necessary.
So basically, we gather up these inputs. Could come from wedging, or other types of activities. But in this case, it's got a user interface. There's three different values that are going to be used, which will optimize in three different outputs. And then we can start the work item with this.
So basically what's going to happen when we start the work item is it will send all of these inputs, including the Max scene that has the mesh in it to the cloud and automate the Pro Optimizer against it. You see this work item started and finished. And then we're actually using the APS Viewer as a way to preview the results.
So the top here is the original model and its basic output. And then you can see in that dropdown, we're looking at different versions of the series of values. So the 25%, the 35%, and the 65%. And we can toggle on the wireframe mode to get a little bit better idea of what this looks like.
So you'll see it toggled on here. And there's the resulting mesh with the original, again, on the top. So it just shows one example of what you could automate in the cloud. And just wanted to make sure we highlighted that 3ds Max is part of Design Automation and has been there for a number of years. People are using it for various things.
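As a rough sketch of what the demo's work item submission might look like programmatically: the payload shape (an `activityId` plus `arguments`) follows the general Design Automation v3 pattern, but the activity name, argument keys, and URLs below are hypothetical placeholders, not the demo's real values.

```python
# Sketch of queuing three ProOptimizer work items, one per optimization
# percentage, as in the demo. All names and URLs are illustrative.
def make_workitem(optimize_percent: int, input_url: str, output_url: str) -> dict:
    """Build one Design Automation work item payload (hypothetical activity)."""
    return {
        "activityId": "myNickname.ProOptimizeActivity+prod",  # assumed name
        "arguments": {
            "InputMaxScene": {"url": input_url},
            "OptimizePercent": str(optimize_percent),  # assumed parameter key
            "OutputModel": {"verb": "put", "url": output_url},
        },
    }


# The demo used three values (25%, 35%, 65%) to produce three outputs.
workitems = [
    make_workitem(p, "https://example.com/in.max", f"https://example.com/out_{p}.max")
    for p in (25, 35, 65)
]
```

Each payload would then be POSTed to the Design Automation work items endpoint with an authenticated client; the service runs 3ds Max in the cloud and writes each optimized mesh to its output URL.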
OK, so let's now move on to the M&E Flow strategy. So we looked at what the Autodesk Platform Services offers in general across the whole design and make space, and the M&E Flow strategy is coming into the mix now more formally. And really, the point of the Flow strategy is to provide some of these high-level goals.
So one of the big ones is to help orchestrate open standards. That's a big one. We'll talk about that in a minute. Better collaboration, tackling complexity and asset management while delivering flexibility, unlocking data and insights. And that's what we talked about before with the concept of dashboards and being able to get insight into your data as well as just accessing your data.
And of course, everything will follow the APS guidance in terms of being API-first. REST APIs will be used on the compute side and GraphQL on the data side. Additionally, because M&E workflows typically include Python, and a lot of our desktop products, of course, support Python, there will be Python SDKs to help pipeline developers. And the tagline for this effort has become "unbridled creativity."
So from a standards perspective, I just want to point out the standards we're working with. MovieLabs, we work with directly. The VFX Reference Platform is something we've supported and followed for many years; tools like Maya and 3ds Max support the components it specifies.
And then the Academy Software Foundation is also another one. So we want to keep these standards in mind so that the out-of-box experience is as close to industry standards as possible, but also giving you flexibility to alternate to the demands of your workflows and pipelines.
So to start, I want to mention some rebranding. Anyone using these products already knows this, but just to frame the bigger picture: the Autodesk Media & Entertainment hero products remain the same. Products like 3ds Max and Maya are not going to be rebranded.
They will be integral to the Flow strategy, though. So don't think of them as separated out; those are our key portfolio products and they will be part of this Flow strategy. The products that were already cloud-connected will be moving to the Flow terminology.
So Flow Production Tracking, which was formerly ShotGrid, was already rebranded earlier this year, and only the name changed; the functionality remains the same. And as we move forward, you'll see Flow Production Tracking start interacting with the Flow Data Model at some point in the future as well.
And the same goes for Flow Capture, which was formerly Moxion. Both of these tools already have cloud connectivity. And so the industry group just felt that putting Flow terminology into these names was really important to show the strategy behind the industry group. There is a link here that will be also in the handout where you can sign up to stay connected to the Flow strategy as well, and you'll get newsletters and be able to see how things develop.
So let's take a look at the typical environment from a programming perspective and a conceptual perspective of where the M&E workflows come from. So there's always data somewhere. And so the vision is that Autodesk Platform Services will bring and allow this authoring of data and access to this data. And then at the API level, all of these different operations will be provided through APIs. And those APIs will be reused by any of our products. So up here, you'll see these products.
So that really supports the concept of API first. Autodesk tools are going to be using the same APIs that will be made available to you as a developer and a customer. So the vision is really clean and good. And to just point out where we're starting, compute and services is one big area, and that's where the Flow Graph Engine API comes into play. And then Asset Management is another one, and that's where the data model is really important.
So the two services that we're going to talk about today that are on the horizon, Autodesk Flow Graph Engine Service, currently in public beta, and Autodesk Flow Media and Entertainment Data Model. And we'll talk about the vision there, because we don't have any customer access to it yet. So we'll start with Flow Graph Engine Service.
The primary goal for this service is to provide a means for Flow customers to use compute services by offering M&E compute software as a cloud-enabled capability. So what does that mean? What it means is basically you, as a customer, will be using these services, whether you know it or not, within our products. But because they're also API-based, those services can be automated as well.
So for example, in the context of Flow Graph Engine, it's Bifrost graphs that you're executing. You'll be able to execute them directly from within the Maya environment, for example. But what if you want to batch process a bunch of those graphs outside of Maya without tying up your Maya licenses for that activity? You can use the API to execute those graphs directly in the cloud, get your results back, and free up your desktop resources.
So some use cases here are building bespoke compute workflows by external customers and also internal projects. And that's important to keep in mind that Autodesk is using the same APIs that we're making available to the public. Accessing VFX compute software as cloud-enabled services to integrate into studio pipelines.
As for current capabilities and benefits: you can currently execute Bifrost graphs, and we're looking at other operations to add. Retopology is another tool that's already using the graph service from a customer product workflow perspective, and we expect that to be made available through the API at some point in the future as well.
And this allows users to access compute instances running VFX compute software as cloud-enabled capability. We are looking for input. So if you have other operations that you think would make sense here, you can use that feedback link for the beta and provide us with input.
So let's start with what Bifrost provides. A lot of even my customers are not really aware of what Bifrost is. It allows you to create stunning procedural effects. It offers a visual programming environment, a graphing environment: you drag and drop nodes to connect different behaviors to your scene and your geometry.
The output is realistic simulations and effects. The example here scatters objects procedurally: on a plane that represents some terrain, it scatters trees across the surface, making sure the trees always sit on the plane in a random pattern. Bifrost also supports creating and editing USD scenes.
Now Bifrost ships with Maya, and this is what it allows within the Maya context. But there's also a standalone version of Bifrost, which has a full desktop SDK and can be used to programmatically create graphs. In theory, you could take the graphs you create programmatically with the SDK and send them for Flow Graph Engine evaluation as well. So there are lots of possibilities here.
So how does it work? First of all, know that Bifrost can create a massive amount of data and can take a lot of time. That's where the benefit of the cloud comes into play: the compute is contained in a virtual machine environment in the cloud, where it can run more efficiently.
So if you were to run this on a desktop machine, it's going to consume those desktop resources. Even if you don't have an artist sitting in front of it, that machine is essentially dedicated to executing the graph. Using the service offloads that compute to the cloud.
It is a compute service, very similar to the Platform Services Design Automation. It uses REST to automate the evaluation of the Bifrost graphs and allows you to integrate this functionality directly into your workflows. So it could be something that you batch process in a pipeline. Maybe you run your graphs at night when no one's around and you have the results the next day.
There's plenty of things that would make this really an efficient way of executing Bifrost graphs. And the basic steps are you gather your inputs from the Maya Bifrost instance. You'll submit those inputs to the Flow Graph Engine Service to create a job.
And then your program will use a polling technique to check the status. So it's going to get scheduled. It'll get queued, and then you'll get a succeeded response or maybe a failed response. And when the job is finished, you'll download and interpret the results.
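The submit-then-poll flow just described can be sketched in Python. Here `get_job_status` is faked with a canned sequence of statuses so the control flow runs standalone; in a real client it would be a GET request against the Flow Graph Engine jobs endpoint, and the exact status strings may differ from the ones assumed here.

```python
import time
from itertools import chain, repeat

# Simulated status sequence: scheduled, queued, running, then succeeded.
# A real client would fetch this from the jobs endpoint instead.
_fake_statuses = chain(["SCHEDULED", "QUEUED", "RUNNING"], repeat("SUCCEEDED"))


def get_job_status(job_id: str) -> str:
    """Stand-in for GET /jobs/{job_id}; returns the next canned status."""
    return next(_fake_statuses)


def wait_for_job(job_id: str, interval: float = 0.01) -> str:
    """Poll until the job reaches a terminal state, then return it."""
    while True:
        status = get_job_status(job_id)
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(interval)  # back off between polls


result = wait_for_job("job-123")  # hypothetical job id
```

In a real pipeline you'd use a longer polling interval (or the Webhooks API, where available) and download the outputs once `result` is a success.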
So let's take a look at a practical example. So we're going to take the input geometry, in this case is a USD, and this is the terrain, which is basically a plane from Maya. And it's in the USD format. And we're going to also take the Bifrost graph.
So you'll publish the graph out of Maya. It'll be in a JSON format. And those become the inputs. The Flow Graph Engine Service will run the compute and then the output will be the effect. And if you notice here, the output is different from the input. And so that's also a data efficiency.
You don't necessarily need to maintain this if all you want is your results. And this is just an example of how it's viewed in the scene. So let's take a look at a sample program that does this outside of Maya.
For this Flow Graph Engine API demo, I want to start in the documentation and just show that we are currently in beta. Full general availability will come soon. And this documentation is quite complete. It talks about how you can submit a job, and showing you code examples, and so forth.
So the field guide is an important one to read. It contains all the different steps that you would need to understand overall. And then I would also suggest running through the how-to guide, which is basically a tutorial. So this is the tutorial that basically is connected to our samples as well. And you'll find our samples here. We currently have a JavaScript sample and a Python sample, and we're working on a new one, which I'm going to show in a minute, that has a user interface, which makes it a little bit easier to understand and see.
The other thing, as mentioned before, is that it does run Bifrost graphs. So it's helpful to know about Bifrost and the fact that Bifrost is an integral component of Maya. So currently, using Maya, you can use Bifrost within Maya, create these graphs, and have these graphs automated in the cloud by running them through an automation pipeline using the API.
And finally, I just want to point out that I am going to be running a new sample in this demo. It's currently in an individual's account, but we will be soon moving this to the Autodesk Platform Services account. And you should be able to access it directly there.
So this is a Node.js sample. And I've got my Node.js command prompt and a copy of the source code from this GitHub repo. So to start, you would do npm install. And this has already been done, but just to show that that's the technique.
And in a default state, it's going to download all those npm packages, of course. And then I'm going to do npm start and that's going to basically execute this sample. So I can go to localhost now. And again, this is a sample, and you can see it running here.
And one of the things that you'll need to do is post your own client ID and client secret. These are obtained through the Autodesk Platform Services app creation, and you'll need to make sure that you have the Flow Graph Engine provisioned.
So I am going to paste my client ID and my client secret. And normally, these would be part of your app and not visible to the customer. But this is a sample and we want to show the required pieces to get the sample to run and how it connects back to your account in the APS services.
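Behind that login, the app exchanges the client ID and secret for a two-legged OAuth token. This sketch only builds the request (it doesn't send it), following the general shape of the APS authentication v2 token endpoint; the scope value is an assumption, so check which scopes the Flow Graph Engine service actually requires.

```python
import base64
from urllib.parse import urlencode

# APS authentication v2 token endpoint (two-legged client_credentials flow).
TOKEN_URL = "https://developer.api.autodesk.com/authentication/v2/token"


def token_request(client_id: str, client_secret: str, scope: str = "data:read"):
    """Build headers and form body for a client_credentials token request.
    The scope here is an assumed example, not a verified Flow Graph Engine scope."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urlencode({"grant_type": "client_credentials", "scope": scope})
    return headers, body


headers, body = token_request("my-client-id", "my-client-secret")
# A real app would POST body to TOKEN_URL with these headers and read
# the access_token from the JSON response.
```

As the demo notes, in production the secret lives server-side in your app and is never shown to the end user.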
So with that client ID and secret, I can log in and create a new job. This is a job-based service. So I'm going to give the job a name; we'll just call it fred01. And I need to upload the Bifrost graph.
So this is a JSON. And what I'm going to do is I'm going to browse to my local file system. And I basically created a data folder here, just so I could find that input and other data quickly. So I'm going to open that. And then this particular sample also requires a USD file, which contains the geometry.
And then I'll go ahead and create this job. What it's showing you here are the different steps of the operation. So there was the upload step. As soon as the job is queued, it gets a unique ID, and then it's scheduled to run in the service.
And this is how you can stack things up and batch process them, basically, to execute any number of graphs. You can run them in parallel and the sample will kind of allow you to test that and see how that works. So we're going to let this run. It should only take a few seconds, because it's a pretty simple graph.
You'll see that the job succeeded. And what I need to do now is download the output. So I'm just going to download it. And I've run this before, so I'm going to overwrite the one that I have there before. Yes. OK.
And I do have 3ds Max running here in the background. I previously imported the input plane, the one that goes with the Bifrost graph, to execute against. And now I can import the results of the graph, that output1.usd file.
And again, this was a scatter graph. So what we're going to see is all the trees that were computed to be placed on that plane. So pretty cool, easy to execute, and you can batch process all you want with this API.
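Batch processing several graphs in parallel, as mentioned, can be sketched with a thread pool; `run_graph` here is a stand-in for the real submit/poll/download cycle, not actual API calls.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical graph file names; each would be a published Bifrost JSON.
graphs = ["scatter_trees.json", "smoke_sim.json", "cloth_sim.json"]


def run_graph(graph_name: str) -> str:
    """Stand-in for: upload inputs, create job, poll status, download output."""
    return f"{graph_name}: SUCCEEDED"


# Jobs run independently in the service, so the client can submit and
# wait on several at once; map() preserves input order in the results.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_graph, graphs))
```

This mirrors the overnight pipeline idea: queue every graph, let the cloud do the compute, and collect the outputs when they're done.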
So we have quite a few resources available for the Flow Graph Engine API. And remember, it is in beta at the moment, but it will be coming to full production general availability soon. All of these links, I'll provide in the handout. So you'll be able to get those as a download.
And just keep in mind that while we're in beta, the service is free. So it's available to you to test and work with to see how it might work best in your workflows. While it's in beta, it does have a few limits set up. So the CPU and memory configurations are fixed. But once we go to a paid model, then you'll be able to configure those as you want to best service your Bifrost graph.
So if you have a very complex graph, you'll probably want more CPU and memory for the job execution. And it will be consumption- and token-based; the typical APS token model is how you'll pay for it through the platform. And we're still looking for feedback. So if you're interested in trying this out, the feedback link will take you into the beta project, where you can provide input.
So that leads us to the next topic, which is the Flow M&E Data Model. The data model itself is the underlying software architecture piece, and it's going to enable asset management services for many use cases. The primary goal is to provide a means for Flow customers to collaborate on assets and data in a centralized fashion without worrying about data stability and duplication.
So the way this is going to look, again, if we go back to the industry clouds, our core portfolio of products are going to be authoring that data in the most part. Now it doesn't mean as a developer, you can't also write data, and especially metadata is going to be important for the M&E workflows. But let's take a look at how this is unfolding from the Autodesk corporate perspective.
Fusion was the first data model to go into general availability, and it's basically using the Fusion software to write the data model during the authoring of a scene. So it's already got that vision of eliminating the concept of files, which we'll talk about in a minute. Now Forma is the AEC data model, and it's primarily working around the Revit software. The schema there is very much tied to Revit and its data layout.
And that's important to note, because the AEC industry is very much tied to BIM data and how Revit handles data, as well as formats such as IFC. And so they're following standards that have already been pretty well established for a while. And then when we look at Flow, this is where there's going to be a lot of capability on the customer side to create the schemas and the data structures that are most important to you as the customer.
And of course, once the data is there, it can be interacted with. So customers, partners, third parties, pretty much anyone in the pipeline that's interested in data and can do something with that data is going to be a consumer and potentially even an author of that data.
So fundamentally, these ideas of putting these data models in the cloud is to move away from files and move more towards data. So today, we have files and folders on disk, and that's a big problem. Where are the files? Which server are they on? Which cloud service are we using today?
Whereas in the future, we want you to be more focused on managing the assets in a project. That's all you really care about. It shouldn't matter where the files are on disk, or which version of the files, and so forth. Today you have to deal with naming conventions and file paths; instead, we want you to deal with data, with structured relationships and versions.
Scripts and tribal knowledge are very common in the M&E space, and we want to give you more software-defined workflows. So rather than having scripts do various things, we want well-defined software, whether you write it or Autodesk provides it, that completes your workflow strategy. Data silos and special files are also something we see often with customer-customized workflows.
And we want that to really become a centralized information model. And so this is the grand vision for all of the Autodesk data models and we're taking baby steps to get there. Files are not going to go away overnight. But as you start thinking about data, we want you to think about it as data and not as files.
So how would this look in the Flow Data Model? So with Flow, it's all about assets, and versioning of those assets, and metadata against those assets. So for example, and this is not fixed and could be customized, you could have an ID of some sort, the name of the asset, and, of course, the immutable versioning.
So any time the asset changes for any reason, a version is created. And you can always roll back to a version. You can look at a version that might be used in prior scenes and whether you want to use that version or maybe an updated version. And of course, the history. Where have these assets been used, and so forth.
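The immutable versioning described above can be sketched in a few lines of Python. This is purely illustrative — the class and field names are made up, not the real data model — but it shows the key idea: versions are never modified in place, and even a rollback just appends a new version, so the full history is preserved.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssetVersion:
    """One immutable snapshot of an asset; never modified after creation."""
    number: int
    payload: dict

class Asset:
    """Minimal sketch: every change appends a new version; history is kept."""
    def __init__(self, asset_id, name):
        self.asset_id = asset_id
        self.name = name
        self.versions = []

    def commit(self, payload):
        """Any change to the asset creates a new numbered version."""
        version = AssetVersion(number=len(self.versions) + 1, payload=dict(payload))
        self.versions.append(version)
        return version

    def head(self):
        """The latest version of the asset."""
        return self.versions[-1]

    def rollback_to(self, number):
        """'Rolling back' re-commits an older payload as a new version,
        so history is never rewritten."""
        return self.commit(self.versions[number - 1].payload)
```

For example, committing two versions of a model and rolling back to version 1 leaves you with three versions in the history, with the head matching version 1's data.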
And an asset will be made up of components. And this is where we drill down into a more granular type of data. So it could be binary data, pipeline data, studio data, things like that. Maybe the studio location that produced it or the division that produced it, things that are important to you as a customer.
And within that, there will also be things like types of data. So in this example, binary data. A 3D model is a perfect example of that. Movie files, image frames, textures. And, of course, app-specific binaries. We're looking at ways to even connect this outside of the Autodesk ecosystem, of course. So Houdini is being considered to connect to this as well.
And then on top of an asset and its components, you can have relationships. So now you can have a new asset actually derived from a prior asset, but be a new asset on its own. So you can manage the original asset and the base asset by itself. And the derivation automatically picks up those things through the relationship.
Dependencies. The asset requires this set of textures and it's dependent on that. So there's all kinds of ways that this asset concept can apply to your specific pipeline.
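The derivation and dependency relationships described above can be modeled as a small graph. Again, this is a hypothetical sketch, not the real Flow API: "derived from" edges let a new asset point back at its base asset, and "depends on" edges let you transitively collect everything an asset needs, for example when publishing a shot.

```python
class AssetGraph:
    """Sketch of asset relationships: 'derived_from' and 'depends_on' edges."""
    def __init__(self):
        self.derived_from = {}   # asset id -> the asset it was derived from
        self.depends_on = {}     # asset id -> set of required asset ids

    def derive(self, new_asset, source):
        """Record that new_asset is a new asset derived from source."""
        self.derived_from[new_asset] = source

    def require(self, asset, dependency):
        """Record that asset needs dependency (e.g. a set of textures)."""
        self.depends_on.setdefault(asset, set()).add(dependency)

    def all_dependencies(self, asset):
        """Transitively collect everything an asset needs."""
        seen, stack = set(), [asset]
        while stack:
            current = stack.pop()
            for dep in self.depends_on.get(current, ()):
                if dep not in seen:
                    seen.add(dep)
                    stack.append(dep)
        return seen
```

So if a shot requires a tree model, and the tree model requires a bark texture, resolving the shot's dependencies picks up both, even though the shot never references the texture directly.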
And with this format, it's using components to really give the assets personality. So when we look at different types of assets, it could be a 3D model, it could be a take, it could be a shot. And the typing is going to be inherited through the schema.
So the schema itself can be inherited. So that's the point here. And customer-defined components are really important as well.
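One way to picture schema inheritance is with plain Python classes — a base component schema defines common fields, and customer-defined schemas extend it. The schema names and fields here are invented for illustration; the real M&E data model will define its own mechanism.

```python
class ComponentSchema:
    """Base schema: fields every component carries. Subclasses inherit and extend."""
    fields = {"id": str, "name": str}

    @classmethod
    def all_fields(cls):
        """Merge fields from base schemas down to this one."""
        merged = {}
        for base in reversed(cls.__mro__):  # walk from base to derived
            merged.update(getattr(base, "fields", {}))
        return merged

class BinaryDataSchema(ComponentSchema):
    """Out-of-box schema for binary data like models, frames, or textures."""
    fields = {"mime_type": str, "size_bytes": int}

class StudioDataSchema(BinaryDataSchema):
    """Customer-defined extension adding pipeline-specific fields."""
    fields = {"studio_location": str}
```

A studio-defined component thereby inherits the base `id`/`name` fields and the binary-data fields, while adding only the fields that matter to that studio.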
So let's take a look at a couple of workflow examples. So Flow Production Tracking, of course, what you need to manage and keep track of is tasks. And so this is what it might look like within the Flow Production Tracking interface.
Maya and other DCCs, I mentioned Houdini. Typically, there, you're going to be dealing with character assets and this might be what the interface would look like for that tool. And then finally, Flow Capture, which is managing sequences, and takes, and dailies, basically. And so this might be what it looks like there.
So very similar user interface, but very specific to the tasks at hand and the workflows at hand. So how is this going to work? Again, the DCCs will be authoring the data. And unlike other Autodesk data models, the M&E data model will provide a customizable schema.
We will have a default out-of-box schema based on those industry standards that we talked about before. So we're going to try and make the initial schema as close to those standards as possible so that out of the box, you don't have to do any extra work, unless you need some custom behavior. Connections currently being considered are for multiple DCC products.
We saw Maya, Flow Capture, Flow Production Tracking. I also mentioned Houdini is in the works as well. And from a developer perspective, it is going to be GraphQL, which allows industry-standard access for developers and is supported by all the Autodesk industry data models. And then specific to M&E, there will likely be a Python SDK that will allow you to integrate this into your existing tools and pipelines as well.
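Since GraphQL is the access layer, a pipeline query might look something like the sketch below. The endpoint URL and the query schema (field names like `asset`, `versions`, `createdAt`) are hypothetical here — the real M&E Data Model API will publish its own schema — but the overall pattern of posting a query with variables and a bearer token is standard GraphQL over HTTP.

```python
import json
import urllib.request

# Hypothetical endpoint -- the real M&E Data Model API will document its own URL.
MNE_GRAPHQL_URL = "https://developer.api.autodesk.com/mne/graphql"

# Illustrative query: fetch an asset's name and version history.
ASSET_QUERY = """
query AssetVersions($assetId: ID!) {
  asset(id: $assetId) {
    name
    versions { number createdAt }
  }
}
"""

def run_query(token, query, variables):
    """POST a GraphQL query with variables; returns the 'data' section."""
    body = json.dumps({"query": query, "variables": variables}).encode()
    req = urllib.request.Request(
        MNE_GRAPHQL_URL,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]

def latest_version(asset_data):
    """Pure helper: pick the highest version number from a query result."""
    return max(asset_data["versions"], key=lambda v: v["number"])
```

A pipeline tool could use a helper like `latest_version` to decide whether a shot is referencing the newest version of an asset or an older one.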
So just remember that the M&E data model is a vision at this point, but other Autodesk data models are already in a production state. Some of them are still in beta, but they are running in our production services. So you can access all three of these other data models, the AEC data model, the manufacturing data model, and Data Exchange, today to get a sense of how these data models are going to work.
So in summary, just want to follow up with that Autodesk Platform Services provides industry standard APIs and SDKs supporting a variety of workflows. So don't think that you only need Flow Services, for example. Other services in the stack may be perfectly viable for whatever workflow you need. The APS Viewer is a great example of that.
And APS does already serve M&E customers today. So this is not an entirely new concept, but it was somewhat limited to 3ds Max and certain file types. So this is bringing a greater presence of M&E into the stack.
Flow Graph Engine API is available today. And the M&E data model and other services are coming soon. And we'll provide these resources in the handout.
So look for that. And you'll be able to access everything that I showed here after the event. Thank you for your time and hope this was valuable.