Description
Key Learnings
- Learn about the 3ds Max and Omniverse workflow for visualization
- Learn about using the PBR workflow for material creation
- Learn about the Omniverse Create and Omniverse View interfaces and features
- Discover best practices for real-time visualization utilizing RTX ray tracing and path tracing for final output
Speaker
- Ernesto Pacheco: Ernesto Pacheco leverages expert knowledge of visualization applications in supporting project teams and pursuits. As the Director of Visualization at CannonDesign, Ernesto is the go-to person for all project-related aspects of visualization. He is primarily responsible for research and implementation of new technologies into the visual communications process. Ernesto started his career studying Architecture at the Universidad de las Américas Puebla in Mexico before moving to the United States. He continued his studies in Interactive Design at Maryville University in St. Louis, MO. Ernesto has 20 years of experience in the architectural field and has worked on several high-profile projects since joining CannonDesign.
ERNESTO PACHECO: Thank you for the invitation. I'm happy to be here to present CannonDesign's use of USD in Omniverse and 3ds Max for visualization. My name is Ernesto Pacheco. I am the Director of Visualization at CannonDesign.
One of my primary functions at Cannon is to do research and development into digital technologies for our design practice. I also deal with support and training on anything visualization, from renderings and animation to XR deployment. And we also have a small team, the visualization network of CannonDesign, that deals with support as well and helps other teams throughout the firm.
CannonDesign is a global solutions design firm. We have designers who focus on solving problems that directly affect humanity through architecture and design. We have offices in North America and Asia, and I am located in the St. Louis office.
We also work with clients in several sectors: sports, education, health, science, and technology. We tend to have that flexibility because we have a lot of talent within our firm.
So let's have this conversation about standardizing material creation for architectural design. One of the things we have been looking at as a firm is making sure that we are all using the same resources and that the resources we are implementing into our pipeline are adequate.
Not only that, but also that they can continue to evolve. We want to make sure that we stay up to date with any technology that is being developed. So the question of interoperability starts with this: how do you standardize material creation?
So we had some requirements when we were looking at this five or six years ago. We wanted something flexible, meaning that we could output to any file format.
We needed access to high-resolution outputs. Having a non-destructive environment where you can iterate while reusing the same data was important, as was having a way to create presets or save out a library.
And then industry adoption, of course, is one of the things we look into when we decide to actually deploy any technology. So of course, we followed the PBR workflow. We knew that was the way to go, and we decided to go with Substance in this case.
As you can see, we use Substance Alchemist, Substance Designer, Source, and Painter. And we have this whole rainbow of applications that can take advantage of the data we generate with these files.
So again, that was the gateway into looking more in depth into software interoperability for AEC. One of the big day-to-day headaches we have as designers is seeing this process take a long time: exporting, importing, and sharing files. We have all been there with deadlines and overnight work hours.
So when we had these conversations with NVIDIA and Adobe about what was coming down the pike, we were excited; we saw it as the silver bullet that would finally kill the monster we were tired of dealing with. Omniverse is built around a USD workflow. For those who don't know, USD is the Universal Scene Description file format, which Pixar open-sourced in 2016.
Pixar developed it as their own in-house solution to the same problems we are facing in architecture. Making sure that all the AEC design apps we use are supported is important, so we were very excited to learn that they were working on connectors, or plug-ins, for them. Having access to Substance support and Unreal Engine support, having a way to live edit, and having this collaborative space, so to speak, where designers come together--that was very, very exciting, to be honest.
Getting access to top-notch rendering engines like the ray tracing and path tracing implementations NVIDIA has done for Omniverse, and getting access to a robust content library--one that, most importantly, is open source and Python friendly, because we want to make sure we can build our own apps if we have that need down the road.
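To give a flavor of that Python friendliness, here is a minimal sketch using Pixar's open-source pxr API, the same USD library Omniverse builds on. The file name and prim paths are placeholders, not anything from our pipeline:

```python
from pxr import Usd, UsdGeom

# Create a new stage -- the top-level USD scene file.
stage = Usd.Stage.CreateNew("patient_room.usda")

# Define a transform and a simple cube underneath it.
root = UsdGeom.Xform.Define(stage, "/World")
cube = UsdGeom.Cube.Define(stage, "/World/Cube")
cube.GetSizeAttr().Set(100.0)

# Set the default prim so other apps know where to start reading.
stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()
```

Any USD-aware application, Omniverse included, can open the resulting file as-is.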
So here comes Omniverse, and I'm going to go over a quick overview of what it is. This is a quick video showing what the Omniverse Launcher looks like. You have the News tab, where you can find the latest and greatest news about what's happening in Omniverse. Then you have your Library, where you have all the apps and connectors that you install.
Exchange is where you will find new apps or updates, for instance. And the most important part is Nucleus. Nucleus is what allows all these applications to talk to each other: you have a dedicated server that everybody connects to and exchanges files through.
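As a rough illustration of what that looks like in practice: Omniverse registers a USD asset resolver for omniverse:// URLs, so a stage sitting on a Nucleus server opens like any local file. The server name and project path below are placeholders, and this assumes the Omniverse client libraries are installed so the scheme resolves:

```python
from pxr import Usd

# Open a stage directly off the Nucleus server, same as a local path.
stage = Usd.Stage.Open("omniverse://nucleus.example.com/Projects/patient_room.usd")
print(stage.GetRootLayer().identifier)
```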
And then you have the Learn tab, where you can find presentations and tutorials from artists and people who are working with Omniverse on a daily basis. So in a nutshell, that's the Launcher.
Now we're looking at Omniverse itself. As you can understand, we're using Create and View for the most part; we have been testing and exploring these two solutions. But you now have access to more apps, like Machinima, and Blender, for instance, has been added to the launcher.
And then you have the connectors. These are the connectors we are using: 3ds Max, Revit, Unreal Engine, Rhino, and SketchUp.
So on to Omniverse Create. Here is a quick video showing one of the sample scenes that come with the software. It obviously supports animation in real time. In this case, I was using the real-time render engine, which is a ray tracer, and then I switched to the path tracer pretty quickly.
But what is very important is that NVIDIA's approach to implementing this technology was very photographic. So if you have any photography background, you will feel right at home; all the tools they use are very easy to understand. And then again, you can have some fun with directing your RPs in this case.
And then again, just to tell you a little bit more about what Omniverse Create is: Create gets you access to content via the UI. If you have used any standalone real-time application, you will feel right at home.
Omniverse is very easy to understand and pick up. It also gives you access to the Console, which allows you to read what's happening behind the curtain, and the RTX viewport, which is where you will spend most of your time building out your scenes.
You have the Stage panel, where you have all the data you have loaded, and then Layers, where you can create deltas, or overrides, on those files. And then you have a couple of settings and a way to capture renderings and animations.
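Those layer deltas are plain USD layering underneath. Here is a hedged sketch of how one user's overrides land in their own sublayer while the layers holding the original data stay untouched; it assumes the /World/Cube prim from the earlier example:

```python
from pxr import Usd, Sdf, Gf, UsdGeom

stage = Usd.Stage.Open("omniverse://nucleus.example.com/Projects/patient_room.usd")

# Add a new sublayer to hold this user's edits.
override_layer = Sdf.Layer.CreateNew("my_overrides.usda")
stage.GetRootLayer().subLayerPaths.insert(0, override_layer.identifier)

# Direct all edits into the override layer; the original geometry is
# never rewritten -- the edits are stored as deltas over it.
stage.SetEditTarget(Usd.EditTarget(override_layer))
cube = UsdGeom.Cube(stage.GetPrimAtPath("/World/Cube"))
cube.AddTranslateOp().Set(Gf.Vec3d(0, 50, 0))
override_layer.Save()
```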
So now on to Omniverse View. Here is an example of what it looks like to use it with Rhino. It is very simple: it's a one-click button and you are in Omniverse View. It works similarly to any rendering plugin you may be using with your application. It has an amazing library of materials and assets, and obviously, NVIDIA is continuing to work on this, so you will see more content being added.
It allows you to do quick sun studies, which is very convenient. If you just want to do that, you can actually generate a quick animation. So if you have pieces like this one, for instance--something more nondescript, more conceptual--you can quickly generate some deliverables for your design meetings.
Again, all the materials that come with Omniverse View are MDLs, and they also support PBR. And here is the sun study feature that I mentioned before. So again, Omniverse View gives you access to assets, that sun study tool, and the material browser. You also have a sky browser where you can bring in either dynamic skies, like the one I show in that video, or HDRI images as well. So if you have a library, you can reuse that.
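Both of those map onto standard USD light types, so they travel with the file. Here is a minimal sketch, with made-up intensity values and placeholder paths: a sun study is just a time-sampled rotation on a distant light, and an HDRI sky is a dome light with a texture assigned:

```python
from pxr import Usd, UsdGeom, UsdLux

stage = Usd.Stage.Open("site_model.usda")
stage.SetStartTimeCode(0)
stage.SetEndTimeCode(96)  # e.g. 4 frames per hour across a day

# A distant light standing in for the sun, rotated over time.
sun = UsdLux.DistantLight.Define(stage, "/World/Sun")
sun.CreateIntensityAttr(3000.0)
rotate = UsdGeom.Xformable(sun.GetPrim()).AddRotateXOp()
for frame in range(97):
    # Second argument is the time code for the keyframe.
    rotate.Set(-180.0 + frame * (360.0 / 96.0), frame)

# A dome light textured with an HDRI, as an alternative to a dynamic sky.
sky = UsdLux.DomeLight.Define(stage, "/World/Sky")
sky.CreateTextureFileAttr("skies/partly_cloudy.hdr")
stage.GetRootLayer().Save()
```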
It has access to that RTX viewport, which gives you access to the ray tracing and path tracing render engines; the Stage, where you bring in all your 3D data; and then the RTX settings and the Movie Maker.
But what's important is that you can save this as a USD file. So if you start in View to quickly generate a visualization, you can save that USD and then open it in Create if you want to continue to develop it.
So now on to the Omniverse connectors. I'm just going to quickly show you what they look like. This is the Revit connector. Some of the features that come with that connector, or plug-in, include a way to connect to the Nucleus.
You have USD publishing options, which give you access to material export options as well. If you want to bring in BIM information, you can do that. And then you have the live sync, or live edit, implementation and the one-button Send to View, with some settings for that.
In a similar way, 3ds Max has the same features. But one thing that is different here is checkpoints, which are basically descriptions. I can see that being used as you continue to develop a project: you have options or phases that you are going through, and you can quickly type in a description--metadata--that the next user can read to understand what you were doing and how you see the files being used down the pike.
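For scripting-minded users, Nucleus also exposes checkpoints through the omni.client module that ships with Omniverse apps. The sketch below reflects my reading of that API; treat the exact signature as an assumption, and the URL and comment are placeholders:

```python
import omni.client

# Create a named checkpoint on a file stored in Nucleus; the comment is
# the description the next user will see. Signature is an assumption.
result, _ = omni.client.create_checkpoint(
    "omniverse://nucleus.example.com/Projects/patient_room.usd",
    "Option B: relocated headwall, swapped flooring material")
print(result)
```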
And then the SketchUp connector has a few less features, and I think that's intended, just because SketchUp is a little bit more conceptual. But you still have that live sync; they implemented it just recently, which I think is pretty cool.
Same thing with the Rhino connector. It's very streamlined. You have some USD publishing options, and of course, you have that one-click Send to View button.
Now on to the Unreal Engine connector. This one is a little more complex, just because you already have a stage in Unreal Engine. So you can actually create a lot of things in Unreal, set everything up, and then export it to Create for the final render or any other kind of exploration of the file.
But what's more important is that you have live edit, which is a little different from live sync: it means you have a two-way connection. So if you have a project that you're working on in Maya or 3ds Max, you can open that and live sync it with Unreal Engine, and then you can do the same from Unreal Engine to Omniverse, so you have a lot of flexibility. You can see this being used to put together more complex scenes where you have a large team--five or six people working on a visualization project. To me, that type of interaction between applications is very welcome.
So, yes, the Dream Team, as I call it, is basically Omniverse and Substance, just because we implemented that PBR workflow with Substance to standardize materials, and now we're trying to do the same with any 3D data that we generate at our firm. It just makes sense for them to come together, at least for what we do. So I was very excited, late last year and early this year, to be invited by Adobe to beta test and present on the Substance link for Omniverse.
This link allows you to bring PBR materials into Omniverse Create from Substance Source, which, if you are not aware of it, is a vast library of PBR materials. They are very impressive, and all of these materials have presets that you can tweak as needed.
So you know, I was very excited to see that being implemented. Right now we don't have access to this link publicly, but it's coming pretty soon. You will see it implemented, as it's very important to have access to the Substance link.
And the same thing for the Substance Designer connector. I beta tested this one earlier this year for the presentation, and I was blown away by how quick it was and by all the options you had access to within Create and Substance Designer.
So with that, we're going to move into the case studies. One of the first was this Ascension animation. It was a very exciting opportunity we had to use Unreal Engine and the Cesium plugin.
Cesium is basically a GIS plugin--an application that allows you to use GIS data and create this infinite world within Unreal Engine. We wanted to quickly animate something, but the issue I ran into back then was rendering in Unreal: the Cesium plugin didn't allow you to tweak things in much detail, so I decided to connect this to Omniverse Create and do the rendering there.
But I wanted to test something here: I wanted to make sure that I didn't have to do extra work, right? The animation was already done. I just wanted to render it at a higher quality using Create without doing much work. That means not having to export to OBJ, FBX, you name it, and then worry about scale, orientation, or whether the materials will come through. I wanted to make sure that Omniverse Create could read that USD just as Unreal Engine wrote it and just render it.
And I was very, very happy to see that it worked without any issues, any hiccups. Here is the animation that I ran through Create using the path tracer. For the next one, you bring these into Premiere or whatever video editing software you use and quickly toggle between them with a crossfade. In a minute, I'm going to show the difference between Unreal and Omniverse by hiding one of them.
You can see the shadows are a lot softer in Create. You also have more detail on all those little buildings, as opposed to Unreal, where it's almost lost.
So I was very happy with this exercise, and I can see us doing more of this type of work. We obviously do a lot of animation work and background design, and having another way to connect applications helps as we try to standardize on one file format.
It's as easy as that. You just pick it up and go to the next app without having to worry about anything else.
So the next case study: this is the health care patient room. For this one I used Revit and 3ds Max, and I wanted to make sure that I could iterate between the two applications.
I used Revit to bring in all the walls and all the architectural information, then 3ds Max to tweak some of the equipment as well as polish some of the materials. And then, of course, we used Substance--Substance Source, in this case.
I'm using that connector I mentioned before. Again, this is not public yet, but you can see some of the things they are working on. You download a material, bring it into Create, drop it on an object, and then tweak it within Create--it's super fast. All of these materials are physically based, so you don't have to worry about anything else. You just drag and drop, maybe do some UV mapping or UV scaling to make sure it makes sense for the space.
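What the drag-and-drop is doing behind the scenes is authoring a UsdShade material that points at an MDL shader. Here is a hedged sketch; the OmniPBR parameter names (diffuse_texture, texture_scale) are assumptions from memory, and the texture and prim paths are placeholders:

```python
from pxr import Usd, Sdf, UsdShade

stage = Usd.Stage.Open("patient_room.usda")

# A material whose surface is driven by NVIDIA's OmniPBR MDL shader.
material = UsdShade.Material.Define(stage, "/World/Looks/FloorMat")
shader = UsdShade.Shader.Define(stage, "/World/Looks/FloorMat/Shader")
shader.SetSourceAsset("OmniPBR.mdl", "mdl")
shader.SetSourceAssetSubIdentifier("OmniPBR", "mdl")

# Assumed input names; a real material would set many more.
shader.CreateInput("diffuse_texture", Sdf.ValueTypeNames.Asset).Set("textures/terrazzo_albedo.png")
shader.CreateInput("texture_scale", Sdf.ValueTypeNames.Float2).Set((4.0, 4.0))

material.CreateSurfaceOutput("mdl").ConnectToSource(shader.ConnectableAPI(), "out")

# Bind the material to a mesh (placeholder path) -- the "drop" step.
UsdShade.MaterialBindingAPI(stage.GetPrimAtPath("/World/Floor")).Bind(material)
stage.GetRootLayer().Save()
```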
But you can see how quickly you can run a visualization for a design meeting or client meeting. Here I'm looking at bringing in more data in terms of shaders, as well as playing with some of the lights that Omniverse Create has to offer. I also 3D modeled a couple of things in Create: I ended up having some holes in the Revit model, but I quickly generated a couple of boxes in Create and positioned them correctly.
And again, I'm just testing different presets from the Substance Source material, because once you download that file, you actually get a couple of different presets included in it, and you can tweak those if you need to.
So again, it's almost painless to drag and drop PBR materials and quickly generate some visualization for your meetings. In the next one, I'm looking at 3ds Max, bringing some more detailed medical equipment into Omniverse Create, connecting to that USD stage I already created with Revit, and then just having this conversation, right?
We're looking at having something more collaborative, where you have a couple of designers approaching the same project: somebody working in 3ds Max, cleaning up data that needs a little more detail, or a designer working on the architecture side of things, still solving those design questions for the client.
So the idea here is that we continue to work on different pieces and then come together to finalize something for our client. I was very excited to see this happening in real time. You can delete and move things and change materials and colors in 3ds Max, and it will be reflected in Omniverse Create. It is super exciting to see that happen.
Obviously, it's all dependent on your hardware. You will see Create behaving more smoothly if you have access to the right hardware and if you are all on the same server, because you have to tap into that Nucleus to have this conversation between applications.
That is something we are actually exploring right now with Box. We're investigating whether we can virtualize this instead of hosting it on a dedicated local server--potentially using it as a service with Box in a virtualized environment and tapping into those machines when we need them.
So here is the final product. Again, just tweaking some of the lights, looking at the materials, making sure that I am satisfied. As you bring in your own data from 3ds Max, it becomes these independent pieces that you can then tweak as needed. So you can be your own creative director for a particular piece or animation.
So that's the beauty of it, right? You have all these designers and users coming together into one final piece, and then you have the opportunity to take a more creative approach to the problem, just because you don't have to spend the time you used to on exporting and importing, communicating design options, and all of that. We're hoping that Omniverse will solve some of those questions for us and make things easier.
So here are some of the renderings. Now we're moving to the next case study: this is the Restorative Care Village project.
For this one, the problem we wanted to solve was mimicking the look of a physical scale model. Right before the pandemic, we used to create these physical models almost every month--every other week, to be honest--and then document them with photographs or videos and share that with clients.
So I wanted to see how much I could push Omniverse to get to that point--hyperrealistic--and I also wanted to test more connections, 3ds Max in this case. We used Infraworks to generate the site and then brought that into 3ds Max for cleanup and some modifications, just to make sure the buildings would fit where they belong.
So again, here I'm using the Substance link, connecting to Source, downloading those PBR materials, applying them to the site and the buildings, playing with some lights, and getting the overall look together. We're trying to get to the point where I'm happy with the light, and then we can tweak the materials a little. It's kind of a push-and-pull type of play that happens here.
And it's all fun, to be honest. I never really struggled with any of this. It gave me time to think about what I wanted to accomplish with this visualization piece, and I was fascinated to see it come to life right in front of my eyes--especially once the path tracer was engaged and you had that beautiful lighting, and tweaking the camera settings to the point that they mimic a physical camera.
I hadn't seen that in a long time. I used Maxwell Render in the past, and that was the closest I had ever gotten to a photographic approach to rendering. So I was happy to see that happening with Omniverse and the path tracer.
Here you can see me tweaking that site in real time in 3ds Max to make sure the buildings fit correctly within the space, and fixing some of the holes I ended up with because I had decimated the site a little to make sure it wasn't too heavy and I could actually move around. As you can see, it's super easy and responsive. Again, you have to have access to a nice setup--a nice computer--and make sure that either you are running this locally or you have your own server, with everybody tapping into the same location so that communication is fast.
So again, just going around in Create, clicking, looking at my cameras, making sure I was satisfied with the compositions, and then finally switching to the path tracer. Here is a really fun piece: we usually use metal cable trees for the physical scale models, so I wanted to recreate that in 3ds Max. I ended up connecting that to Omniverse, and it was super satisfying to see it happen in real time.
And here are some of the renderings. Again, you can see the quality of the path tracer is impressive. I am so happy with this. And this is the video of me adjusting the light and whatnot in real time--it was super responsive.
You can see the quality of those materials; it's just impressive. Combining the power of those Substance materials with the path tracer in Omniverse--I felt like it was Christmas, to be honest. I was so happy to see it. You could almost reach through the viewport and touch things; it gets almost to the point where you feel like you're actually touching a piece of paper, you know?
So, moving to the next case study. This is the Gilardi House by architect Luis Barragán, one of my favorite Mexican architects. I always have this 3D scene ready to test any new technology or render engine that I'm trying to bring into our design practice, so I decided to give it a shot with Omniverse.
The goal of this one was to connect a few more applications, so I was looking at SketchUp, 3ds Max, Substance Designer, and Substance Source. It was a little more complicated--a lot more moving pieces to take care of. Obviously, I was working with NVIDIA at the same time. They are always happy to help you, look at things that aren't working correctly, and give you some tips and tricks. So it was a very enjoyable exploration that I had with this case study.
Same thing with Adobe. Obviously, working with them to develop that link to Substance Source and Substance Designer was also very rewarding.
So again, I'm just tweaking here--lights, trying different settings for furniture--and then the next iteration was to get that final look. Once you start applying physically based materials, you get more detail in those shadows; lighting behaves a little better when you use Substance PBR materials.
As you can see here, there is a lot of detail. I then used Alchemist to quickly generate a new material for the floor. I just wanted to see how easy it would be to export that as a Substance file and then use either Substance Designer to bring it in or the link that Adobe provided for Create. And it was painless. This is actually how it worked:
We had somebody working in Alchemist, creating these specialty PBR materials for designers, and then handing them off as PBR materials to use in their own software. So here you can see SketchUp, where I created this sculpture, again by Luis Barragán.
I just minimized that and brought it into Create. And then you can see this live 3D modeling in 3ds Max being reflected in Create. It was very, very exciting to see this happening right in front of you.
So again, just dragging and dropping materials, downloading them, and testing different settings. It was all fun, all rewarding, and we were also trying to make sure we were pushing the boundaries of the tool, right?
We wanted to make sure that we could test it to no end, making sure that we could say, OK, it's robust enough that we can start looking at production work.
Here is another example of me using Substance Designer to quickly create this material for the mangoes. Again, I had different fruits in the 3ds Max file that I would toggle; 3ds Max comes with this little Max Quick plug-in that allows you to toggle different groups, so it's very exciting to see that in real time as well.
So I'm quickly generating something, and if I don't like it, I can just switch it. Then I can modify it, rotate it, and reposition it.
So I'm playing creative director here, trying to get to that nice look before I hit Render. It's all real time, and it's looking as close as it's going to get to the final product, so second-guessing is almost out the door by now. And again, here you can see me toggling different fruit on that table just to see what I liked the most for the final look.
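In USD terms, this kind of option toggling maps naturally onto variant sets, which keep every alternative inside the file and simply switch the selection; nothing is destroyed when you change your mind. A minimal sketch with placeholder prim names:

```python
from pxr import Usd

stage = Usd.Stage.Open("gilardi_house.usda")
table = stage.GetPrimAtPath("/World/Table")

# One variant per fruit option; each variant holds its own geometry.
fruit = table.GetVariantSets().AddVariantSet("fruit")
for option in ["mangoes", "oranges", "papayas"]:
    fruit.AddVariant(option)
    fruit.SetVariantSelection(option)
    with fruit.GetVariantEditContext():
        # Placeholder stand-in for the real fruit geometry.
        stage.DefinePrim(f"/World/Table/{option}", "Xform")

# Pick the option that reads best for the final frame.
fruit.SetVariantSelection("mangoes")
stage.GetRootLayer().Save()
```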
And then, using Substance Designer again, I quickly created something that was more interactive, something that had more options. I was looking at this piece of floor where I wanted to use some of the exposed parameters, or presets, that you can create with Substance Designer and see how quickly I could toggle between them.
And again, this is an exploration of having a user working specifically in Substance Designer to create these custom maps for you while you are working in Create to get the rendering or the animation ready for a deliverable.
And here are some of the final images that I ended up rendering in Create with the path tracer. A couple more images so you can see the quality of the light and everything--the reflections are just impressive. And a quick video of me playing with the settings in Create, in real time; I'm using the denoiser, so you see that paint effect, which is also super cool when you see it live.
You get that paint effect whenever you move, and then it settles into a final look once you stop. Tweaking the light in real time, seeing those reflections, changing the colors--it was just an amazing experience.
Again, we're continuing this exploration. Right now we're working with Box to explore this in a virtualized environment for production work, and we have a couple of designers actively testing these applications. So next year we will be ready to present something a little more robust, something that better describes the collaborative design aspect of what we do at CannonDesign, because, at the end of the day, that's what we want to accomplish.
We want to make sure that we give our users the tools they need and also open doors so they can have more conversations and a more educated design process as they work on their craft. So yes, the end product of this exploration was successful. And this is what I call real-time design galore.
You can test multiple material palettes; you have a non-destructive, iterative approach to design; and you have access to PBR materials. The workflow is flexible because you're not really thinking about that process of exporting and importing materials or 3D data. You can just focus on your craft and dedicate your time to what really matters, right?
And there's getting access to the ray tracing and path tracing render engines that NVIDIA has implemented in Create. I haven't seen anything like that in a long time, so that's the part that excites me the most, because I deal with visualization most of the time. It was really, really fun to see that being implemented and to use the tools they have created.
And then getting access to Substance 3D, making sure that we can continue using all the PBR materials we have created since we implemented the PBR workflow into our design practice. Getting access to all the software we use for design work was very important, as is seeing this being adopted.
You see a lot of different tech and software companies adopting Omniverse, which is a good sign. We are looking at the beginning of a new way of working with our applications. That's exactly what we will see in the next couple of years: the development of these tools to create an open space where you can come in and do the work without really stressing about anything outside that space, right? Just design and do your work; solve those problems.
So again, thank you to Autodesk for the invitation. This was my presentation. I hope I can see you at the Q&A.
Here is my information. Please follow me on Twitter or LinkedIn. You can quickly just check those links. And again if you have any questions, don't hesitate to send me an email. Thank you.