Description
Key Learnings
- Learn how to confidently and efficiently prepare an intelligent model for use in a real-time environment.
- Learn how to assess a complex data set and how to organize and clean the model to be better optimized for real-time use.
- Learn how to identify and troubleshoot potential problem areas with the model.
- Learn how to integrate a real-time rendering workflow into your design process.
Speaker
- Matt Richardson: Matt Richardson is an Art Director at Neoscape, a full-service creative studio that partners with clients across architecture and real estate to create tailored property marketing content and experiences that bring the built environment to life. Working out of the Boston studio, Matt has been part of the Neoscape team for over 17 years. After many years of leading a talented group of digital artists on the 3D production team, Matt now heads up the Unreal development team creating interactive and engaging real-time experiences for his clients. Matt has a passion for learning, exploring emerging technologies, and establishing new workflows and pipeline efficiencies for the team. Matt is well-organized and an excellent problem solver, troubleshooter, and manager. His goal is to see the bigger picture to ensure clients' needs are met. His technological expertise and attention to detail are invaluable when developing the sophisticated programs required to deliver projects with large, extensive scopes of work. Some notable projects include work done for Skanska, Related Beal, and ERA-co / Woods Bagot.
MATT RICHARDSON: Welcome to From Revit to Real-Time Rendering. My name is Matt Richardson. I'm an art director at Neoscape. For those of you who might not know, Neoscape is a full-service creative studio. We're headquartered in Boston, which is the office I work out of. We also have offices in New York, Berkeley, California, and Chicago, and then some other folks around the country. We're about 100 employees right now.
I've been with the company for 17 years now. For a long time, I led one of our more traditional 3D production teams. But, recently, I've switched over to our interactive team. And I manage a group of talented Unreal artists and we create real-time experiences for clients.
One of those experiences is 380 Stuart Street here in Boston, a recently completed project for Skanska, a new office tower in Boston's Back Bay. They are building a leasing center to showcase what the building has to offer before it is complete. And one of the main installations in that center is a large video wall with an Unreal experience of the project running, where they can interact with it with a controller, walk around the project, explore the various different levels, and interact with a number of items as well.
So just wanted to introduce the project. We will be showing-- I will be showing this as a sample data set throughout. So just wanted to thank the Skanska and CBT teams for allowing us to use the project and show it in the presentation. And just to note that the design credit and model goes to CBT. So thanks again. With that, let's get into the presentation.
So this is taken from the course website just to summarize why we're here. This is a technical instruction using 3ds Max, Revit, and Unreal Engine, showing the workflow for getting a data set from Revit, processed through Max, and then ultimately into Unreal. Those are the three software packages I will be showing. The prerequisite knowledge is not mandatory. It's more what I've geared the demos to.
But there may be folks who are much more familiar with Revit, less so with Max. Some people may have much more game or Unreal experience. But we certainly welcome anyone to attend. But this is just what the presentation materials will be geared towards.
And then just to recap our learning objectives, again, from the course website-- I won't read these outright. But just keep these in mind as we are going through the presentation. I don't think they are necessarily cut and dried, where steps 1, 2, and 3 only apply to the first learning objective. They're more kind of interwoven throughout. But, hopefully, by the end of the presentation, we can look back at these and see how we touched upon these points along the way.
A quick overview before we jump into the detailed steps. So I like to ask this question really for any project, but especially for a project where we will be bringing a data set into a real-time environment is kind of knowing what our end goal is. Why are we doing this? Real-time is great. It's extremely powerful.
You can do a lot with it, but it may not be right for every project or maybe not at every phase of a project. So understanding where we want to end up at the end will help inform some decisions along the way. And we'll get to those decisions in a moment.
So first off, there's lots of different real-time rendering options out there. I link to this article that I thought was interesting from the Chaos blog. There's a corresponding podcast as well if you follow the link later. I won't go through that right now. But it describes the differences between the three R's, as they call them: real-time, ray-traced, and rasterized rendering.
The main takeaway is: what is real-time? Traditionally, it's described as having your computer render or generate frames fast enough for you to interact with seemingly in real time. In a game experience, that is traditionally 60 frames a second, which equates to a normal monitor refresh rate of 60 hertz, to make it seamless. In architecture, as in film or video, 24 or 30 frames a second is acceptable.
So I would say if you're in the 30 to 60 range for more architecture presentations, you're golden. But we'll get into some of the reasons why things may be slower and what we can do to optimize those.
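One way to sanity-check those targets, just as back-of-the-envelope arithmetic rather than anything from the session itself: the render budget per frame is one second divided by the target frame rate, so 1000 ms / 60 fps is roughly 16.7 ms per frame, and 1000 ms / 30 fps is roughly 33.3 ms per frame. Everything discussed later-- object count, polygon count, translucent surfaces-- has to fit inside that 17-to-33-millisecond window on the target hardware.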
Choosing the right tool for the job. So we'll be using Unreal Engine for this, but there are a lot of other options out there. Twinmotion, Enscape, and Lumion are certainly great with Revit integration and getting some quick visualization out of that. Chaos Vantage if you're more of a 3ds Max V-Ray user.
Vantage 2.0, which just launched this summer, is a fantastic real-time visualization tool and really gets that quality to come across. And then there's a ton more game engines as well. Again, we'll be using Unreal for this, but it's finding the right tool for what you need to produce. So system requirements and hardware is a big one. You need to make sure that your local system is capable of developing the experience, but also, what is this going to run on in the end?
Does your client-- have they invested in some hardware? Will you work with them to advise on what hardware they should get to run that? So it's knowing the specs up front. You don't want to develop this fantastic presentation or experience that no one can run because the hardware requirements are too high. So you kind of want to define that hardware really before you start and have it on hand so you can test along the way and make sure that you are staying within those bounds.
And then, finally, what does the client ultimately need because this will determine what tool, going back to the right tool for the job. Do they ultimately want a game experience that they can interact with and walk around and really get immersed?
Do they only care to have stills or animations or 360s generated? Offline rendering is certainly still a great tool for that. But real-time environments can generate those images much more quickly. So we still use real-time environments like Unreal to generate animations. And that would be the only deliverable. There is no experience.
But, again, we're still leveraging the real-time development environment to iterate faster to ultimately generate these takeaways for the client. Or is it just a presentation tool? You might want to consider Twinmotion or Enscape or even Vantage for that, but also presentation tool you can do in a game engine. So it's kind of knowing what the end goal is, going back to that question we asked and then choosing the right tool for the job. A quick overview on the kind of start to finish process.
There's going to be-- we'll break this down into two sections. There's before we get to real-time, and then once we're in real-time. So before we get to real-time, the main topics we'll cover are receiving the base data set. For this, we'll be talking about Revit. Then preparing that model-- again, we'll be using 3ds Max for this, but there could be another software package that you use as an intermediary. And then, finally, exporting that prepared data set for use in real-time.
And then once we get there, we'll do an initial Unreal Engine setup, going through some things to keep in mind as we're getting started. We'll talk about this blocking and optimization. What does that mean exactly? It's a process that we use internally at Neoscape to allow for more rapid testing and iteration, using blocked-out versions of assets early on so we can make progress on what we can while the final refinements are being done, rather than waiting for final optimized assets to be ready.
And then, lastly, we'll touch upon the output and delivery. This is an extremely simplified data flow, but this is generally showing how the data is transferred. You will notice on the top that there is one path that bypasses Max completely, which I'll touch upon. But you can go from Revit to Unreal through Datasmith. There is a plugin for Revit, and it's actually integrated now with Revit 2024.
But, generally, we'll be showing the Revit exporting to FBX, importing to Max. And then from Max, there are two paths we can take, which we probably would use both. It's not either/or. I think certain assets are better to go through Datasmith and others through FBX. But we'll cover the whys of that down the line.
Jumping into the specific steps here before we get to real-time, starting with the Revit data set: where did the data set come from? I think the main takeaway here is, don't take it for granted. In our business, we never would create the Revit model ourselves. We always receive it from a third party, whether we're working for an architect directly or working for a developer with an architect as part of the project team. We'll be receiving that from an outside source.
So not to say you should be skeptical, but don't assume that whatever you receive is going to be 100% what you need. That's not to say it was modeled incorrectly or anything like that. It's more, is it positioned the best for our use, meaning our visualization use? Just keep that in mind. But if you did create the model, then all the better. You have a leg up. You know the inner workings of it and can make changes more efficiently than someone who didn't create the model.
So the first thing to do is assess the data set. Look at all the pieces. Is it one Revit file? Is it multiple files that are linked in? Determine what may be needed and what could be discarded for now. We're not going to purge it completely. There will still be a record of it. But once we're in real-time, every object counts. Polygon count, too-- with the latest versions of Unreal, the engine can handle denser geometry, but we still want to be cognizant of it. But object count is certainly a limiting factor. Really, in any software package, the more objects, the worse it's going to perform.
But in real-time, that really comes to a head. So we want to be pretty efficient on how we collapse and organize our model. So ultimately, what we export and use from Revit will be generated from a 3D view. And whatever geometry is visible in that view is what will be exported, or ultimately, imported into 3ds Max. I'll show just some steps on how we might thin out the model, turning off some links that may not be needed. Those are great for reference and to go back to if you may need something like a mechanical system.
You may think you may not need it at the beginning, but then as you're exploring the space, there might be some exposed ductwork or something that you say, oh, actually, I do need that. So then going back and kind of exporting the pieces that you need down the line. But I would say start leaner and then add to it later, because if you start with a really heavy and bloated data set, it's going to be difficult to work with and manage that many objects. So breaking things into pieces and working on smaller parts is definitely preferred.
I'll show just some things that, from my experience, a non-native Revit user-- I've gotten familiar with it more over the years-- but just knowing where to look within the software to manage different parts of the model and ultimately to get out what you need.
When I first started and would just receive a Revit set, I'd just import it directly and then wonder why such-and-such objects weren't visible, or why is it showing multiple phases of the project overlapped on top of each other and now collapsed down into one mesh and impossible to work with? And then realizing, oh, there are project phases in Revit. And maybe the view that I imported had all of them showing, and now, they're all in Max. So knowing how to manage those and set up your view so it only has the objects you want.
And then finally, exporting the model. I'll show a few different ways-- going to FBX, like I mentioned earlier. You can go to Datasmith directly. But then once we get into the Max side of things, you can import a Revit file directly. You don't technically have to export anything. But you do still need to prepare your 3D view, because that is what would be imported into Max.
So the next series of slides are just some screen capture demos within Revit. I'll hit Play and talk over them. I probably could do this live, but didn't want to run the risk of things hanging or crashing. So with that said, we have our Revit model open. I first explore the links, and we see that there's some coordination models. So I open up the Manage Links window.
And I'm going to unload them for now, not to delete them from the project, but just turn them off, just so as we're opening new 3D views, it's not needing to generate that data. And we'll just make things smoother as we progress.
So now, I'm going up to the View tab and clicking on a new 3D view. As you'll see, there are some views already in the project, but I don't know how those were created. So I want to create my own, clean view with everything visible, which, that will pop up here in a moment. So here's our whole data set. Just kind of orbit around here, and then I will jump to the next video. But you can see all of the architecture, at least, is visible here.
Next. So here's where we just check a few things on the-- OK, so I actually duplicated the view, just to have a backup. We're going to be changing and turning off some visibility on this model. So I did want a backup just in case we want to go back to a fully displayed view. So we'll be working on the copy for now. There's just going down, checking the phasing. For this project in particular, it's all new construction, and it's all in one phase. So those weren't set up, but that's where you would look to check for phases if your project did have that.
Now, I go into the Visibility/Graphics overrides. There are different model and annotation categories you can enable or disable. Here, I'm just going to turn off the Levels annotation to have a cleaner viewport. There's no reason why I'm doing that for exporting needs; it's just to show the functionality. The links are also local to the view. You don't necessarily have to disable them for the whole project. You could disable them for a view. And then finally, going into the work sets, here, I'm hiding all work sets except one, just to have a reduced data set visible, which, ultimately, is what we will export and bring into Max as a smaller component.
Just a few other things to look out for. Maybe not as typical, but section boxes can be used to limit the 3D view, and whatever is cut in that section will be cut from the exported model. Just showing how to slice the building if you wanted, say, just to isolate a certain floor, or you didn't care about the tower portion. You could bring it all the way down to the lobby level. And then some scope boxes, which, these were previously defined in the project that I just toggle on. This one, I believe, it just-- yeah, it just shows the lobby.
But I will turn these off, go back to our full building. We're moving on. OK, Revit, Export. So this is pretty straightforward. Just go to the File menu, Export, FBX, make sure you're in the 3D view that you want to export. I'm just going to rename this to something a little shorter. But that's really all there is to it. You just hit Save and then wait. This one took half an hour, maybe more, to export just because it's so heavy. We'll show how many objects that actually exported once we get into Max.
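If you end up doing this export for a lot of views or a lot of model revisions, the same step can be scripted against the Revit API. Here's a minimal sketch in the style of a pyRevit Python script; the output folder and file name are placeholders, and it assumes the prepared 3D view is the active view:

```python
# A minimal pyRevit-style sketch of the same FBX export.
# Assumes a pyRevit environment where __revit__ is available;
# the folder and file name below are hypothetical.
from Autodesk.Revit.DB import FBXExportOptions, ViewSet

doc = __revit__.ActiveUIDocument.Document

# Collect the prepared 3D view into a ViewSet -- only geometry visible
# in this view is exported, just like the manual File > Export > FBX.
views = ViewSet()
views.Insert(doc.ActiveView)

# Export to FBX; on a model this heavy it can take a long time,
# as it did in the demo.
doc.Export(r"C:\exports", "Tower_Export", views, FBXExportOptions())
```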
And then the last bit here, just quickly touching on the Datasmith export functionality, I mentioned earlier that as of Revit 2024, the Datasmith export plugin is integrated within Revit. Previously, you would need to download the plugin-- free plugin-- from the Unreal Engine website to install it, which, that's what I did here. I'm in Revit '23. But it installs it to the top menu bar. You can pick your 3D view and simply click Export Datasmith File.
You'll notice that there is some direct link functionality on the left. I won't be covering any direct link workflows in this presentation, but that's certainly a viable workflow that you can take advantage of, whether you're going into Twinmotion-- you can direct link into Unreal as well. For this, we'll just be doing clean export and import. But direct link is a powerful tool, and especially, if design may still be in flux and not totally locked, that would be maybe a way to go so you're not committing to importing this and then having no ability to get updates quickly.
In our case, with this project, they were at a GMP permitting set phase of the project. So the design was pretty much locked and we weren't going to be receiving any updated models down the line. I say that. We did receive updated models. As you know, things change. But for the most part, everything was finalized when we received the data set, which made some of our choices a little easier to make, and why we were more comfortable using Unreal Engine and importing these data sets without the linking.
But depending on where you are at with your design, you may choose to use a different tool perhaps early on. Maybe you do use a Twinmotion more at the beginning, and then once the design gets more finalized, that's when you might bring it into Unreal. So there's different ways you can tackle this.
Now, moving into 3ds Max just to talk about a few setup things, which we'll run through in some videos in a moment. But first, scene scale, quickly: in Unreal, one unit equals one centimeter. So that's one thing to switch before you import anything, just so your scene units are correct.
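For anyone who prefers to script their scene setup, here's a small pymxs sketch of that unit change, run from the 3ds Max scripting listener. The display unit choice is only a viewport preference, as comes up in the video below:

```python
# A small pymxs sketch of the unit setup, run inside 3ds Max.
from pymxs import runtime as rt

# System units drive the actual scene scale -- Unreal expects 1 unit = 1 cm.
rt.units.SystemType = rt.Name("centimeters")

# Display units are only a viewport preference; feet-and-inches is fine here.
rt.units.DisplayType = rt.Name("us")
print(rt.units.SystemType, rt.units.DisplayType)
```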
I'll show the FBX importing and Revit direct-importing processes, and talk quickly about collapsing strategies-- whether you do it manually, use some of the automated tools, or not at all. And there may be a case for all three to be used on one project. And instancing wherever possible: Datasmith preserves instanced meshes once you bring them into Unreal.
So be cognizant of what objects are instanced in Max, or create instances that may not be set up from the Revit import. You can identify that, oh, these objects are, in fact, the same. Let me re-instance them so that when we get into Unreal, things are more efficient, because if not, then you're going to have basically duplicate copies of the same mesh generated at runtime, which is just unnecessary. So it's really trying to be efficient and lean as much as possible.
So, going through a few videos here, starting with the unit setup. There are the display units, which are different from the system units. The system units are what we want to switch to centimeters. But the display units could be either/or, whatever your preference is. We'll keep it in feet and inches, but you could switch it to metric. It's not going to change the system scale.
And here, we went into our File, Import and selected our FBX that we exported a moment ago from Revit. There are a couple of presets. I typically just use the Revit preset, but there's a media and entertainment one that has a little more options exposed. But the Revit preset should be fine for most needs.
This took quite a while to import. I don't have an exact stopwatch on it, but it was a while. But you can see that there are 61,909 objects that are brought in, which is a lot to manage and have the viewport display.
So I'm going to do a couple of things here to prep this. First, I'm going to maximize one view so it doesn't have to draw it four times. And then the next thing is, I'm going to turn off geometry. Just want to hide all geometry so your viewport doesn't have to worry about rendering that.
You'll notice the white background on import. Since there was a sun and sky set up in Revit, it puts an environment map there. So I just disabled that. And you'll now see that there's a camera and a daylight system that it imported. But we don't need any of this. So I'm just going to blast this away to get it a little cleaner.
Next, I will show you the scene explorer and all the objects. They're named how they were named in Revit, just with identifier tags at the end. And I'm going to show just a quick collapsing strategy. So if you go to Utilities panel and the Collapse tool, Collapse Selected, pretty quick. In the past, this was a much longer process, and you would cross your fingers and make sure that Max didn't crash during this, because it definitely did.
Not sure what they did over the years, but you can trust it more now. So using this is my preferred method for collapsing objects. If you want to do it manually-- and there's an argument to be made about doing it manually, knowing that you have full control over what objects get collapsed and how they get collapsed.
And sometimes, when you pre-collapse something, it may attach things that you then have to detach later, or sometimes, it's hard to separate them, because the objects may be right on top of each other. So this manual process obviously takes much longer, but it may have more verifiable results. So it's somewhat personal preference. But I did want to show that. But I would suggest having the geometry hidden, like I have here.
Otherwise, it will have to redraw the viewport every time, which is just going to take longer and longer. And you'll notice that I also went down to the bottom of the Scene Explorer. When it finishes collapsing, it will drop you, basically, where you left off on the slider.
So if you tried to collapse a chunk of objects at the top of the list, which I think will show here in a moment, once it's done collapsing, it may drop you in the middle. And then you have to scroll up and down to find where you left off. So I like to do it from the bottom up. But again, that's just a personal preference thing. Yeah, I tried to grab all the vision glass, which is more objects here. But the speed at which it did it is comparable, and pretty fast overall. But you'll see, I have to scroll up to see where it left off.
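The Collapse utility covers this interactively, but the manual attach approach can also be scripted. Here's a rough pymxs sketch along the same lines: it attaches the current selection into one Editable Poly with geometry hidden to cut down on redraws. The result name is just an example following the naming convention that comes up later:

```python
# A rough pymxs sketch of collapsing by attaching the current selection
# into one Editable Poly. The result name below is hypothetical.
from pymxs import runtime as rt

sel = list(rt.selection)
if len(sel) > 1:
    rt.hide(rt.geometry)                # skip viewport redraws while working
    target = sel[0]
    rt.convertToPoly(target)            # make sure the target is Editable Poly
    for obj in sel[1:]:
        rt.polyop.attach(target, obj)   # attach consumes each source object
    target.name = "sm_collapsed_chunk"  # example name per the style guide
    rt.unhide(rt.geometry)
```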
So that was the FBX import. The Revit import-- similar. I'll pause it here. So similar to the direct link Datasmith export in Unreal, there are linking options in Max. We could have linked our FBX file. We can also link our Revit file. Again, for this, we're just going to be doing straight importing/exporting, so I won't touch on those. But they are there, and useful for certain cases.
Now, oops. [INAUDIBLE]. So here, we're going to actually select the Revit file, not the FBX. And after a moment, this window will pop up. Here, you have all the 3D views in the file. So we will pick the copy version that we reduced down. And there's different ways to collapse or combine objects. You could choose not to combine them at all. But for this, we're going to say we want to combine by Revit family type and Revit material, just as an example. We also will turn off cameras, lighting, but keep materials.
And then when that comes back, you are presented with this view, where, by default, it selects all the family types, but you can uncheck certain ones. If you knew you didn't want certain family types, you could uncheck them. But for this case, we will import everything. And the number in parentheses is how many versions of that family type it will create based on how many unique materials were applied to it. So most of them had ones, but some had twos, threes, fours. So you would get, basically, four collapsed objects, but then a material tag applied to the end to say what material it is.
Here, we're going to jump into 3ds Max and attempt to do some live demos here. This is that collapse by family version we just brought in. So the first time we brought in the FBX, it was around 61,000 objects. This one, the collapsed version is 318. And you'll see here that each family type has one object.
Let's find an example where there were multiple materials. So this 2 and 5/8 frame: we had one version that had a glass material, one that had an aluminum, one that had a door hardware, and one that had a wood finish. So that's why we get four different objects. If we had simply collapsed by Revit family type with no material, then it would have collapsed all four of these objects into one and created a multi-sub material.
You would still have all the materials, but they would be nested in a multi-sub. So it's more kind of preference on how you like to work. I like to not use multi-subs as much as possible, especially when we're preparing data sets like this. But that is an option as well.
So you'll see here that-- this is the collapsed version again. And we did limit our work set. But this is kind of more going back to taking the model for granted. We didn't create those work sets, or establish those. So it said building shell, but there's clearly other objects on here, which is fine. We can delete that out, no problem.
But since it's collapsed now, if we try to select all these objects down here, you can see that they're attached to objects within the tower portion. So we would have to manually delete-- or detach pieces here, and that's just going to take some time as well. So for this project, we ultimately went with the first method, where we imported every piece and then manually collapsed, and we could control exactly how this model was put together.
The one thing I didn't touch on was importing Revit directly with no collapse. That would be very similar to the first method of the FBX import. So whether you export to an FBX or import Revit directly, no collapsing, it's very similar results. I'm sure there are some technicalities on what is different, but you're basically still getting a one for one for every object from Revit and into Max.
So, jumping ahead to take the prepared cake out of the oven, so to speak, here is our final curtain wall model that we cleaned and organized and prepared for real-time use. It took some time, that's for sure. But we feel like we have a pretty efficient and easy-to-use data set that we can now bring into Unreal. How many objects was this before? 318, and I know there's a lot of other pieces that we cleared out. But this model is now 48 objects.
And we have it split up into two layers of glass. Actually, the building was triple-paned. We decided to eliminate one of the panes of glass. We felt like three panes of glass was overkill for real-time. We just didn't need those polygons. But also, trying to render through three translucent surfaces during runtime was just a bad idea. Even the double-pane is probably more than we need.
But we decided to keep the two, and we could make the decision down the line to disable one layer of glass if we're running into some performance issues. So this glass A is the outer layer, and glass B is the inner layer. There's a frit pattern that is sandwiched in between, which is on its own layer. And then we have this solid layer, which is all the shadow box and mullions and door frames collapsed together on a somewhat per-floor basis. I'll show an elevation here. So let me just isolate the glass only.
So we have what we're calling a stack zero, which is the bottom layer, a stack one, which is this next layer, and then a bunch of stack twos, which have this A/B stack that really just repeats up the tower. We were able to use instanced geometry here and simply rotate it 180 degrees to create the version on the other side.
So we have 12 objects for this glass layer, but there are only 1, 2, 3, 4 unique meshes. And when we import that into Unreal, it will only have four static meshes that it's referencing, and all of these stack twos will reference that same mesh. So this is a layer of organization that we went through and identified as what we felt would be the most efficient way to put this model together. There are different approaches. This isn't necessarily the best way, and it's not the only way.
But it's what worked for us, and it allowed us to keep this fairly lean. It is, I believe, two floors grouped together, because we did have this balcony situation where it just made more logical sense to have the two floors collapsed together to manage that up and down the building stack.
So that's just talking a little bit about the strategy on how we collapsed and organized and instanced objects that may not have been previously instanced. We did it on one of the lower floors and then re-instanced that up the stack once that model was prepped. So there are a few more steps to do, but this is a lot of model information to carry through and render in real-time. So we wanted it to be as efficient and performant as possible.
I will jump back to the presentation. So here, moving on to the next step: we did more organization and collapsing strategies, but there's a lot to look at with the geometry itself. Are there any issues with it? Normals and faces is a big one. There might be holes or gaps or co-planar surfaces, and all of these will be more apparent once you're in real-time. Co-planar surfaces will flicker; you'll have z-fighting on those.
Holes and gaps may not be as apparent in the Max viewport or Revit viewport, but once you have lighting permeating the space, you'll get light leaks and other issues if there are those openings in your geometry. So I'll talk about some strategies on how we identify them, because sometimes, it's not as easy to identify in Max. What I like to do is basically no cleanup to start with.
Maybe you do some collapsing just so it's a little lighter weight. But then bring that data set as a placeholder throwaway, if you will, into Unreal, put a simple lighting scenario, and walk around it just to see where the issues are, because once you have that real-time lighting, especially, like, the GI bounce happening, you'll see those problem areas a little more clearly, and then you can identify what objects they're on and go back to Max and fix them. And since it was a throwaway, you didn't spend much time cleaning it up to begin with.
Adding mesh detail. This is a balance, because if we're doing offline rendering, we might add a lot more mesh detail with chamfering, and adding reveals, and breaking things up into components. Revit is not meant to have all that mesh detail on every object. The architecture, maybe, but some of the other components and family types, like some of the-- I'm thinking, for this project, we had to include things like fire extinguisher boxes and smoke detectors, which were identified in the electrical model from Revit, but they didn't have the detail we needed, which was fine. You don't need that detail in Revit.
But we need to know where to add that detail to those models in Max, while also balancing how heavy to make them versus performance. You always want to keep the polygon count in mind. So make them look good enough from the views that you'll be looking at them from, but don't go overboard and add tons of mesh smoothing and unnecessary detail to the geometry.
Pivot points is a big one, more so for set dressing and props that will be placed around. But things like doors that may want to swing open-- having pivots placed properly on those so you can interact with them in Unreal. We'll talk about what to look out for there.
Negative scale values is a big one. Unreal does not like negative scale values. It can have some weird behavior. It may appear that things are inside out. And really, non-uniform scale in general: you really want your objects to be at 100% scale in all dimensions with no negative values. Mirroring is a useful tool, and Max and other software packages don't care as much about negative values, but when rendering in real-time, it does matter. So just be on the lookout for those.
And then if need be, there are some quick tools to reset these values, whether it's scale or remaking a pivot to a particular place on the object. So I'll show a few tools there. OK, let's jump back into Max, where I have this file prepped.
So there are a few layers here, just different geometry examples. These are all taken from the same data set. I've just saved them out into individual pieces. So this first one is the canopy that's on the lower level of the building. And you look in the viewport, and it looks good. It looks clean. There's no apparent issues.
So one thing Max does in the viewport to help you see things in a better light, so to speak, is in the active viewport settings, which, again, I get by right-clicking on this middle dropdown and going down to active viewport settings. Under lighting and shadows, you'll see it's set to two default lights. I'm going to switch this to one default light to expose the issues.
So now, you'll see there's all sorts of problems with this. It seems like a normals problem with flipped faces. This is a piece that came in as one piece from Revit. There was no collapsing here. So this is the whole unit from Revit and how it imported. So just to confirm, we'll go into our face mode, polygon mode, show normals, and select a couple of these faces. Let's see the best way to illustrate it. You'll see that it is, in fact, a normals issue. These are pointing up and these ones are pointing down.
So there are a couple of ways we can troubleshoot this. The first thing to try is the Normal modifier. Adding that to the stack, by default, it's just going to flip the normals. So you can see, it's just flipping them. That's not really helpful for us. So I'm going to uncheck that and check Unify Normals instead, which did a decent job. It fixed most things, but there are still some areas where it gets a little jacked up.
So in this case, you would need to go in. You could add another edit mesh to the stack, go in and manually flip certain faces. But then you get some smoothing issues. So there may be a case where you just say, oh, I'm just going to rebuild this object. I know I can rebuild it faster than it would be for me to fix it. So it's kind of personal preference and what the task is at hand.
But just to know that there are some tools here that you could use-- I think with this, I just selected all the faces that didn't get caught by the unify normals and flipped them manually. And then the mesh was definitely much better at that point.
OK, this next example. So here, we had a number of structural pieces-- there were over 100 pieces. They weren't instanced, so we weren't really preserving any instances. We probably could have determined which beams were being reused and created instances from those. But for this, it was simple enough geometry. They're not that heavy. We said, let's just collapse it all. It's one floor's worth of structural beams.
So selected it all, went to the Utilities panel and the Collapse tool we showed earlier, and say Collapse Selected. But watch what happens. Now, after collapsing, a bunch of faces appear to get flipped, which, at first, it seems, oh, why would the normals get flipped when you collapse it?
And if you investigate, you'll see, oh, the normals actually are pointing in the correct direction. And you can confirm that by putting a normal modifier. And when you try flipping and unifying them, it's not doing anything, because they are acting properly.
But what it is is a smoothing issue. Don't ask me why it happens. But if you add just the smooth modifier, it will clean that up, and now, they will display properly. So we talked about normals, faces, smoothing. Here's an example with some chairs.
This definitely seemed like a normal issue. And in this case, adding the normal modifier and unifying the normals fixed it. There was no manual flipping needed. You can be critical and determine if the mesh itself will hold up for what you need in a lot of cases. With this project, it had a great set of placeholder furniture that, we used a lot of it, or we added some light detailing to it. Others, we had to replace with optimized assets or different versions. Maybe the chair aspect was different than what was in Revit. But for the most part, we were able to utilize a lot of these furniture pieces, which was great.
A couple other things to identify here is that I'm selecting all these objects. And you'll notice that none of these are instances. If I select one here and say-- I can't even select instance because it's not an instance. So we have how many? 11 unique objects. It is the same mesh, but if we were to bring that into Unreal, we're going to get 11 copies of the same thing and having to generate that at runtime.
The other thing here is that the pivots are set to the object center, which is fine. But with props and set dressing and things that we're going to be placing around, we really want the pivot to be at the bottom, the base center of the object. That's going to allow us to snap things better, but also, if we want to replace a chair down the line, it's very easy to swap meshes with a different mesh in Unreal.
But it will do it based on the pivot point. And if the bounding box is different for your new mesh, which it most likely will be, you then either have a floating chair or a chair that's sunk into the ground. So if you have the pivot on the bottom and you swap the mesh, then you know for sure that it will stay stuck to the ground.
There's a couple ways we can do that. I have a page at the end of the presentation with some links that I'll share. But this is a SoulburnScripts package, which you may be aware of. Neil Blevins, who used to work for Blur Studios, has developed this over the years. I think he's finally stopped updating it, but they still work, even in the latest version of Max.
Tons of great tools here. I'm just going to show one, which is the pivot placer. Very useful tool. So if I select all the objects and look at this box here, I'm going to click the bottom green one and get close to one of these. And you'll see now that the pivots are on the bottom. So those should be in much better shape now.
And the final demo here is another furniture setup. This one, if I select a chair and say Select Instances, it selects all of them, which is great. Yay, they're instanced. However, if I open up the scale box and select an object here, you'll notice this one is negative scale. This one is positive scale. Negative scale. Negative scale. Positive scale.
So not sure if this was a side effect of how they were placed in Revit, or maybe in the export/import process, something happened. Things can happen along the way. That's why we're checking everything. But as mentioned earlier, this is going to cause some problems in Unreal.
So what we can do is select everything again and go to our hierarchy panel, and we can reset scale. So it's going to lock in-- well, basically, reset everything to 100% scale, but not change anything. So now, we select all these and they now say 100. Great.
But one problem that this will pose-- if we select them again and go to our pivot placer tool and try to reset it to the bottom again, you'll notice that some chairs do get the pivot at the bottom, but some chairs now get it at the top. When we reset the scale on the negative-scale chairs, the pivot was oriented a different way. So it's basically flipped upside down.
We could undo that and try resetting transform on the objects and now do our pivot placement. And now that should have them all at the bottom. In some cases, it may be better to just prepare one object and then repopulate them so you know for sure that everything is identical, all scale transform values are identical. But just to show some things to be on the lookout for.
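Both of those fixes can also be scripted if you have a lot of props to process. Here's a hedged pymxs sketch of the same idea, not the actual SoulburnScripts code: reset the transform to bake out any negative or non-uniform scale, then drop each pivot to the bottom center of its bounding box:

```python
# A rough pymxs sketch: Reset XForm plus a bottom-center pivot,
# similar in spirit to the pivot placer's bottom option.
from pymxs import runtime as rt

for obj in list(rt.selection):
    rt.ResetXForm(obj)      # bakes scale/rotation into an XForm modifier
    rt.convertToPoly(obj)   # collapse the stack so the reset sticks
    center, bottom = obj.center, obj.min
    obj.pivot = rt.Point3(center.x, center.y, bottom.z)
```

Note that collapsing the stack like this breaks instancing, so as in the demo, it may be better to fix one object this way and then repopulate the instances from it.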
A few other things with preparing the model, which I won't demo, but just to talk through. Setting up basic materials and lighting: this is optional, more a matter of comfort and preference. You can set up a lot of it in Max and translate that over, but there's also a world where you do all of your materials and lighting in Unreal and only care about the geometry in Max. So it is, again, more about what you're comfortable with. And I think as you work through this pipeline, you'll get more comfortable doing it at the end-- you'll definitely get more control over these things in Unreal. But they're certainly possible to do in Max.
As I mentioned, Datasmith will transfer most Max materials. The side effect of this is that the node trees can get quite messy, and ultimately, expensive to render. It's trying to convert Max materials into Unreal materials. So it has to jump through a lot of hoops and do a lot of weird math to create the shader you're used to in Max, which, ultimately, can get computationally heavy. So just be on the lookout for that.
My preferred method is to use standard materials with just a diffuse color or bitmap applied. These are more placeholders. But they give some visual characteristics that will transfer over and then give me a cue, like, oh, this is my wood that I need to replace. It's not the right bitmap, but at least there are some visual cues rather than everything being gray or white. That's a fine workflow too, if you're more comfortable replacing materials that way, but I like to have at least a little visual cue as I'm bringing this in and then replacing the materials down the line.
If you are more of a Max V-Ray user, there is a script which I'll link to at the end-- an optimizer script-- that can help reduce complex V-Ray shaders into a more simplified version that will transfer better to Unreal. They developed this when V-Ray for Unreal came out a few years ago, which doesn't seem like it's seeing as much play right now, but the optimizer script can still be useful if you are in a V-Ray workflow.
Datasmith will transfer a lot of lighting types as well, but I'd say more so than the materials, I would prefer to do lighting in Unreal. Maybe if you have positions of lights that you want to capture in Max, you could have just a dummy object or just a null for position that you could transfer over and then repopulate with actual light types in Unreal. That could be one workflow.
But typically, I'd like to do the lighting in Unreal. Datasmith can transfer spotlights and point lights, and it will repackage V-Ray area lights into a blueprint class that is heavier than you need. So it will transfer a lot of lighting types, but just be wary of how performant some of these will be.
UV mapping and unwrapping. From an unwrapping standpoint, it's about how detailed you want to make a shader on an object. Do you want to unwrap every piece of fabric on a chair and make it perfect, or is applying a box map good enough for what you need?
Now, I will say, applying a consistently sized box map to most every object in Max before exporting is a good kind of failsafe, because even if you don't think you'll be applying a map to it, having that flexibility down the line to apply a map that already has some base UV information will be helpful, even if it's a metal that you say, oh, this is just metal. There's no mapping to it.
But once you get in Unreal, you say, oh, maybe I want to add some grunge or scratches. But it has no UVs. There are some tools in Unreal nowadays, but you may need to go back into Max and apply some UVs and re-export it. So just setting that up from the beginning will just have them ready if you need them down the line.
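That failsafe box map is easy to batch as well. Here's a quick pymxs sketch that adds a consistently sized box UVW Map modifier to everything selected; the 100 cm size is just an example value:

```python
# A quick pymxs sketch of the box-map failsafe: one consistent,
# real-world-sized box map on every selected object.
from pymxs import runtime as rt

size = rt.units.decodeValue("100cm")  # example mapping size in system units
for obj in list(rt.selection):
    uvw = rt.Uvwmap()
    uvw.maptype = 4                   # 4 = box mapping
    uvw.length = uvw.width = uvw.height = size
    rt.addModifier(obj, uvw)
```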
Organization and export. Keeping things named and organized should go without saying for really anything; everything we do has a lot of objects, layers, and names of things. As long as there's consistent naming for your team, that will ensure clarity and efficiency down the line. However you want to establish that with your team is up to you. But it's something to do so you don't end up with object 100 or line 2, because then you'll be wondering what this is.
And even for materials-- I think objects, materials, and textures are the base ones to name-- it'll just make things more efficient and clearer down the line. So similar to Revit, there are a couple of ways we can export from Max. Datasmith works well for whole scenes, whole architectural scenes. But I like to use FBX for individual props and set dressing objects.
So with the chair example we were looking at earlier, I would bring that chair back to the origin with the pivot point set at the base center. So it's at 000. And export that one prop as an FBX. So now I'm managing that one asset as one FBX file.
And you have more control for making updates. If you had 100 different furniture types all packaged into one Datasmith file, yes, you can make updates to that too, but it's a little more cumbersome, and you're carrying this whole data set. And if you only need to update one chair, it's-- just finding the right balance. So that's the balance we've struck, is architectural scenes and groups of things, we'll use Datasmith, but individual props, we'll use FBX.
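Scripting that per-prop export is straightforward too. Here's a small pymxs sketch along those lines: set the base-center pivot, move the prop to the origin, and export just the selection as FBX. The output path and file name are placeholders:

```python
# A small pymxs sketch of the per-prop FBX path described above.
# The export path and name are hypothetical.
from pymxs import runtime as rt

obj = rt.selection[0]
center, bottom = obj.center, obj.min
obj.pivot = rt.Point3(center.x, center.y, bottom.z)  # base-center pivot
obj.pos = rt.Point3(0, 0, 0)                         # sit the prop at 0,0,0

rt.select(obj)
rt.exportFile(r"C:\exports\sm_chair_01.fbx", rt.Name("noPrompt"),
              selectedOnly=True, using=rt.FBXEXP)
```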
And then breaking it up into pieces. I showed the curtain wall example. So we broke them all up into several different pieces. We had the curtain wall. We had the structure on its own. We had the floors and core on its own. All the furniture was separated. And then some of the interior fit outs-- basically, every level that we were going to explore in the interior of, we had an interior level. And in the end, all of these pieced together. But again, it just goes back to how big of a data set we want to be carrying from one step to the next. And having smaller pieces will just make things easier to manage and update down the line.
So the final Max demo I wanted to show is going back to this scene. So let's prep this for Datasmith export. I'll make a note that prior to Datasmith plugin 5.1, you would find the export in the normal Export option here. It would be listed as a file type, as Datasmith. After 5.1 plugin, they moved it to the Max ribbon. So here, I have the latest version. I think it's 5.3. So it's on the Max ribbon.
You can still do direct linking, but I will say Export, which will export the whole scene. I'm going to pick [INAUDIBLE]. And we'll call this curtain wall 02, since we already have one. And then just hit Save. And that's it. No warnings, no errors. So it transferred cleanly.
While we're in here, just to touch on the naming and organization, this is a style guide we use where every mesh object had this sm_, which means static mesh. So just looking at an asset on the surface, just because it has sm_, you know it's a mesh. Materials would get an m_. Textures would get a t_. So that goes back to that naming organization and just staying consistent with whatever you're doing. And then from there, it just had an identifier, and then if there were multiple versions of it-- or instances-- you'd have a number. So that's all I'll say about the naming here, but it's kind of however you want to establish that for your project.
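If you adopt a style guide like that, it's easy to audit a scene against it before exporting. Here's a tiny pymxs sketch that only checks the sm_ and m_ prefixes described above:

```python
# A tiny pymxs audit of the sm_/m_ prefixes from the style guide above.
from pymxs import runtime as rt

for obj in rt.geometry:
    if not obj.name.lower().startswith("sm_"):
        print("mesh missing sm_ prefix:", obj.name)

for mat in rt.sceneMaterials:
    if not mat.name.lower().startswith("m_"):
        print("material missing m_ prefix:", mat.name)
```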
So another high-level statement here. In real-time, we want to achieve the highest aesthetic while maintaining the best performance. Kind of a lofty statement, and easier said than done, but that's what we're trying to get after. So in a moment, we'll show some initial Unreal Engine setup.
Version control, like Perforce, is a good practice to implement if you can. Even if you're the only one working on the pipeline, having version control is still helpful for seeing the history of changes and rolling back if you need to. There are so many assets, when you get into Unreal, that you basically check out individual assets, make changes, and then push those changes. And you have a history of every asset.
So if you wanted to roll back, say, one material change that you made two months ago, you could do that just on that material. And having the version control system is what makes that possible. So definitely something to look into if you haven't used it before. But we use Perforce for this project and for all of our projects. And even if the team is small, one or two people, it is an essential part of the pipeline.
A style guide-- so what I mentioned earlier about the naming. I'll link to the one we use at the end, but there are others out there. It's just finding one that works for you. Quality control: don't save this for the end. Make it part of your iterative process. Test at every step of the way where you can, always going back to that quality versus performance and balancing it. How good can I make it while still being performant? That's why we test along the way rather than wait until the end.
This is something I touched on at the beginning, but blocking and optimization is the workflow method we use: rough implementations as placeholders to allow for rapid testing while final models are being finalized and optimized. The cleanup on these data sets is not a quick process. I won't sugarcoat it. It takes time and it can be tedious. But your client, or even your internal team, may want to see results sooner than that. You can't wait three weeks for a file to get cleaned when you need to start exploring cameras and lighting.
So using a rough, blocked-out version, unoptimized, like maybe you collapse the building down and bring it in and apply some basic materials so you can do some cameras and lighting studies while the model is being prepared. It's kind of finding those two parallel paths so you can make progress on things you can while the assets are being finished.
So basically said all that already, but these are just kind of different strategies, whether you're working with the architecture or the set dressing, which, again, using placeholder set dressing is great. It may be they don't even have final furniture models selected.
So just using any old chair that has pivots applied correctly that we can swap out down the line with the final models rather than not having any furniture at all. Then it will just-- you won't get the same sense of density and scale. You want some placeholders in there. So even as you're walking around, oh, I can navigate here because there are chairs here, versus if you don't have any placeholder, you may assume you could walk somewhere that may be obstructed otherwise.
I know we are running a bit long on time. So I'm going to try and do these as quickly as I can. But jumping into Unreal: they make so many updates in every version that understanding what version to use, to take advantage of the features that you want, is critical. We'll be using 5.2. 5.3 was just released, but we'll be using 5.2 for this. So I'm going to launch the engine and attempt to do this demo quickly and live, working with the Datasmith export that we just generated. And I'm going to go through the steps of creating the project kind of from start to finish.
So if you've never used Unreal before, you'll be brought to this preset menu. There are different presets here for games. If you want a template that has first-person functionality, you can remove the gun, but you would still have the ability to walk around, and that blueprinting would be enabled. You could use that. And there are other ones here. For us, we're going to use Architecture and the blank template. There is an ArchVis template, which has some more starter content piped in.
But we don't care about having that, and it's just going to weigh down our project. So we're going to use the blank template. What this does have enabled is the Datasmith plugin and the sun and sky blueprint plugin, which, I believe, if you were to use a games template, you would need to enable manually. But with this architecture template, those are already enabled, so we can hit the ground running.
For our project, I'm just going to call this AU23 demo. Create that. It should take a matter of seconds to launch the project. All right. So pretty bare bones template, which is fine for our needs.
The first thing I'm going to do is create a new folder. Going back to staying organized. We want things to land in the appropriate place from the beginning. So I will create a Datasmith folder, because we'll be bringing in a lot of Datasmith files. I'll also create a Maps folder, which, in gaming lingo, maps are levels. So all of our levels will go in here.
I said that the plugin was enabled with the template. So when you go to this plus sign up here-- this is the import, or add to project-- you get Datasmith as an option, with file and direct link variants. We're going to do the file import. We are going to find this CurtainWall_02 we just exported a few minutes ago. Open that. We're going to save it into the Datasmith folder. So it will save the assets in the Content Browser here. But it will also add them to whatever level is currently open.
So we hit OK. We want geometry and materials. That's fine. We don't want lights and cameras. In the advanced options, there are some lightmap settings, which we're not going to cover at all. As of 5.0, Unreal introduced Lumen, which is its new global illumination and reflection feature that basically eliminates the need to light bake. You can still light bake, and there are certainly instances where that is useful, but we want to be able to just render this in real-time without having to pre-bake anything. So we're not going to care about that. But this is where you could generate UV lightmap channels for your meshes.
I'm hitting Import. And this should happen almost instantaneously. You'll see that it added a Datasmith actor to our level. And if we open that, all of our meshes are here, which, if I select these, should be 48, matching what we had in Max. So we have 48. For the structure here, I'll show you, there's this Datasmith scene file, which gives it all the information on what objects go where and how materials are applied.
In the geometry here, you will notice there are 16, because we had instanced several of these sections. So we have our stack zero glass, our stack zero metal, and so on. There are only 16 unique meshes referenced here, but they propagate to 48 different actors within the scene. So this is where that instancing comes into play. If you hadn't instanced, then you would get 48 objects-- 48 meshes for the 48 actors-- which is not efficient.
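If you ever need to batch these imports, the same step is scriptable with Unreal's Editor Python. Here's a hedged sketch, assuming the Python Editor Script Plugin is enabled; the file and destination paths are placeholders:

```python
# A hedged Unreal Editor Python sketch of the same Datasmith import step.
# Paths below are placeholders.
import unreal

scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    r"C:\exports\CurtainWall_02.udatasmith")

if scene:
    # Lands the assets under /Game/Datasmith and spawns the scene actor
    # hierarchy into the currently open level, like the manual import.
    scene.import_scene("/Game/Datasmith")
    scene.destroy_scene()  # release the temporary scene handle
```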
Quickly, on materials: in Unreal, you have a two-level kind of material hierarchy. You have material instances, and then you have the master materials. These instances are all of the unique shaders that were set up in Max. With the material masters, what it tries to do is share them where materials have similar properties. For this metal door, the mullions, and the shadow box, the only thing unique about them is the diffuse color.
So what it does is creates one material master for those three types, exposing a parameter to change the diffuse color. And I'll show that here. So there's five materials here, but there's only three material masters. If you right click on any object, you can open what's called the Reference Viewer and see how the downstream-- I think I need to save it for that to show up, I believe.
Wonder if that's bugged. But anyway, this material master of the mullions feeds into the three instances: the metal door, the mullions, and the shadow box. The reason that you have separate ones for frit and glass is because those had different parameters. When I created the temporary material in Max, I had lowered the opacity on the vision glass just to give it some see-through quality. So it identified that as a different parameter, and it had to generate a different master material so it could control opacity.
And then similarly, the frit actually has a pattern applied to it, which I think you can see: yeah, this dot pattern. So that is a different property as well. It's not a diffuse color, it's not an opacity change, it's a bitmap. So the importer had to say, oh, that's a different material type. That's why you get three master materials here that ultimately propagate to five instances, meaning that these three metal ones all share the same master and parameters.
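To audit this consolidation on a bigger scene, a sketch like the following can list each imported material instance alongside the master it derives from. It assumes the import landed under /Game/Datasmith; adjust the path if yours differs:

```python
import unreal

# Sketch: report every material instance under the Datasmith folder
# together with the master material it points at.
asset_paths = unreal.EditorAssetLibrary.list_assets("/Game/Datasmith",
                                                    recursive=True)
for path in asset_paths:
    asset = unreal.EditorAssetLibrary.load_asset(path)
    if isinstance(asset, unreal.MaterialInstanceConstant):
        parent = asset.get_editor_property("parent")
        parent_name = parent.get_name() if parent else "None"
        unreal.log(f"{asset.get_name()} -> master: {parent_name}")
```

On this sample, that would print five instances mapping onto three masters.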
I know we are running long on time, so I will do one more quick thing here, just to show that we can import multiple Datasmith files into the same level. I have another file prepared; it's the core and structure from our actual project. I'll save it into the Datasmith folder and import. So now we have two Datasmith scene actors that we can manage. We can turn off the curtain wall there, and then we have our core and structure.
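And if you had a whole batch of exports, the single-file import sketch from earlier extends naturally to a loop; the export directory here is a hypothetical placeholder:

```python
import glob
import os
import unreal

# Sketch: batch-import every .udatasmith file in an export folder into
# the currently open level, each becoming its own Datasmith scene actor.
export_dir = r"C:\Exports"  # hypothetical export folder

for file_path in glob.glob(os.path.join(export_dir, "*.udatasmith")):
    scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(file_path)
    if scene is None:
        unreal.log_warning(f"Skipping unreadable file: {file_path}")
        continue
    scene.import_scene("/Game/Datasmith")
    scene.destroy_scene()
```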
And all of these went into that template we created, which had some basic environment lighting already set up. Even so, the transferred materials aren't great; the glass is not glass. But just getting some general shading on this is enough to assess the model.
This is what I was saying about bringing in that temporary model and taking a look at it for any issues. It would be a really easy workflow: do a quick Datasmith export, import it here into one of these templates, and just look at it. See if there's flickering, holes, light leaks, black surfaces, faces that are inside out, all those things we talked about earlier.
Going back to our original statement: what is the end goal, and what are the export needs of the client? This is a screenshot of the actual leasing center with that video wall displaying the experience, and this is one of our employees, Tim, so you can see the scale of this thing is massive. It's a 4-by-4 grid of panels rendering at 4K output.
Just want to touch on a few things here. We could talk for days on any one of these topics, so this has been a crash course, but hopefully it's given a good overview. Sequencer and Movie Render Queue are a couple of other things within Unreal Engine that you can use to generate stills and animations. I'm not going to demo those, but they're something you could look into on your own; see the sketch below. That's how you would create camera paths, render out those camera animations, sequence the cuts together, and render a movie or image sequence.
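As a taste of what Movie Render Queue scripting looks like, here's a hedged sketch of queuing one render from Python. The map and sequence paths are hypothetical stand-ins for your own assets, and API details can vary slightly between engine versions:

```python
import unreal

# Sketch: queue a level sequence render through Movie Render Queue.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.map = unreal.SoftObjectPath("/Game/Maps/Tower_Main")          # hypothetical level
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Tour_01")  # hypothetical sequence

# Add a deferred render pass and set the output resolution to 4K.
config = job.get_configuration()
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_resolution = unreal.IntPoint(3840, 2160)

# Render the queue in-editor using the Play-In-Editor executor.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```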
Interactivity: you're in a real-time game engine, so you might as well take advantage of interactable opportunities. And then, the final testing. There's testing along the way, but once you create the executable, put it on the final hardware, and really put it through its paces, you'll probably notice some bugs and issues that you'll need to go back and clean up. That's something we did on site here at the leasing center once it got up on the screens.
And there were things we hadn't anticipated. While we had their hardware, we didn't have the video wall installed yet, so we were just looking at it on a regular monitor. Once we saw it on the big screen, it exposed some problems we hadn't considered. So do that final on-site testing.
The last thing I want to show is the actual experience. I apologize: this is running through a remote connection over a screen recording, so it may be a little less performant, and I probably should close some of these windows. But I at least want to show the final product. There is a film component, but we'll skip that and start the tour here.
With this, we had implemented an auto-tour functionality. These are pre-programmed camera paths built in Sequencer but rendered in real time; they aren't pre-rendered, and I'm not controlling anything now. These are paths the client wanted so that, in the leasing center, they could speak over them. But at any point, you can break out, and now I have control. I'm just on a mouse and keyboard, but using a gamepad is probably preferable for this.
So you can walk in. You'll notice, as we get closer, the doors open. Those are things we had programmed; those are some of those interactables. As you approach other objects, you see this hand. What is that? Oh, interact. Click. Hey, the screen wall animates over: you can close the restaurant off for a private event.
Finding moments like that: they're simple, but they're fun to interact with. And then at any point, you can rejoin the tour at the closest point along the path, and now I'm not controlling it again. You can speed this up; there's a way to go 4x speed if you want, or 1/4 speed, or go in reverse. These are all things we developed for this.
With that, I will exit. Here are some of those links and resources I mentioned earlier. Here's a link to the Datasmith export plugins; there are versions for lots of different software, and the Max plugin will be there. This isn't something I showed here, but SINI software offers a great suite of plugins, Forensic being one of the free ones.
You can pay for more, but Forensic is a good scene-checker utility that will search your scene for problematic elements and objects, which is great for assessing: you run the Forensic utility, it tells you all the bad geometry and materials, and it gives you ways to quickly clean those up. Here are the SoulburnScripts, which include the pivot placer I showed, plus dozens of other scripts. This is that V-Ray material optimizer that you can check out. And this is the style guide we used for this project; there are others, so it's just about being aware of them and finding what works best for you.
The Epic Games dev community is a great resource for learning Unreal. The community is very helpful, and there's a lot of great content out there. This links directly to a Revit-to-Unreal Engine workflow from a couple of years ago, and there are certainly things in it that still apply. So if you want a more in-depth, multiple-hour deep dive, that's something to check out.
And then lastly, the CG Garage Podcast. You may be aware of it. It is put on by Chaos, but they're not just touting their own software; it's more of an interesting industry talk, and they speak with a lot of VFX supervisors from different companies. It's something I like to put on if I'm going through a manual collapsing marathon; that work is pretty mindless, so I'll put on a couple of podcasts and listen in the background.
I'm sure you all have your own multitasking distractions for some of those mundane processes, but that's one that I like. So thank you for attending and watching. I've put my email up if you want to get in touch; feel free to reach out with any questions. Hope to see you at the next one. Take care. Bye.