Description
Key Learnings
- Learn how to optimize a 3ds Max scene for real time and VR
- Learn how to use the interoperability capabilities between 3ds Max and 3ds Max Interactive
- Learn how to enhance visual quality and build interactions into your experiences
- Learn how to deploy your real-time experience for a virtual reality platform
Speakers
- Bruno Landry: Bruno has been in the design visualization industry for more than 15 years, working as a freelance 3D specialist, for creative solution firms, and for large manufacturing companies in Montreal. After initially studying visual arts, he completed a bachelor's degree in industrial design, where he developed a passion for computer graphics, photorealistic 3D rendering, and a sensibility for beautiful things. After years of producing and managing 3D rendering and animation productions for architecture, consumer products, and transportation design, Bruno joined Autodesk to be involved in the development of real-time technology workflows in the design space. He's now a charismatic Product Owner on the 3ds Max team, leading a successful development team down the right path.
- Jose Elizardo: 3ds Max Technical Specialist for the Media and Entertainment division. With over 15 years of industry experience, Jose is mainly focused on evangelizing and promoting 3ds Max to both the entertainment and design industries.
JOSE ELIZARDO: Everybody, and thanks for-- whoa! Crazy loud mic. Thanks for making it out to our class. I think people are still-- we should probably lower that a little bit, huh?
AUDIENCE: OK.
JOSE ELIZARDO: I'm like trying to talk away from it. People are still trickling in. So welcome, everybody.
My name is Jose Elizardo. This is my good friend and colleague Bruno Landry. And we're going to talk to you guys about virtual reality.
So we're going to jump right in. We have a pretty jam-packed agenda. I'm going to introduce Bruno. Bruno's going to introduce me, and we're going to jump right in. Sounds good?
Fun. All right, so Bruno-- Bruno's with Autodesk. He's a product designer on the lab design team.
He focuses more on VR these days. He's been with the company about 2 and 1/2 years, 2 years. Prior to that, he was in the visualization field, and prior to that, he has a background in industrial design.
And I thought it would be kind of cool to go over a little bit of Bruno fun facts. There's a bit of music that you can't really hear. You want to pull up the volume a little bit?
So a little bit of fun facts on Bruno before we get started. Bruno is an avid cyclist. He does illegal cycling activities that he talked to me about earlier today.
I'm not exactly sure what, but he likes to do lots of cycling, 48-hour cycling rides with small periods of rest, and that kind of stuff. He's also a big fan of Bruno Mars. And a fun fact about Bruno is that his last name, Landry, is one of the most popular names in Quebec where we're from back in Canada. And funny thing about Landry is if you add a U, you get laundry. Bruno [INAUDIBLE].
BRUNO LANDRY: --mic? It's-- you kind of set the table, right? Of course. You guys are in for a ride.
Wow. So let me tell you about Jose. So we can learn from his Facebook profile that he started at Autodesk about 10 years ago, right? 10 years, two weeks ago as a QA in the 3ds Max team.
He moved on to become a technical specialist for 3ds Max. And fun fact about Jose-- on a Saturday afternoon, you like to do light baking, which I think is pretty interesting. So this is our agenda for today.
It's going to be quick and simple. We're going to show you how to go from Max to VR in three minutes. After that, I'm going to hand over the clicker and the mic because we're going to have to do mic juggling. It's fun.
Jose will tell you how we can build and create VR experiences. And I'll take the mic and the clicker back, and I will show you how we can create-- in fact, how we can use VR as a design tool for productivity with 3ds Max. So we'll start right off with this. So back in June, we released 3ds Max Interactive.
So just so you know and to avoid any confusion, basically, we took Stingray, the game engine, and we rolled it into 3ds Max, into the 3ds Max family of products, OK? So we might see at some point kind of-- even for us, it's sometimes difficult. So basically, Stingray and Max Interactive, as of today, are the same thing.
But you'll see by the end of this presentation that at some point, Stingray will be intended and directed more for game users, while 3ds Max Interactive is going to be for anything VR and real-time specific for 3ds Max users. So you'll see that we are building and creating new workflows specific to real time, ArchViz, and VR for 3ds Max users. Everybody is OK with that?
All right, so it's behind us. So as you can see, VR started a while back at Autodesk. And so I'm going to start with this video because it's funny.
So this is about 25 years ago. And I don't know if any of you have heard about Project Cyberspace. So this is 25 years ago. And it's funny because this is kind of me with my team but 25 years ago. So I don't know if any of you have tried, like, a bad VR experience.
Like for this one, I guess that after a few minutes, you may need a brown paper bag. It's-- I don't know. It looked like-- yeah, pretty bad.
But where are we now? So I guess we kind of-- like, all those technology problems are kind of behind us now. I mean, look at this guy-- not only is he using a Samsung Gear VR, LDAR, [INAUDIBLE] surround, but he's having not one, but two milkshakes. So I think-- I mean, most of the technology problems are now behind us.
So let's jump into it. I'm going to show you how to go from 3D to VR in three minutes. So get ready because it's easy. I mean, everybody can do VR, and I'm going to show you.
All right, so just a little background story for this. So this is-- in fact, it's my kitchen. So it's my own and for real. So when we purchased the house, we needed to do some remodeling, so we could tear some walls down-- redo the kitchen, the whole thing.
And I mean, I was a Max artist. And I mean, I had to do it in Max, right? So it's kind of a given.
So in that case, the stakeholders for this were me and my wife. So we just wanted to see, like, the layout of the kitchen, the materials. Did we make the right choices? And when we kind of started to do that class with VR, it's kind of a funny story, because we were purchasing new doors for the house.
So I said-- so which should [INAUDIBLE] in fact, we were wondering, should we go with super modern black frames inside and out? And we were like, eh, not sure. And it was like, VR would be the perfect use case to test that.
So that's what we did. So in this scene, I mean, lots of [INAUDIBLE] motion models, V-Ray materials all over the place. And I made that scene two years ago. Nothing was really meant for real time or VR. So how fast can we go from this to the real-time engine without any manual work? And that screen is--
JOSE ELIZARDO: [INAUDIBLE].
BRUNO LANDRY: --not cool. OK, I'm going over here. So on the right, this is a 3ds Max Interactive scene.
It's an empty level. I'm using a VR template. And I'm using level sync to send everything in my Max scene straight into the real-time engine. And in about, I guess, three minutes, I will be able to navigate that scene in VR.
So as you can see, straight out of the import-export process, this is what I have. So the goal is not to have something super fancy. It's just how fast can I go in VR.
So the only thing I actually needed to do is to select those, my floors. And I need to add physics so I will be able to teleport on those surfaces.
So that's the only thing I need to do-- add static physics. It's done. I will hit play-- boom.
And I should be in VR in three minutes. And-- et voila. So the goal of this, it's not to have like a beautiful, interactive experience. It's like, can I use this as a design review for me, for my stakeholder?
In that case, it was me and my wife. And you can just teleport in your scene. Again, you leverage everything you already have done in 3ds Max. We keep the V-Ray materials. Everything is kind of automatically translated.
I did not do any optimization whatsoever. It's straight from Max to VR in three minutes. So I can just leverage what I've already done in Max and just navigate my design and see, does it make sense for me? And funny story-- turns out that those contractors were supposed to install our doors yesterday, and they did not show up.
So I was expecting to have new doors and windows on my way back. But that will have to wait a few weeks. That's awesome. But I still can go in VR and watch those amazing doors. It's awesome.
So if that level of quality is not good enough-- I mean, for that scene, at the same step I was at, I just light baked my scene. It took about 30 minutes. And this is the level of quality I can reach if I just add another half hour of love to that project.
And I mean, it's not me doing something. It's basically just the computer calculating the light map. And so after that, I'm going to hand the mic and the clicker to Jose, and he will show you how to kind of reach that level of quality and also how to create some interaction and create a real accompanying VR experience. And as you can see, we like Shania Twain at home.
JOSE ELIZARDO: So what Bruno mentioned in that particular video is that those camera animations you saw at the end--
BRUNO LANDRY: Oh, it's true.
JOSE ELIZARDO: --that's just a screen capture in Camtasia and just moving your view. That's all that is.
There's no cameras, no animation. It's super simple, and that looked fricking killer.
BRUNO LANDRY: It's quick and dirty.
JOSE ELIZARDO: Quick and dirty and looked fricking awesome. So why is nothing on screen?
BRUNO LANDRY: Hit Play next.
JOSE ELIZARDO: Oh. All right, so my goal in the next 45 to 50 minutes or so is basically to pick up where Bruno left off and to create a more kind of complete VR experience, make it look good, go over all those features and workflows, and create interactivity-- behaviors, events, triggers, that kind of stuff. So I want to take a moment to thank a company out in Montreal that lent us some of the datasets you're going to see, called Alpha Vision.
They're a visualization house. They've been around since '93. They do a lot with AutoCAD files. And the file they gave us was an AutoCAD file that was brought into Max and rendered in V-Ray.
So, big shout out to them-- you probably know them, David. Yeah, like to put you on the spot. So this is the dataset.
Wow, this is dark. I don't know if we can do anything about that. But that's super, super dark. Anyhow--
BRUNO LANDRY: No, it's beautiful.
JOSE ELIZARDO: It's beautiful. So this is the Max-- by the way, I want to ask you guys. Who uses Max in this room? Whoa, resounding.
Who uses Max Interactive, Stingray, whatever you want to call it? A few people. And by the way, I'm going to say Max Interactive.
I'm going to say Stingray. I'm referring to the same thing. My mind still hasn't caught up, but just know that when I make that mistake, it's the same thing.
So this is the scene they gave us. This is a scene that, like I said, was meant to be rendered in V-Ray, which it was. And they actually built this for a home builders kind of event down in Florida that they attended a couple of months ago.
They gave us that, and we built this with it. So this is a full interactive VR experience built with Max Interactive. There's a lot of stuff in here. There's a lot of interactions. There's a lot of triggers.
There's a lot of different kinds of things. Not everything is going to be applicable to everybody in this room, but the idea is for you guys to think outside the box, abstract from what you're seeing, and use these workflows for the types of things you want to create in your experiences. All right, so you see we have these kind of pop up icons that appear when I approach a certain location. The sound is like really low. There's no way we can put that up?
BRUNO LANDRY: [INAUDIBLE]
JOSE ELIZARDO: Can you guys boost up the sound somehow? No? Because I don't hear anything and I should hear some sounds.
So I have these pop ups that appear that tell me that I can interact with stuff. I can laser point on them. I can create unicorn wallpaper.
I have a little wall plate on the-- music plate on the wall. I can turn that on and add some music. And adding music, it's a big deal. It really helps.
It's spatial sound. It's all based on Wwise, which I'll show you guys a little bit later. But it's all 3D sound.
So it reacts to where you are in the room and changes the sound from left to right. We can create these kinds of animations with furniture swapping. We can even have a Fred Flintstone tape-- super out of scale, but whatever. It's OK. We can also pretend that we're in the "Stranger Things" movie. Who's a "Stranger Things" fan?
AUDIENCE: Woo!
JOSE ELIZARDO: All right, come on. All right, so yeah. So like I said, I won't let the whole thing play back.
Material swapping-- we're going to go over material swapping, animations, sound, audio, and how to set up teleporting and how to grab objects. It's basically what I'm going to show you guys.
But before we get to that, we're going to build it up. There's no audio. I don't know why. It's-- anyways, Shania Twain is playing for those who don't recognize that.
Anyways, let's keep going here. So what I did is I separated out my section into three main kind of buckets. The first is content.
So we got to get content. And I have to show you guys how to bring content in, how to deal with content, how to optimize it, and how to bring it into the game engine. And then we're going to make it look good.
Camera ready-- make it look pretty. And then we're going to build out some interactivity. There's three buckets of workflows that I'm going to show you guys.
Cool. So the first thing is how to get your data in, how to massage it, how to work it, and how to get it into your project-- how to get it into the game engine. All right, so the first thing we're going to do is start off with a template.
Like any game engine, Max Interactive is built on templates. This is the project manager. So you have your projects that you're working on, or you have templates.
I'm going to grab an HTC Vive template because I'm outputting to the HTC Vive. That's the device I work with. By the way, everything I'm showing you is applicable to the Oculus. It's pretty much device-agnostic.
It's just really the output that you're doing. Give it a name, give it a path on disk, compile. And this is now the foundation that I can start bringing things into to create my own experience.
This level that you see here-- a level is kind of like a scene file, so to speak-- is the VR learning level. It's got a few pedestals with some touch points and interactions. You can basically learn how these things were built and reverse engineer them onto your own assets if you want that.
I'm going to start off with a new level and bring data in. So this is my Max scene. And the first thing I want to mention is that however you set up your hierarchies in Max, whether you're doing grouping or whether you're doing parent and child relationships, all of that hierarchy that you've done in Max will get maintained when you send your content into Max Interactive. All that will remain following the workflow that Bruno showed a little bit earlier.
So here I have this table, and I have these objects that are parented to it. And I'll show you guys how to set that up. So I have these assets on my other viewport.
I'm going to use the Select and Place tool to basically drag and drop those assets onto my table and create a parent and child relationship. So when I grab the lower book, everything comes with it. If I grab the table, everything on the table comes with it.
And this basically is considered one object when I send it over into Max Interactive, all right? So it's a way to sort of organize and optimize your scene, so to speak. We'll come back to some optimization stuff a little bit later.
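For those who script, the same parent and child setup is just a couple of assignments in MAXScript. A minimal sketch, assuming hypothetical object names "Table", "BookLower", and "BookUpper":

```maxscript
-- Grab the nodes by name (names here are hypothetical).
tableNode = getNodeByName "Table"
bookLower = getNodeByName "BookLower"
bookUpper = getNodeByName "BookUpper"

-- Parent the books to the table. Moving the table now moves everything,
-- and this hierarchy is preserved when the scene is synced to Max Interactive.
bookLower.parent = tableNode
bookUpper.parent = bookLower
```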
So the next thing I'm going to do, once I've kind of massaged my groups and my hierarchies and my relationships, is-- actually, another workflow that I want to show you is how to leverage the Asset Library as a tool to replace objects in your scene. So I have these stand-ins around my island.
And I'm going to open up the Asset Library. The Asset Library, if you're not familiar with it, it's free on the Apps Exchange Store. It's basically a way of centralizing your content, your assets, your libraries.
Basically, you drag a folder over onto this left hand side. The contents of that folder appear in the middle. And when you select an asset, all of its information, metadata, and thumbnails appear on the right hand side.
So from here, you can grab assets and bring them into Max in lots of pretty cool ways. The one that I particularly like is replace right here. Set yourself to that mode.
Whatever is selected in Max will get replaced with whatever I drag and drop from the Asset Library. I got my stand-in stools, dragged and dropped them, and replaced them. So it's a really cool workflow to leverage if, for example, you're working on a scene and a designer hasn't decided on the final look for that stool.
But you still want to do your layout, create your stand-ins, and then, when the designer is ready, drag and drop it in. Cool? So now, we're going to send it into Max Interactive, play. I'm going to go to Level, Send All.
This is what Bruno showed you a little bit earlier. This is basically the sync dialog. This is where you select what you want to sync between your Max project and your Max Interactive level-- your materials, textures, cameras, lights.
And also, I want to make sure that Generate UVs is enabled. This is going to create my secondary UVs for my light baking. I'm lazy. I don't like to unwrap.
So I leverage the unwrapper and hit Send and go. Now this is, of course, hyper-accelerated. This took about an hour or two to come through. But that's an investment.
It's a good investment to make because the opposite or the-- doing this manually yourself, exporting all your assets and rebuilding your level, can take hours, if not days, of work, and even unwrapping. So I leverage all of this sort of automatic process to recreate my level inside of the Max Interactive engine.
I can connect the cameras, Max's to Max Interactive's. So the views are connected. As I navigate in one, I see the same thing in the other. And notice that all of my assets came through as individual units and all of my groups and hierarchies, of course, were also maintained.
Right, so it's not one big blob of geometry. And also, this scene was meant to be rendered in V-Ray. These are all V-Ray materials. Everything came through-- the shading, the textures, everything.
Cool? All right, so the next thing I'm going to show you right here is-- I'm not entirely sure. Let's get there.
It's coming. It's coming. Oh, instancing-- so another thing that the level syncing workflow or feature that we developed supports is instancing. So if I create these kind of randomized instances of the chairs, select them all.
And this time, I'm going to go to level send just selected, not the entire level, not the entire scene, and hit send. That was not accelerated. That's instant because that chair already exists inside of the Max Interactive engine. All I'm sending over is positional and rotational information.
And then Max Interactive is instancing it in its level based on the asset that it has already. You see that if I select any of these chairs located in the project browser, it's all pointing to the same chair. Cool?
So instancing and hierarchies come through-- that's basically the message there. The next workflow here-- how to leverage our online Asset Library. So you've got your content. It's in Max, from Max, and it's inside of Max Interactive.
But you want to embellish it. You want to add stuff. So what we have is an online Asset Library.
So from within any project, it's located right here in the left hand side, your asset-- online assets, different categories in there. You got environments, assets, units, materials, scripts, lots of different things. And we're building this thing out.
Select any asset, hit the download link, and it basically gets copied and downloaded into your own project folder structure. So now at this point here, I'm going to start just drag and dropping objects right in. They snap to surfaces.
The drag and drop from the project browser snaps to surfaces. And very quickly, I'm going to do [NON-ENGLISH SPEECH]. What's that in English?
BRUNO LANDRY: Landscape.
JOSE ELIZARDO: Landscape! Sorry, my French is kicking in there. So we're going to do a little bit of landscaping. Of course, this is a little bit accelerated as well. But the point is that within just a few minutes, we can create these kinds of nice sort of vegetation type exterior environments really, really quickly.
All of these vegetation assets, they're my favorite from the online store. They all have vertex animation on them. So all the leaves are animated automatically.
You can adjust the wind speed if you want afterwards on the asset itself. So as you see here, I have this little shrub here. I'm just adjusting the wind speed, and it wiggles a little bit faster. It wiggles a little bit slower.
BRUNO LANDRY: They are all real-time friendly.
JOSE ELIZARDO: They're all real-time friendly. That's the other point I want to make. All these assets are ready for the game engine, right? They're not just like weird assets that live on TurboSquid or-- I love TurboSquid, but they're not real-time ready, right?
These are all Stingray ready, Max Interactive ready. So we also have vehicles. We have a whole library of vehicles.
And again, I like to show this because it's a car-- you know, lots of cars online. But these are not-- they're not black boxes. If you want to make a Barbie car, you can make a Barbie car.
And of course, everyone needs a trashcan, and everyone needs trash bags. And you can send these over into Max if you wanted to, to modify them and make them a little bit more unique, not so stock. You can do all sorts of things. What I'm also going to do is show you guys a scatter tool to create grass. So what you can do is, from any location in your project browser down here, right click and create a scatter brush.
In this scatter brush, I'm going to drag and drop the assets I want to scatter with-- so two different grass models. Access the Scatter Brush from the left-hand tool palette. Select the brush you just created and start painting. And it's super simple to do.
Anybody can do this. It snaps right to a surface, and you can create really nice looking grass really, really fast. And I'm leveraging content from the Asset Library.
It doesn't have to be that. It can be your own grass models. The point is that you can do this really quickly.
What's really cool too about these models and every vegetation model is that they're hooked up to what is called the global shader, the global wind shader. This guy controls the wind speed for all of your assets if you drag and drop it into your level. Rather than tweaking the wind speed on a per-asset basis, you tweak it on this guy, and everything that is listening to it in the level will update. So it's one way to just have one constant wind speed and have all of your objects behave in the same way.
Drag and drop it in-- now all of a sudden, I have a little bit of wind. And I can increase the turbulence on this guy and increase the speed, increase lots of different parameters on him that you can adjust. This screen is really black. It's not good.
AUDIENCE: As a landscape architect, that's a [INAUDIBLE].
JOSE ELIZARDO: I'm really happy to hear that, as a non-landscape architect.
AUDIENCE: [INAUDIBLE]
JOSE ELIZARDO: I'm happy to hear that. All right, so let's keep going here. So cool grass-- it looks good.
It's animated. It's super easy to use, and it's a global effect. So it affects everything. The next kind of workflow I want to show you, I think-- let me get there. Come on.
AUDIENCE: Just enjoy the grass.
JOSE ELIZARDO: Enjoy the grass. Enjoy the grass-- bathwater-- so who wants to make water shaders, all right? Let's go online and grab one.
It's on the Asset Library. Drag and drop it into your scene, apply it to the surface.
Of course, it's like a really windy bathroom, but whatever, because you guys get the point. What I'm also going to show you before I keep going with this-- pause this real quick-- is, so you know, you bring assets into Max Interactive. You build them out.
They've got animations. They've got textures. They've got material definitions. They've got a bunch of logic built into them.
You don't want to rebuild this from level to level or project to project. We have the ability in Max Interactive to export what we call smart assets. So you can right click on any object in here, export. It's going to export as a zip file that you can then import into any other project. And all of its dependencies come along for the ride.
So I'm going to import a rubber ducky, because what's a bathtub without a rubber ducky? So let me go find that guy. We have a rubber ducky. It's got an animation clip, a controller, a skeleton, materials, everything. Drag and drop it in, snaps to the surface.
And it's all ready to be used. What I can also do is take this guy once I copy it a few times and scale it out. What we can do also is our interop story between Max and Max Interactive is not one way. It's bi-directional.
So I could take content from Max Interactive, push it into Max, modify it, and push those modifications back into Max Interactive. So what we're going to do here is right click on any asset and send it to 3ds Max. It opens up the asset. I can make some modifications to it.
The animation came through and everything. And this is tame. I was going to go much wilder. But he stopped me. He's like, guy, stop, stop.
So you basically hit Update. I don't know if you saw that. It went a little fast.
Hit Update. It sends it back into Max Interactive. You accept the import.
And then it basically overwrites everything in the project. Now don't forget, this guy is playing back in real time. And I just wiped everything from below him, and he just kept going. So it's a super robust workflow. It's a super robust system.
Let's keep going here with some people. Everyone wants people in their projects. I don't necessarily think that having 3D people on a VR experience is all that great. It's really weird.
I've done it. It's really creepy unless it looks super realistic. But if you do want people, this is how you do it. Or at least with Populate, this is how you do it.
So I'm going to grab a Populate seated character. I run the simulation. I have an animation over 300 frames.
I can modify that if I want to. If I don't like her animation, I can re-simulate and cycle through different animations. And I can make her high res by regenerating the high res mesh and the high res textures.
Once I'm done and I want to send this into Max Interactive, I basically hit Bake Selected on the main toolbar. Creates a bone rig.
Select the entire thing. Go to Export, Export Selected. Give it a name, of course.
Make sure that Bake Animation is enabled and the appropriate frame range is defined. And hit go. So now I can bring this guy in-- or this girl in.
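For reference, that export step can also be scripted. A minimal MAXScript sketch, assuming the baked character is selected and using a hypothetical output path; the parameter names are the documented MAXScript FBX plug-in parameters:

```maxscript
-- Configure the FBX exporter to bake the 0-300 frame animation.
FBXExporterSetParam "Animation" true
FBXExporterSetParam "BakeAnimation" true
FBXExporterSetParam "BakeFrameStart" 0
FBXExporterSetParam "BakeFrameEnd" 300

-- Export only the selected objects, without showing the export dialog.
exportFile @"C:\export\seated_character.fbx" #noPrompt selectedOnly:true using:FBXEXP
```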
I have an animation clip. And she's ready to be used. Now there's a lot of different ways of triggering these animations.
I don't want to create an event. I just want her to loop that animation continuously, always, in engine. So what I'm going to do is right click. And I'll show you guys events and triggers a little bit later.
This is just looping the animation always. So I'm going to right click on the skeleton-- actually, place her first-- drag and drop her in, rotate her, put her on the bench.
And I'm going to right click on the skeleton and create an animation controller. In here, I can double click on the controller, delete the empty node that's there, create a brand new clip state, assign that animation clip that came through to the clip state, and basically hit Save. Boom-- and we can even bring in multiple people at the same time, as one single object.
This can be a crowd of people. Drag and drop them in. Cool? People standing next to a pink car.
BRUNO LANDRY: Yeah, but this could be like any--
JOSE ELIZARDO: Oh, by the way, yeah.
BRUNO LANDRY: [INAUDIBLE]
JOSE ELIZARDO: It doesn't have to be Populate. I use Populate because it's in Max, but this workflow is applicable to any mesh with any bone rig you bring from any location, right? It's the same workflow for everything. In fact, it's the workflow for any animation for that matter.
So let's look at how to make things look pretty and beautiful because what's VR, what's an experience, without something that looks good? So I got my project in. The other workflow that I didn't show a little bit earlier with the level syncing is we also support lights.
So this thing's got a whole bunch of artificial lights that I just unhid. I'm basically going to select all of my lights and all of-- you know, it's got a bunch of parameters set up to create the specific falloff effect that I want. Select all my lights, go to Level Send, Selected.
Make sure Lights is enabled. Hit send, and it just sends it right through. And all of the settings that were on that light-- play-- have been propagated in.
In fact, Bruno's going to show you later some stuff even more cool with regards to lights. I'm not going to steal your thunder. But speaking of lights, that's the story right now. Lights come through.
Regular Max lights-- spotlights, directional lights, and omnis-- come through, and all of their parameters come through nicely. And we also have a sunlight that's inside of the template. That's part of any template.
The sunlight exists. It basically behaves like a direct light, so to speak. So I'm going to modify my direct light, orient it a little bit to create a more dramatic effect with the light coming through the windows there.
But you'll notice that I don't have any light sort of shining on my ground here. Like, here, I'm just changing the light color itself. The reason for that is because my glass panels themselves on the windows are casting shadows right now.
So no light is actually coming through. So to modify that or to fix that, I'm going to select my light object. And I'm going to open it up in the Unit Editor.
The Unit Editor is basically a glorified Properties editor. There's lots of things in here. We're going to come back to this quite a bit today.
But basically, in here, I'm going to select my lights and make sure that Cast Shadows is disabled, basically save that. And now I have light coming through and shining on my ground, all right? And I'm just going to really specifically or more precisely rotate it and position it exactly where I want it to be.
And then the next step here is to deal with the environment. I don't like the cloud image that is provided with the template. So I'm going to replace it. I have a panoramic image that I brought in from Photoshop that I'm basically going to just drag and drop and replace my sky dome image with. It's literally just a drag and drop. And the next thing to do is to deal with light baking.
So the effect that I want to create is that I want all of my artificial lighting to be baked-- direct and indirect lighting, OK? So I'm going to go ahead and select all of my artificial lights and I'm going to make sure that the Bake setting is set to direct and indirect. It basically turns them off in the viewport.
But once I initiate my light bake, I'll be baking that lighting information down onto my assets. My sunlight I'm going to handle a little bit differently. Because I have moving objects and because I'm going to animate the day and night, I want the direct lighting to stay dynamic and the indirect lighting to be baked. So I'm going to set that up. And I'm also-- not sure what happened here-- going to the Light Baker panel and start setting up my properties.
So the first setting here is your light map resolution-- so how big your light maps are, how much quality they'll have. The bigger they are, the longer they'll take to compute and the nicer they're going to look, but the more memory they require. So there's this big kind of cost trade-off that you've got to calculate in your minds.
But basically, that's what that does. The light baker is a progressive refinement solution. It's GPU accelerated.
So it basically progresses over time based on the amount of passes that you give it. So the more passes, the better quality you'll have of a light bake. And then your irradiance intensity is basically how much influence from the sky dome or how much the sky dome is lighting your scene. Think of that as sort of image-based lighting.
So hit Light Bake, and boom, it starts to refine. Now this scene is really big. It's really complex, and it's not super optimized.
So it did take a couple of hours to bake. But once it's baked, you have a pretty nice looking result. You can visualize some diagnostic-type views in your viewport.
You can turn on your texel display. This is basically the light map resolution itself. So you can see how dense the light maps are. Or if there's any stretching and skewing, you'd have errors in your secondary UVs.
And then you can also turn on your diffuse lighting, which is basically just the lighting solution itself stripped away from textures and reflections. You can see what it looks like there. And another workflow that I'm going to show you is how to optimize assets for light baking and for dynamic light calculation.
So this is a super dense sofa, super high res, that was not optimized at all. And what I'm going to do is basically create a clone of it and hide the original guy. We're going to call this guy low res. So we have high res, we have low res.
Hide the high res guy. On this, I'm going to add a ProOptimizer modifier. This is going to reduce polygon count while trying to keep the volume of the object, or the integrity of the object.
And notice the polygon count. So we went from about 400-and-something thousand to about 100-and-something thousand, and we see no difference, or practically no difference, in the model itself. Right, so this reduced model and the high res model are both going to be used in a pretty cool way.
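The clone-and-optimize step can be scripted as well. A hedged MAXScript sketch, assuming a hypothetical scene object named "Sofa_HighRes"; note that ProOptimizer generally only recomputes while it is the active modifier in the Modify panel:

```maxscript
-- Clone the dense sofa and keep the original around, hidden.
hiRes = getNodeByName "Sofa_HighRes"  -- hypothetical name
loRes = copy hiRes
loRes.name = "Sofa_LowRes"

-- Add a ProOptimizer modifier and keep roughly a quarter of the vertices.
optMod = ProOptimizer()
addModifier loRes optMod
optMod.KeepUV = true         -- preserve UVs so textures still map correctly
optMod.VertexPercent = 25.0

-- Force the Modify panel context so Calculate actually runs.
select loRes
max modify mode
optMod.Calculate = true

hide hiRes
```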
So I'm going to now select both my assets, and I'm going to level sync them both into my project. And now they appear, and there's no light baking on them. So what I want to do is grab my high res assets.
And I'm going to tell it to not contribute to the light baking. I don't want it to be used to calculate direct lighting onto my floor. But I want it to receive light baking. I want it to receive lighting information and have light baking on itself. Then I'm going to grab my low res asset and tell it, I don't want it to receive any light baking. I don't need to light bake onto it. But I want it to be used to calculate the shadows.
So the next thing we need to do is tell my low res model not to be visible-- open it up in the Unit Editor, turn off visibility. And now it's hidden. It's no longer being calculated at runtime. And when I bake my lighting down on my two assets and my floor, I have a correct setup for the most efficient way of baking your lighting.
And also, my dynamic lighting coming through from my window is going to be using the low res to cast shadows, which is going to just run faster. Basically, that's how that works. And then the next step with this is to bake your lighting down. I don't want to bake my entire level. That would take too long.
So I select the assets that are being affected by what I've just done, the floor and the two sofas, and hit Bake Selected and rerun the solution. Takes a few seconds rather than a few hours. Cool?
So the next thing we're going to do is deal with reflections. I'm going to add a couple of reflection probes in my scene. This is going to help capture surrounding objects and bake those down as reflection maps onto my assets that have reflectivity on them. And we're going to combine this with real-time reflections a little bit later to really help sell it and bring it to life.
So bake your reflections down. Now things start to look a lot better, start to look a lot more realistic. And then the next step here is to do some light balancing.
So if you grab your midday shading environment entity, this is where you're going to affect sort of all your global light settings and all your post screen effects. So here, I added some exposure. But notice that when I go outside,
it's really blown out. So what I could do is add an auto exposure. And that would automatically adjust exposure based on where I'm located in the project. I go outside, it tones down. I go inside, it tones up.
We can also add some screen space AO. This is really nice. We have a viewport pass to really see what you're doing as you're modifying the settings. And then, of course, this gets comped on top of the beauty pass.
So you can turn it on, turn it off, turn it on, turn it off. Screen space reflections really-- for me, they're kind of the cherry on the sundae. They really bring it all to life, really add those contact reflections. So you merge that or so you combine that with baked reflections to really bring the whole thing to life.
And then other post screen effects like bloom and vignetting and lens quality-- now, the thing to mention about all these post screen effects is they will affect your performance. There's no doubt about that. You can-- I don't show it here, but there's a way to track your performance using the artist performance HUD.
It basically overlays a graphic that tells you everything that's being calculated, what is the heaviest, and what is taking the most time to calculate. Post screen effects take a long time to calculate. They're just heavy. So just, as you're adding them, gauge the performance depending on your hardware and that kind of stuff.
BRUNO LANDRY: And also, some of them look quite--
JOSE ELIZARDO: Yeah--
BRUNO LANDRY: --VR.
JOSE ELIZARDO: Yeah. There's other things too that you just don't want to use in VR. Like, you don't want motion blur. I showed you bloom, and you probably don't want bloom. You don't want lens quality because that'll probably make you sick. But if you do want to use them and you want to create a sick-inducing experience, now you know how.
AUDIENCE: How does it work on outside stuff?
JOSE ELIZARDO: Which one?
AUDIENCE: The [INAUDIBLE]
JOSE ELIZARDO: That was all inside, yeah. I've done it on-- we had a model of a sort of store, a restaurant that had a parking spot outside. Sorry?
AUDIENCE: [INAUDIBLE]
JOSE ELIZARDO: I had trees reflecting on-- you know, cars reflecting trees and stuff. It looked good. Looked good. Yeah, I just got to set them up properly.
AUDIENCE: [INAUDIBLE]
JOSE ELIZARDO: Yeah, because what I don't show in there because it gets a little technical is there's bounding boxes to the reflection probes and fall offs and stuff that you can access. You can really precisely place them and have them blend into each other properly and have the right size and right limits and stuff. I don't show all that, but yeah, it's all there. You can very, very precisely-- yep.
AUDIENCE: Can you optimize the SSAO [INAUDIBLE]?
JOSE ELIZARDO: Yeah, yeah, there are some settings in there. Screen Space AO, there's a half res setting. And then Screen Space Reflections, there's passes-- samples and passes. By default, it's set to, I think, four passes and one sample--
BRUNO LANDRY: Yeah.
JOSE ELIZARDO: --which is kind of low. Performs really well, but it's kind of a bit grainy. Depending on your hardware, you can go up.
I've gone up to 16, 16, and I haven't seen any-- you know, smaller scenes there, but yeah. Yeah, we have settings in there to optimize and adjust them, yep. Cool. Interactivity-- this is kind of my last kind of bucket, right?
BRUNO LANDRY: [INAUDIBLE]
JOSE ELIZARDO: No, there's more? All right, keep going? So the first thing we're going to talk about-- we're going to build this up-- is physics.
So we're going to start simple, and then we're going to work our way up to more complex stuff. So the first thing you want to do in a VR experience is teleport. And by the way, our templates, our VR templates, come with a teleporter-- let me just pause that for a second-- come with a teleporter, you know, as part of the VR experience or part of the VR template.
It comes with laser pointing functionality that I'll show you guys a little bit later. All that stuff is pre-baked, quote unquote, into the template. So you don't have to write your own teleporter or script your own kind of basic commands like that.
By the way, the other thing I want to mention is everything I'm doing is non-scripting. I'm not a scriptor. I'm not a programmer.
So it's all out of the box tools. It's all based on Flow, which is our node-based editor. I'll show you later.
It's all artist-friendly tools. I am not a scriptor or a programmer. I use everything that's not programmery.
So if I want to teleport on surfaces, as Bruno showed you earlier, I have to create a physics actor. So I just created a static actor on my pavement. I'm adding a player start. This is so that when I start the VR experience, I'm in this location looking in this direction.
And when I test it out and hit play, I can now teleport wherever I put a static actor. Of course, I would put this on all my floors. I just did it for the pavement to illustrate the workflow.
The next thing is making objects pickupable, quote unquote. I'm not sure what the word for that is. But-- what?
BRUNO LANDRY: It's a good word.
JOSE ELIZARDO: It's a good word. I think it's actually pickupable in the UI. I didn't make that up, I swear.
So basically, the ability to grab things and move them around, right? So that's a kind of next step in VR that you want to do. So it works the same way as with teleporting. You got to add a physics actor.
And in this case, instead, we're going to add a dynamic actor rather than a static actor. So same thing with my pot here-- I'm going to open it up in the unit editor. I'm going to add a physics actor.
I'm going to make sure that its actor template is set to dynamic and not static. And now I'm going to add a couple of static actors to my stove here and to my countertop-- same process as with the floors. Now when I go into VR, the way the template is set up, when there's a dynamic actor on an object, I can use the trigger of the HTC Vive controller to pick it up.
And it highlights yellow, telling me that I can interact with it when I get close to it. That's a behavior that's programmed and inherent to the VR templates that you can, of course, modify if you don't like.
So I can grab it. I can pick it up, I can drop it. And what's the-- it just drops.
BRUNO LANDRY: Yeah.
JOSE ELIZARDO: Just drops. And you can, of course, throw it or something.
AUDIENCE: Were there audio cues when you pick up?
JOSE ELIZARDO: You can do all that stuff, yeah. I don't do it here, but yeah, you can do all that stuff. It's kind of hard to tell. You know what I'm going to do? One second.
Yeah, so what you don't hear, and it's unfortunate, but in the template-- and you see how it's built-- is when you teleport, when you land somewhere, it does a sound. And that sound is all in the template. You can see how it's built.
There's like a little ch sound. You can't really hear it here because the audio is really low. But yeah, all that stuff is doable. Ah, that's so dark. Wow.
So animations-- so let's say you wanted to bring in this animation, for example, from 3ds Max. I'm going to first show you how to set it up. But then again, there's so many Max users in this room, you probably all know this stuff already. Do you?
Anyways, I have these garage doors. I want to animate them on a spline. So I have a shape.
It's kind of hard to see, but it's a yellow spline there. I'm going to path constrain them to the shape. And then they all go down to zero because they're all set to zero percent along path.
So if I go to the motion tab, I can locate my path constraint animation controller and its parameters here. And I have a percent along path value that I can address and I can animate, of course, as well. So I can adjust all my panels to be in the better spot.
But of course, now they're not really following their-- well, they're animated along the path, but they're not following the path. So to do that, what I'm going to do is grab my panels and set-- turn on Follow, set the appropriate axis, and just rotate them back into place, and then do this for all of my panels. And now I have a correct garage door animation that I can bring into Max Interactive and trigger.
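Scripted, one panel's setup looks roughly like this. A minimal MAXScript sketch with hypothetical object names "Panel01" and "DoorPath":

```maxscript
panelNode = getNodeByName "Panel01"  -- one garage door panel (hypothetical name)
doorPath = getNodeByName "DoorPath"  -- the spline shape (hypothetical name)

-- Assign a path constraint to the panel's position and point it at the spline.
pc = Path_Constraint()
panelNode.position.controller = pc
pc.appendTarget doorPath 100  -- constrain to the spline at 100% weight

pc.follow = true  -- orient the panel along the path
pc.axis = 1       -- pick whichever axis matches the panel's orientation (0=X, 1=Y, 2=Z)

-- Animate percent along path: 0% at frame 0, 100% at frame 150.
animate on
(
    at time 0 pc.percent = 0.0
    at time 150 pc.percent = 100.0
)
```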
All right, so these kind of more complex animations, I would not do them in Max Interactive. I would do them in Max. I will show you guys some animation tools in Max Interactive a little bit later.
So now I have basically two clips-- open and close. And I'm going to separate these two out, isolate them into FBX files that I'm going to then read individually back into Max Interactive.
So I'm going to select my door panels and start off with the door open clip and give it an appropriate frame range-- 0 to 150. Hit go. And then I'm going to do the same thing with the second clip, 150 to 300. [INAUDIBLE] water.
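The clip splitting can be scripted with the same FBX parameters as before. A hedged sketch, assuming the door panels are selected; the clip names and output path are hypothetical:

```maxscript
-- Each entry: clip name, start frame, end frame.
clips = #(#("garage_door_open", 0, 150), #("garage_door_close", 150, 300))

for c in clips do
(
    FBXExporterSetParam "Animation" true
    FBXExporterSetParam "BakeAnimation" true
    FBXExporterSetParam "BakeFrameStart" c[2]
    FBXExporterSetParam "BakeFrameEnd" c[3]
    exportFile (@"C:\export\" + c[1] + ".fbx") #noPrompt selectedOnly:true using:FBXEXP
)
```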
And now I can bring those guys into Max Interactive. I'm going to make sure that animation is enabled. And I'm going to make sure that I'm creating an automatic skeleton, right?
So I'm going to set that, and I'm going to give it a generic name, not open. Just garage door-- that's my skeleton name. Hit Import, make sure Clips is enabled.
And everything comes through-- the garage door, the skeleton, and the clip. So I'm going to do the same thing with the door closed except that I'm going to remove everything except for the animation information.
I don't want meshes. I don't want materials. I don't want textures, because I already have all that stuff.
So I'm going to disable all that. I'm going to make sure that I'm assigning that animation to the skeleton that I previously imported, or created on import, right? So assign it to the skeleton. Make sure Clips is enabled, hit Import.
And now I have two clips assigned to the same skeleton and the same mesh. So what can I do with this? So for now, we're going to keep it super simple.
We're going to assign these to keyboard keys as events, and then we're going to replace them with VR triggers on the controller. Cool? So I'm going to go into Flow. Flow is our kind of programming environment inside of Stingray, or inside of Max Interactive, where you basically program logic without scripting.
And so we have a node called Play Animation Clip. As you might have guessed, it can play an animation clip in my project. I'm going to set looping to false. And I'm going to copy it for the second animation of the door closing.
And I just need to hook them up to the actual garage object itself. So I created a level unit node that is associated to the garage mesh. And I'm going to hook them up to Play Animation Clip.
So now I'm going to create a keyboard key event set to key number one. But right now, you know, if I press it, what do I hook it up to? Stop, you know, open, close? How do I cycle through this stuff?
I can create two keyboard keys, and one does one thing, the other does another thing. But I don't want that logic. I want the logic that when I hit one key, it cycles through these animations. Right now, I have two, but it's going to be 10 animations on the same object.
So to do that, I'm going to create something called a Flow subroutine. This basically allows you to cycle through different inputs as you hit the keyboard key.
So as I hit the keyboard key, it sends an event into the flow subroutine node, and I have to just give it limits. We have two events-- Play Close and Play Open. So I'm going to give my lower limit one and my upper limit two, which is our two events.
And I'm basically going to say, hey, whenever I press keyboard key 1, increase this number, and then output that number from the flow subroutine value out. Next thing I'm going to do is add a Compare Numerics node. And I'm going to compare the output from flow subroutine that is coming through from the keyboard entry, the keyboard event, with a specific number. Whoa!
AUDIENCE: I'm sorry.
JOSE ELIZARDO: That kind of went a little bit faster, but I guess it doesn't really matter. I'm kind of running out of time. No, I guess the video was cut. Little mistake there on the video.
Anyhow, basically, now when I press the keyboard key 1, I'm going to trigger the first animation. And then press it again, I trigger the second animation. This flow subroutine mechanism is really powerful because I use this for all of my interactions.
It's basically saying, hey, when I press this, spit out a number. When I press it again, increase that number. When press it again, increase the number, and send that out and do something with that output. It's really, really simple. It's really, really powerful.
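The flow subroutine is really just a wrapping counter. Purely to illustrate the logic (in Max Interactive you build this with Flow nodes, not script), here is the same wrap-and-dispatch pattern sketched in MAXScript:

```maxscript
-- Illustration only: the pattern the flow subroutine implements.
global cycleIndex = 0

fn cycleEvent lowerLimit upperLimit =
(
    -- Each key press bumps the counter and wraps it back to the lower limit.
    cycleIndex += 1
    if cycleIndex > upperLimit do cycleIndex = lowerLimit
    -- The output is then compared against each target value (the Compare Numerics nodes).
    if cycleIndex == 1 do print "play clip: garage_door_open"
    if cycleIndex == 2 do print "play clip: garage_door_close"
)

cycleEvent 1 2  -- first press: open
cycleEvent 1 2  -- second press: close
```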
So the next thing here is swapping materials-- again, same flow subroutine, just different output nodes to make that happen. Also, I have my four materials here that I want to swap on that wall. Same flow subroutine-- this time I have a lower limit of one and an upper limit of four because I have four outputs, my four Compare Numerics. And this time, I'm hooked up to keyboard number two.
And what I'm going to do now is create a node that allows me to swap materials on a mesh. First thing I did was create a node representing the wall. That guy's the wall right there, right?
And we have a node that is called Set Mesh Slot Material. And the way this guy works basically is I just need to tell it what slot on my mesh I'm going to swap the material on. Every mesh has material slots. Think of it like multi-sub-object materials inside of Max.
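To make that analogy concrete on the Max side only: a slot behaves like one sub-material of a Multi/Sub-Object material. A minimal MAXScript sketch, with a hypothetical object named "Wall":

```maxscript
wallNode = getNodeByName "Wall"  -- hypothetical name

-- Give the wall a Multi/Sub-Object material with two sub-materials ("slots").
wallNode.material = Multimaterial numsubs:2
wallNode.material.materialList[1] = StandardMaterial name:"wall_beige" diffuse:(color 222 205 180)
wallNode.material.materialList[2] = StandardMaterial name:"wall_white" diffuse:white

-- "Swapping slot 1" in Max terms is just replacing that sub-material.
wallNode.material.materialList[1] = StandardMaterial name:"unicorns" diffuse:(color 230 120 200)
```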
Same thing here-- this particular mesh has two multi-sub-object materials, or two materials in its multi-sub-- wall beige and wall white. I want the wall beige slot. That's the specific set of faces on my mesh.
And so I'm going to assign that slot to my material. I need to then tell it what material I'm going to be swapping to. I'm going to do this one for every one of these materials. And then I'm going to hook it up to the actual-- copy this a few times, of course, and just hook this all back up the way I did with the garage door that we kind of skipped through a little bit.
But basically, here, once this is all hooked up-- hook up my unit to all this and then go into runtime. And when I hit the 2 key, I'm cycling through different material swatches or different material types. The metallic green was his idea, by the way. Unicorns was my idea.
BRUNO LANDRY: [INAUDIBLE]
JOSE ELIZARDO: I've seen your house. There's lots of green in there. No beige, though. All right, next I'll show you guys how to use the story tool to animate, in this case, day and night. But you can pretty much animate anything inside of Stingray.
And I don't recommend using this tool for any animations that you can do inside of 3ds Max. This is really-- it's a very simplified curve editor or animation editor. And it really becomes useful when you want to animate inherent Stingray or Max Interactive things, like the environment, or lights in Max Interactive, or intensities and stuff like that. That's when it becomes super powerful.
If you're going to animate objects moving-- we'll do that in Max and bring them in. It's much more robust. But nonetheless, if I wanted to create a story of my [INAUDIBLE] animated, or my scene going into nighttime mode, I'm going to select my sunlight. I'm going to go into the story editor. And I'm going to create a brand new story and name it daytime or nighttime, I think I'm naming it.
And in here, I have access to the transformation parameters of the sunlight-- so position xyz, rotation xyz, and scale xyz. And here, I'm basically just going to give it a couple of frames. So select the tracks that I want to animate. I'm going to grab my position and my rotation, give it a key frame on frame one, on frame six give it another key frame.
Then I'm going to select the rotation y and modify its value to be 160. So basically, I'm rotating the light on the y-axis 160 degrees over six frames. Super straightforward, right?
Then you can go ahead and add other properties on this object-- in our case, it's the sunlight-- like, for example, the intensity, and animate the sunlight intensity over time. And you just keep doing this, adding tracks you want to animate from the object that the story is based on. You can bring in new objects into the same story.
For example, now, this story is based specifically on the sunlight. But I can say, hey, you know what? My environment, I want to add parameters from there in the story.
I don't want to create a second story. So I can add by hitting this button right here. It's kind of super small and obscure. That will add whatever object is selected into your story.
So I'm bringing in my environment panel. And boom, now I can right click on this guy and have access to all of my environment settings that I can animate in here-- exposure, screen space data, or whatever you want to animate. Depth of field-- that's kind of cool, right? Depth of field? That's a cool thing to animate.
So I'm going to add my exposure track because that's the next thing I would logically animate if I wanted to create a nighttime effect. Again, create a key frame at frame six-- set it to 0 or 1. And now I have the makings of a nighttime shot.
Then, of course, I would just keep adding to this and adding to this and adding to this until I have the exact effect that I want. In my case here, I think the next thing I'm adjusting is the intensity of the sky dome map. And then in some other stories that I'm about to show you where I went a lot further with this, I animated the self-illumination on materials to make them kind of turn on and turn off, lights turn on and turn off. This is my nighttime shot, my nighttime story that I created. Then I created a second one for daytime that I'm basically going to hook up inside of Flow.
So now what I can do-- this guy creates basically these story entities that are objects in themselves that I can use in other ways and other things inside of Max Interactive. So I'm going to grab-- actually, I'm just going to go right into Flow and create a level story node, point it to the story that I want to trigger or initiate. And same subroutine mechanism, right?
Nothing's changed there. It's the exact same thing. Set my story day, my story night, and hook up-- if I'm less than two, play story day or night. If I'm equal to 2, play the second animation and cycle back, then cycle back and cycle back. That's basically what it does.
So now I can go into runtime, hit my 2 key, initiate my nighttime story, and then hit my 1 key or my 2 key and come right back into daytime. Cool? All right, let's keep going.
I think I'm kind of done with this one. Not sure anymore. All right, audio-- the next thing we're going to look at is how to bring in audio.
We should really push up the volume. No one's going to hear anything. This is the audio track and it's like-- kind of defeats the purpose.
Can you bring that up, Bruno? There you go. Leave that high, real high.
So I'm going to go into Wwise. This is our audio editing tool. It can be overwhelming.
There's a lot of things in here, a lot of buttons. But kind of stay with me. I keep it real simple when it comes to Wwise.
Basically, you got your Audio tab. And in there, you have tracks that I've imported, right? So I just brought these in. I'm just going to play them back a couple of times-- a little Shania Twain action.
It's always good. Hey, we're in Vegas. So if I want to bring in audio, I'm going to right click on my sound bank. I'm going to say Import Audio Files, navigate to an audio file on disk, bring it in, hit Import. Then what I could do is, of course, play it back.
This is the theme song of Zelda. Any Zelda fans in here? We're like major Zelda fans. We're like geeking out in there. All right, man.
So what I could do now is I can double click on any one of these files. I can turn them into 3D audio. So I'm going to go into the Positioning tab. And I'm going to enable 3D audio. And I'm going to set it to a default preset. And this basically defines the range for which it will play.
When I go beyond 100 units in my level, I don't hear any sound anymore, and it fades off into the 100. And I'm going to set this down to something like 15, test it out, and then drag and drop that guy in to my sound bank, generate, and go back into Max Interactive. And now if I locate my audio file in Audio, New Sound Bank, I have all of those tracks that just got generated when I hit the Generate button, all right?
I can preview them right in here. And what I can do is take any one of these guys, drag it into my scene, and position it exactly where I want, on my little speaker. Now sound will be coming from this location and fade off as I move away from it. This green sphere is the distance it can play over. So that's a little bit big.
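That green sphere is just the attenuation radius made visible. The falloff idea is something like this linear gain curve-- a simplified sketch; Wwise's actual attenuation curves are editable and don't have to be linear:

```lua
-- Minimal sketch of distance attenuation: full volume at the source,
-- silent at max_distance (the radius of the green sphere).
local function attenuate(distance, max_distance)
    local gain = 1.0 - distance / max_distance
    if gain < 0 then gain = 0 end
    return gain
end

print(attenuate(0, 15))    --> 1.0 (right at the speaker)
print(attenuate(7.5, 15))  --> 0.5 (halfway out)
print(attenuate(20, 15))   --> 0   (outside the sphere: silence)
```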
Let me go back into Wwise, adjust those parameters, and regenerate-- save, regenerate. And now this [INAUDIBLE] gets smaller. It starts to make a bit more sense.
And what we can do is go right into runtime. As I get close to it, it's kind of hard to tell in this room because unfortunately, the audio is kind of crappy. But it's 3D sound.
So as I turn my head, I hear it from my right hand side. Turn my head, hear it from my left hand side. And move away, it fades off. When I go upstairs, it kind of fades off.
What we can also do-- and this is one of my favorite workflows-- is actually connect Wwise to the runtime while I'm play-testing my level. This is going on right now. I'm going to connect Wwise, and the modifications I make in Wwise here will be reflected in runtime, while I'm in the experience.
So I'm going to connect to it. And the next thing I'm going to do is edit my distance here. So I lower it, and I immediately see that take effect as I'm in runtime.
Get closer to it-- and it should get louder as I get closer, but it didn't. And what I'm going to do here is go upstairs and show that it's gone, all right? We don't hear it anymore.
Then what we can also do is we can add filters to our sound. We can create the effect of, hey, I'm behind the wall. The sound is muffled a little bit. So I'm going to go behind this wall,
I'm going to increase my distance so I actually hear it. Right? OK, and we hear it, right?
So now I'm going to go into my Properties editor here, and I'm going to add a low-pass filter. There's a whole bunch of different filters you can add to create all sorts of really funky sound effects. A low-pass filter, in my mind, basically sounds like you're behind a wall.
The sound sounds like it's coming from behind a wall. I'm going to add that, and that's basically the distance at which it becomes effective. I'll let you guys hear that.
BRUNO LANDRY: Made total sense.
JOSE ELIZARDO: It made total sense. So let's get out from behind the wall. And you can really fine-tune, precisely adjust, when these effects take place on that line you see here, while you're in runtime, which is really, really cool.
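If you're curious what a low-pass filter actually does to the signal, here's the textbook one-pole version in a few lines of Lua-- purely illustrative; Wwise implements its own filters internally:

```lua
-- One-pole low-pass filter: each output sample moves a fraction (alpha)
-- of the way toward the input, which rounds off high frequencies --
-- the "behind a wall" muffling effect.
local function lowpass(samples, alpha)
    local out, prev = {}, 0
    for i, x in ipairs(samples) do
        prev = prev + alpha * (x - prev)
        out[i] = prev
    end
    return out
end

-- A harsh, square-ish signal comes out smoothed:
local filtered = lowpass({1, -1, 1, -1, 1, -1}, 0.3)
for _, v in ipairs(filtered) do io.write(string.format("%.2f ", v)) end
```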
So unfortunately, the audio kind of doesn't work very well in this room. But yeah, that's basically how you set up audio. And then you can create events and triggers with that stuff the exact same way I showed you a little bit earlier. OK, this is my last section, and it's about creating a UI in VR.
Before I jump into this-- because I went down this road and hit a lot of walls building this out-- I think it's important that I take a minute to explain some of the things that I discovered. Building a UI in VR, and how you interact with things, is a whole different thing than building static UIs for any other application, right?
You're in it. It's around you. You've got to think about how users are going to interact in VR and how the UI is going to be presented to them. And nobody understands how to use an HTC Vive controller, for the most part. So keep it freaking simple-- with anything overly complicated, people get lost.
What really helped me was to storyboard the UI that I wanted to create-- what happens at what moment in time? When you approach something, what happens? How do you interact with that thing, and what does it do when you interact with it? Storyboarding that behavior really helped me.
So I'm going to go ahead and just drag and drop an icon. This is one of my UI icons; I created it in Max. It's got a funny little animation on it, basically telling the user, hey, use the trigger to do something.
And the behavior that I want to start with-- we're going to build this out-- is I want it to appear when I approach it and disappear when I walk away from it. So the first thing we need to do is create that animation. I'm going to use a story to animate the scaling of this guy from 0 to 100 over about half a second-- add some frames, just like I did with the day and night thing a little bit earlier. And to trigger that when I enter a specific volume, I'm going to create a volume trigger.
So I drag out my volume trigger. Then I'm basically going to go into Flow, bring in the volume trigger, and say, hey, when I touch the volume trigger, play that story; when I untouch it, reverse-play that story. So it scales up or down based on where I'm located in the project.
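Again, the graph is node-based, but it reads like this little sketch-- on_touch and on_untouch mirror the volume trigger's events, and play_story/reverse_story are hypothetical helpers, not real engine calls:

```lua
-- Sketch of the volume-trigger logic for the UI icon.
local function play_story(name)    print("play "    .. name) end
local function reverse_story(name) print("reverse " .. name) end

local scale_up = "ui_icon_scale_up"  -- the half-second 0-to-100 scale story

local function on_touch()   play_story(scale_up)    end  -- player enters the volume
local function on_untouch() reverse_story(scale_up) end  -- player leaves the volume

on_touch()    -- icon scales up as you approach
on_untouch()  -- icon scales back down as you walk away
```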
So that's going to create the first effect. The second thing I want to do-- because I'm in VR, right, I'm standing in front of this thing-- is I don't want it to stay flat.
When I go behind it, I want to see it still. You got to think about these things. And I didn't think about that at first.
And it kind of hit me-- oh, it's kind of weird that a flat UI stays static. So what we're going to do is go into its unit Flow, right? Flow exists for the level, but you can also create Flow on a unit, on an object.
I'm going to access the head-mounted display's rotation information. So as I rotate my head, that information is being tracked. And I'm going to pipe that information into the unit's own rotation. So they're going to be connected.
So as I move my head, the object will move and always track and follow me. So it's always facing me no matter where I go in the VR experience. So if I go into runtime, it's gone, not there. Approach, it appears.
As I move my head, it's always looking at me. I go below it, on top of it, it's always facing me. And that's super important in VR.
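The graph just pipes the headset's rotation into the unit, but the always-facing effect is easiest to see as look-at math. A minimal sketch, with plain tables standing in for engine vectors:

```lua
-- Billboard math: yaw the UI icon so it faces the headset every frame.
-- Positions are plain {x, y} tables here, purely for illustration.
local function facing_yaw(icon_pos, hmd_pos)
    local dx = hmd_pos.x - icon_pos.x
    local dy = hmd_pos.y - icon_pos.y
    return math.atan(dy, dx)  -- radians (math.atan2 on older Lua versions)
end

local icon = {x = 0, y = 0}
print(math.deg(facing_yaw(icon, {x = 1, y = 0})))  --> 0  (viewer due east)
print(math.deg(facing_yaw(icon, {x = 0, y = 1})))  --> 90 (viewer due north)
```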
Cool? So the next thing I want to do is I want to make it so that when I touch that thing, something happens. In the VR templates, the controllers have a laser pointing functionality that I mentioned earlier.
I'm going to leverage that laser pointer functionality so that when I laser-point at this guy-- it needs a physics actor on it, just like everything else-- I initiate a particular event. I'm going to go into its Flow and create what we call external ins and external outs. Now, this is a bit of an esoteric concept, so I'm going to pause and explain it a little bit.
You've got to think of external ins and external outs as somebody talking and somebody listening. You can create an external out event for anything, which is basically something in the system sending out information. And then an external in event is listening to that information and doing something with it. That's basically what it is.
The laser pointer has a whole bunch of external out events programmed onto it. There's a click-- when I click the button, the HTC Vive trigger, make something happen. When I hover the laser on something, make something happen. These are all external out events that I can listen to from within Flow.
So I'm going to create an external in event where I'm listening for click-- click is the external out event of the laser pointer. And I'm going to pass that along as an external out event, send it into my level, and something else is going to pick it up and listen to it.
So I keep sending-- talking and listening to information. I also have a node here that sets the unit highlight. I can put a kind of glowy highlight around my object and give it a color.
I'm going to set Enable to true. I'm going to access another external out event from the laser pointer, which is hover. These are all commands that are part of the template; you can look at the laser pointer itself and see exactly what those commands are.
And I'm going to say, hey, when I unhover, stop highlighting-- that's basically what happens there. And now what I can do is go back into Flow, select my object, and create a node of it.
Notice that I have a whole bunch of new parameters. I have click, unclick, touched, untouched, hover, unhover. These are all things that I put into unit flow.
So now I can use those outputs as events, basically. So now, instead of using a keyboard key, I'm going to say, click out, be my initiator of the event, rather than keyboard key 1. I mean, the keyboard key is still hooked up-- in this case, you have both. You can use one for testing and one for the actual experience.
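If the talking-and-listening metaphor feels abstract, it's essentially the observer pattern. A tiny Lua event bus makes it concrete-- the bus itself is illustrative, not the engine's implementation; only the event names mirror the laser pointer's outs:

```lua
-- Minimal event bus: external_out "talks," external_in "listens."
local listeners = {}

local function external_in(event, handler)  -- somebody listening
    listeners[event] = listeners[event] or {}
    table.insert(listeners[event], handler)
end

local function external_out(event, ...)     -- somebody talking
    for _, handler in ipairs(listeners[event] or {}) do handler(...) end
end

external_in("hover",   function() print("highlight on")      end)
external_in("unhover", function() print("highlight off")     end)
external_in("click",   function() print("trigger the event") end)

external_out("hover")    -- laser touches the icon
external_out("click")    -- user pulls the HTC Vive trigger
external_out("unhover")  -- laser leaves the icon
```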
Now, when I go into the experience, I hover on it, it highlights yellow. When I let go, I trigger the event. That's hover on, hover off-- unhover, hover. I then click and unclick-- again, all logic built into the templates. And I took that and basically replicated it all over my level, all over my project, for all of my events and triggers. OK, so I got a-- running out of time?
BRUNO LANDRY: That's OK.
JOSE ELIZARDO: We're good?
BRUNO LANDRY: Yeah, no problem.
JOSE ELIZARDO: Oh, wow, cool. I feel good. So the last thing that I do is deploy the whole thing. I mean, we could possibly spend lots of time talking about lots of different ways of deploying stuff. The main thing I want to talk about-- and a lot of you will hit this with Max Interactive-- is that when you package your project up to share it, if you don't change your default level, it's going to load up the wrong level, right? It's programmed to load up the VR learning level.
So to modify that, we're going to look at our project file, project.lua. Open it up, look for that line of code-- line 17 in my case-- and replace the last little bit, vr-learning, with the name of my level. You don't see it here because I cropped the title bar, but it's like Alpha Vision Demo or something. So I'm going to replace that last vr-learning section with my level, so that in runtime, or when I package this up, it loads the right level.
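For reference, the edit looks something like this-- I can't vouch for the exact variable name, since templates differ and this snippet is illustrative, but the fix is just swapping the trailing level name:

```lua
-- project.lua (line 17 in the demo; variable name and paths illustrative).
Project = Project or {}

-- Before: the package boots into the template's VR learning level.
Project.startup_level = "content/levels/vr-learning/vr-learning"

-- After: point it at your own level so the package loads the right one.
Project.startup_level = "content/levels/my_level/my_level"
```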
The next thing we do is go to the Deployer tab, give it a path on disk, give it a name, of course, and then just hit Package. And boom, you get a self-contained zip of your entire experience that loads the right level. I mean, there's probably much more that could be said about deployment.
So everything I showed you is currently in the product. You can access this stuff today and work with it. I'm going to pass the torch to Bruno. He's going to talk to you guys about the future.
Your minds might explode a little bit. I hope you brought diapers. That's a little bit stupid.
But I do have to caution you. I have to put this up. This is our safe harbor slide.
You cannot make any purchasing decisions based on anything you're about to see. We are a publicly traded company. If you do, we could lose our jobs.
We can, and we will-- both of us. So don't. So with that, Bruno?
BRUNO LANDRY: All right. Thanks.
JOSE ELIZARDO: Have fun. Oh, that's right.
BRUNO LANDRY: Mic juggling. Mic gymnastics. All right, so I know I'm kind of the blocker between you and the drinks, so I'm going to try to do this fast. But I think you'll enjoy it.
So basically, we were wondering, could we leverage VR as a design tool? We use it for design reviews, to do an experience-- it's all fun and games. But can we use it as a productivity tool with 3DS Max?
So I mean, since the early days of CGI, we've been producing 3D images and animation using a desktop-- a keyboard, a mouse, and a 2D monitor. Now that we have access to these new pieces of hardware, can we use them to design with 3DS Max? And we believe these are the challenges with 3D vis.
It takes a lot of time, costs a lot of money, and it also requires a lot of expertise with very specific 3D software packages. And this is our vision: looking to the future, we want to empower both of those personas-- the architect or interior designer and the 3D vis specialist.
So we want to empower them with the proper tools so they can achieve what they're supposed to do on that project-- so they could collaborate, both of them, on the same dataset remotely, using AR and VR. So I'm going to show you how we'll get there.
So I mean, the collaboration between personas is not always easy. For part of my career, I was a freelancer, and I had to deal with customers.
And sometimes, you receive change lists like this, which-- this one is not that bad, I think. But it can get worse, like this, or even worse. And I mean, these are real pictures I received, real use cases from my previous career.
And one of my favorites is, Bruno, can you update these images by next Tuesday? And I reply, of course-- can you quickly list what those changes are?
And of course, the answer was, unfortunately, everything has changed. And I received a bunch of PDF and DWG files, and I had to circle in them what had actually changed. It was a nightmare.
I don't know if you guys have been in this situation. I know I have, multiple times. It's an ongoing thing, right? So--
AUDIENCE: [INAUDIBLE]
BRUNO LANDRY: Yeah, probably. And grass?
AUDIENCE: [INAUDIBLE]
BRUNO LANDRY: So can we empower that guy with the proper tool, with a VR headset, so he can place that toilet bowl where he actually wants it, right? So one thing we need to figure out-- when you work with a game engine, there is an editor mode and a runtime mode, OK? So when Jose showed he was interacting with objects, most of the time-- well, actually, all the time-- you were in what we call runtime.
So it's super fun. You can interact with objects using physics. So we can grab chairs, throw them around. It's a lot of fun.
But this is always in what we call runtime. So basically, when you start that VR experience, you had fun. You threw chairs around.
But as soon as you close that, you relaunch it, it's all gone. So it's kind of a dead end, and nothing is saved. And that's the big problem.
So think about when you restart the same level-- when you reuse the same level in "Super Mario" each time you replay it, it's always the same. So that was kind of our first problem that we needed to solve. So that's what we did.
So going back to my place-- in fact, I'm in the same setup as I was before. But this time, we created what we call a VR Viewport. And this time, we'll be in editor mode.
So basically, I just want to say quickly: this is not in the product. And don't try to corner me tonight during a cocktail-- when is this going to be released? I cannot say any version names or dates, so please don't do it, OK? So this is not in the product yet. It's a tech preview. And I mean, this is where we're going.
So as soon as I start my VR editor, it looks very similar to what you were shown in VR earlier. But this time, as I've mentioned, I'm in editor mode. So I will interact with the objects the same way I would in the editor.
So I can interact with any of those objects. This will trigger the gizmo. So I can actually select that chair, duplicate it, rotate it-- I can edit any of those objects. And yeah, there is no physics involved, since I'm in the editor.
So I can also navigate in my scene like this. I can change my height to be like a T-rex, which is super cool. So again, I'm creating instances. And I can navigate in my scene just like you saw earlier, teleporting.
And I mean, when I want to place that chair, it makes total sense to go next to it in VR and place it exactly how I want it in my Max scene. So when it's done, I close the VR session, and everything in my editor is now exactly how I placed it in VR. So that was kind of the first step, all right? Next?
AUDIENCE: [CHEERING]
BRUNO LANDRY: People are excited. It's cool. So the next step, we need to send everything that we do in the real time engine back in Max.
So that's kind of the next challenge. Jose showed that earlier-- already, you can sync the camera view.
So anything you do in Max or in Max Interactive, we can sync those cameras. We kind of used that same logic and applied it to objects. So can we sync those transforms back and forth between the two apps?
So that's what we did. So we have a live transform mode. So if I select those apples, bam, and I move them, they will move automatically in 3DS Max.
And I can create new instances like this. Again, my Max scene gets updated, and I can also go into Max and interact with my scene, and any of those changes will be pushed back to the real-time engine. So it's always synced between those two apps. And again, creating instances in Max creates instances in the real-time engine.
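Under the hood, a live-transform link like this boils down to change detection plus a push. A minimal sketch, assuming nothing about the actual protocol-- send() here just prints where the real link would talk to 3DS Max:

```lua
-- Watch an object's transform and push it only when it changes.
local function send(name, t)  -- stand-in for the Max <-> engine link
    print(string.format("%s -> pos(%.1f, %.1f, %.1f)", name, t.x, t.y, t.z))
end

local last = {}

-- Called every frame (or on every edit) for each synced object.
local function sync(name, t)
    local prev = last[name]
    if not prev or prev.x ~= t.x or prev.y ~= t.y or prev.z ~= t.z then
        send(name, t)  -- push only the changes
        last[name] = {x = t.x, y = t.y, z = t.z}
    end
end

sync("apple_01", {x = 0, y = 0, z = 0})  -- first sighting: pushed
sync("apple_01", {x = 0, y = 0, z = 0})  -- unchanged: nothing sent
sync("apple_01", {x = 2, y = 0, z = 0})  -- moved: pushed again
```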
All right, next. So I mean, playing with physics is super intuitive. But when you need to precisely place something in VR, it's not easy using physics, and it's all like-- it's a bit weird. So we created what we call a smart placement tool.
So you have kind of a menu with a set of tools that you can toggle on and off, and that toggle the gizmo. You can snap the angle of rotation, like this. And I can also snap objects-- although I can move them freely, I can snap them to a surface.
So the teapot is always snapping onto any of those surfaces. And I can also activate what we call the smart grid, so I can easily align any object.
I can also measure the distance between them. I can stack them. We have an auto parenting mode. So I can just take the bottom one, and they will all follow along.
And again, if I take that block, I'll get the measurements, and everything will follow along. If you do something weird, you can always undo.
So you can do multi-selection like this. And you can do new instances. And you can delete stuff as well.
And the next thing-- you can actually grab something and teleport along with it. So you can go where you want and place it exactly where you want. So working with that is good; you can do very precise things.
But when you want to do something more organic, you might want to use those physics. So right now, we are looking at combining those two modes. So if you want to do something more intuitive and, like, throw chairs, you can use the physics. But when you, like, need to rotate the chair five degrees to the left, you can do that very precisely. So we're looking at combining those two modes of interaction.
So combining this with the V-Ray RT workflow, for example, means you could have a V-Ray RT session working in Max. So you could, just like it works right now, have that interactive rendering while you edit your Max scene. And you could go into the interactive engine and move stuff around.
And you will get your rendering updated while you are working in the real time engine. And of course, it also works in VR. So I can just launch my VR session.
And anything I do in the VR session will update my rendering in 3DS Max. This means you get kind of the final result, with the proper lighting, the proper materials and reflections. Because when you are in the real-time engine, it doesn't have to be beautiful-- your goal is probably to output something from 3DS Max, as images or animation.
So right now, we are just enabling the user to edit the Max scene, but using VR. And right now, everything is occurring on the same machine. I mean, V-Ray RT, Max, 3DS Max Interactive, and a VR session on top of everything-- it was working on my laptop, but at some point, it was almost bleeding from the nose.
But you can leverage the V-Ray spawner and render across a network. And we are looking at accessing those sessions remotely-- you'll see later.
We also integrated what we call a picture-in-picture. As soon as you work in a shot-based scene-- which is usually the case when you're doing interior design or ArchViz-- you'll see that. So these four cameras were my cameras in 3DS Max.
They were imported into the interactive engine. So I can activate with my menu the-- we call that the picture-in-picture, our 2D monitor. So here, I'm getting exactly the point of view of that camera.
I can switch to another camera. I can even select this one and move it, and I get immediate feedback on the new shot I'm creating. So when you need to do very specific modifications-- and these are based on the camera views you have set in Max-- this is very useful for tiny micro-changes.
And you always want to look back-- how does it look from my point of view in Max? And in that case, I'm using a scene with four different shots.
So I can move that stool a bit and see how it looks from this angle, from that angle. So it's very useful.
And again, as soon as we add collaboration on top of that-- let's say I'm in New York, connecting to a session of one of my colleagues in Boston. I could have a look at the camera shot he set up in 3DS Max, and I can edit that scene while I'm in VR, without removing the headset. So I can do all of that while I'm in VR. It's kind of an "Inception" thing.
JOSE ELIZARDO: [INAUDIBLE]
BRUNO LANDRY: Yeah, exactly. And we are also looking at combining this workflow with the previous one. So let's say this is like real-time.
But let's say I want to see, while I'm in VR, how that scene will look with the proper lighting, reflections, and materials. So I could ask Max, can you render the scene for me? I might wait like 30 seconds, and then I'll get feedback.
So we'll be in VR, and I can see that shot while I'm still editing my scene. So this is kind of in progress.
Yeah, this is-- we love meetings, right? So, VR collaboration-- in fact, I received this video from my dev this morning, so it's quite new.
And we are reusing the capabilities of Stingray as a multiplayer platform. So right now, we have two designers connecting to the same dataset, and here you're seeing the view of one of them.
So both of them can see each other using those avatars. And they can actually speak to each other, because we also have an audio plug-in. So they can talk to each other.
They can say, OK, go fix that kitchen; I'm going to do the living room. And so my colleagues were kind of messing with my furniture and my place. That's just weird.
So both of them can be editing that scene. And I'm not showing it right now, but of course, I can connect Max on top of that, and they will actually edit my Max scene remotely in VR. So that's pretty awesome. In my home-- that's also just weird.
I don't know if some of you have used Revit Live in the past or are using it right now. Just so you know, Revit Live is kind of a plug-in and a service that plugs into Revit. Basically, it optimizes your Revit scene and outputs a VR- and real-time-ready presentation.
And one of the happy accidents was to take a Revit Live dataset and try to edit it in VR. So sometime soon, any Live project will be able to load directly in Max Interactive. You'll just hit Play, and you'll get Live directly from Max Interactive.
So this is exactly Live running from Max Interactive. And this time, Max is not involved at all-- it's straight from Revit, Revit Live in Max Interactive. And now, what's curious is: can I edit that in VR? So why not?
So I'm running my VR session like this-- I'm in the Revit sample house. And I can move around and, of course, use exactly the same tools I've shown you earlier. So I can change the layout of that living room. And again, Max is not involved; it's Revit Live, straight in VR. So I can edit that scene while I'm in VR, creating a new layout, which makes total sense.
So this means you can have, like, an architect-- hey, have a look at this scene. Can you give me some feedback and edit the layout? So I had the T-rex mode, but we also have a bird's-eye view, like this.
So I can go way up, look at my scene as a mini map. And I could act on the landscape. So I could create new trees.
Right now, I cannot add anything that's not already there while I'm in VR. But of course, that's something we might be looking at. I don't know.
And when I'm done, I just hit Save, since I've created new instances in my project. I hit Play again, and I go back into Live with the updated scene that I just tweaked while I was in VR. So this is pretty cool.
JOSE ELIZARDO: [INAUDIBLE].
BRUNO LANDRY: Yeah. So I will have my new super living room layout, which makes total sense. That's all. Yeah.
Yeah, physical lights-- it's not specifically related to VR, but yeah, I had to, right? So let's talk a little bit about this.
So Stingray 1.9 introduced physical lights in the engine. And as of last Friday or something, one of my devs got this working correctly. So you have photometric lights in Max.
You send them to Max Interactive, and they will be imported as physically based lights. So you can modify them using physically based settings-- temperature, brightness. The only thing that's not supported yet, but maybe will be at some point, is automatically loading the IES profile.
So right now, I just add those manually. But of course, we are looking at this. And of course-- so this is an old demo, but still, it will work in VR.
So hopefully at some point-- right now, we can select the lights and move them; we can select the camera and move it. But at some point, you will be able to edit the properties of those lights.
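To give a feel for what that temperature setting drives, here's the well-known Tanner Helland curve fit from kelvin to an RGB tint-- an approximation, and not necessarily what the engine does internally:

```lua
-- Rough kelvin-to-RGB conversion (Tanner Helland's curve fit).
-- Valid roughly from 1000 K to 40000 K; returns 0..255 channels.
local function kelvin_to_rgb(kelvin)
    local t = kelvin / 100
    local r, g, b
    if t <= 66 then
        r = 255
        g = 99.47 * math.log(t) - 161.12
        b = (t <= 19) and 0 or (138.52 * math.log(t - 10) - 305.04)
    else
        r = 329.70 * (t - 60) ^ -0.1332
        g = 288.12 * (t - 60) ^ -0.0755
        b = 255
    end
    local function clamp(v) return math.max(0, math.min(255, v)) end
    return clamp(r), clamp(g), clamp(b)
end

print(kelvin_to_rgb(2700))  -- warm tungsten: reddish
print(kelvin_to_rgb(6500))  -- daylight: near white
```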
So let's say you have a lighting designer. He could go into this room and decide what kind of light he wants, what kind of IES profile. So he could design his lighting.
So this is a mock up of the interface the designer has been working on. So you could actually do the lighting design of a room in VR. And I have to-- yeah. So I'm-- Jose is so knowledgeable with--
PRESENTER 1: 10 minute.
BRUNO LANDRY: --with MSG, so--
JOSE ELIZARDO: Let me make this super quick. This is another happy accident, based on some of the work that Bruno and his team have been doing. Who's not familiar with Max Creation Graph?
Who is familiar, or was familiar? All right, so MCG, you know, it's a node-based scripting environment. You create tools with it in 3DS Max. You output different types of tools.
So here is one tool that's part of the sample pack that allows you to basically fill volumes with objects. We're going to use it to fill this shelf with those assets that we see on top, into those volumes. And these assets all exist already in the Max Interactive project; they were previously exported.
But basically, this tool allows me to fill up those volumes with these assets and then randomize the seed, randomize the position, scaling, and rotation based on some values that I input. And what I could do with this guy now is basically just position him inside my room. Did I press pause? No.
BRUNO LANDRY: [INAUDIBLE]
JOSE ELIZARDO: Position them inside my room. I did press pause, huh?
BRUNO LANDRY: Yeah.
JOSE ELIZARDO: I thought so. OK, let's go. Put that right in the room.
And then, what these Max Creation Graph tools allow for-- or some of the more recent ones in the sample pack-- is you can actually bake out that object and create instances of the objects it's referencing, just by hitting that Bake button. It's kind of hard to see up on top. So we're going to connect to Max Interactive and turn on Object Transform Tracking.
And now when I hit the Bake button, because those objects exist in Stingray and because this instancing is also being tracked, they automatically appear inside of Max Interactive. So I can take my MCG object, copy it over, change the seed to randomize a new distribution of objects, and hit Bake. So it's super quick to do setups and layouts and that sort of thing.
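Conceptually, what that fill tool computes is a seeded scatter-- same seed, same layout; new seed, new distribution. A minimal sketch, with a made-up box volume:

```lua
-- Seeded random placement inside a box: reproducible for a given seed.
local function scatter(seed, count, box)
    math.randomseed(seed)
    local placements = {}
    for i = 1, count do
        placements[i] = {
            x = box.min.x + math.random() * (box.max.x - box.min.x),
            y = box.min.y + math.random() * (box.max.y - box.min.y),
            z = box.min.z + math.random() * (box.max.z - box.min.z),
            rotation = math.random() * 360,        -- random yaw, in degrees
            scale    = 0.8 + math.random() * 0.4,  -- 0.8x to 1.2x
        }
    end
    return placements
end

local shelf = {min = {x = 0, y = 0, z = 0}, max = {x = 2, y = 0.4, z = 0.3}}
for i, p in ipairs(scatter(42, 5, shelf)) do
    print(string.format("item %d at (%.2f, %.2f, %.2f)", i, p.x, p.y, p.z))
end
```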
BRUNO LANDRY: Yep.
JOSE ELIZARDO: Back to Bruno.
BRUNO LANDRY: We're almost done. But not yet.
So another curious accident-- I was wondering, can we leverage those new workflows to actually animate in VR? So basically, I went back into Max.
I selected a few objects and added a keyframe on top of everything, and I left the Auto Key mode activated. Then I went back into Max Interactive and launched my VR session.
Here, using exactly the same tools and functionality that I've shown you before, I'm creating a new layout of this scene directly in Max Interactive, using that new workflow.
So basically, I'm creating a new layout, and all those objects get a keyframe at their new location. And when I'm done, I just close the VR session. And that animation was actually animated in VR. So it's quite a happy accident, right?
JOSE ELIZARDO: [INAUDIBLE]
BRUNO LANDRY: So this is not, like, "Blade Runner" super-cool VFX. But still, I was able to animate that scene directly in VR. So we could go into VR, create some animation, and render that back in 3DS Max. So I think that's pretty cool.
AUDIENCE: [INAUDIBLE]
BRUNO LANDRY: Yeah. But it's also kind of the beginning of going into VR and acting on objects, and maybe sub-objects. Maybe at some point, you could have your timeline in VR, and-- let's say you work with a character-- add some poses, and you could actually animate that directly in VR at one-to-one scale, which makes, at some point, total sense.
And as Jose showed earlier, there is an animation tool in Stingray-- Max Interactive. It still gets me. And the goal of that one is not to have an animation sent back into 3DS Max. So let's say I want an animation like the one he did, maybe with the chairs.
So I can create an animation inside of Max Interactive, using the same logic of creating auto-keys on my objects. So I'm creating a new layout of those objects. And let's say I want to use that animation later in the VR experience, like Joey-- Jose.
[INAUDIBLE] that part, I don't know why-- like Jose has shown earlier. So this animation could be used in my VR experience, and it was actually animated in VR.
So again, it was kind of a happy accident. But if you spend more than three minutes on this, you could get something quite nice, I guess. And one of the last topics-- I don't know if some of you are doing 360 panoramas for, like, Cardboard or mobile VR-- my favorite.
So if your goal is to produce 360 renderings-- for example, in V-Ray-- to be viewed in a Samsung Gear VR or a Google Cardboard, since in the end that experience will not be in room-scale VR but in mobile VR, and you will see it at kind of human scale, it makes total sense to edit that scene at human scale while you are in VR. So in this scene, I'm not using my small house.
I'm using a 7-million-polygon scene-- it's a condo. And it's smooth as butter. Same thing, no optimization whatsoever-- just a regular send of my whole Max scene into Max Interactive. I just modified the layout of the scene, and I can let V-Ray render after that with my new layout. That's pretty cool.
So just to wrap it up, so having that new workflow means that you can play that video. No. No, that's awesome.
Just [INAUDIBLE]. Oh, [INAUDIBLE] show my wife. Sh, not yet! Just two up.
PRESENTER 1: This one?
BRUNO LANDRY: Yeah.
PRESENTER 1: [INAUDIBLE]
BRUNO LANDRY: No, [INAUDIBLE]
PRESENTER 1: OK.
BRUNO LANDRY: Sorry about that. PowerPoint, right? So this means that with that new workflow, I could, as a Max user, edit my own 3DS Max scene using the VR headset and controllers.
So I could act on this scene and edit the layout of my project. And I could-- [INAUDIBLE] both. Oh wow. It's almost over.
PRESENTER 1: [INAUDIBLE].
BRUNO LANDRY: Or I can work with Jose. So Jose could work in 3DS Max-- maybe edit materials, edit the lighting, edit anything that you can actually do in Max-- while I'm in VR editing the layout of that scene. And on top of that, if we add the collaboration that we showed earlier, I could connect remotely to Jose's computer and edit his Max scene while he's changing the color of the chair to blue. Yeah, why not?
JOSE ELIZARDO: [INAUDIBLE]
BRUNO LANDRY: All right, so hopefully, this will solve some of those challenges with 3D vis. I mean, this will drastically reduce the time to produce 3D images and animation, and also reduce the cost of producing them. And-- I think this is the most important thing-- it's kind of opening 3DS Max to a wide range of new users, really lowering the expertise needed to work with 3DS Max, with those new tools.
And I mean, I tried it at home. So I've done my hours in VR, and I asked my wife, hey, can you just-- let me know if it works correctly. And so here she is, in her PJs, in Vegas.
And it's filmed. So please don't tell my wife, because I might-- I'd have to sleep on Jose's couch for a while. So I went to take care of the kids.
I came back, and it had been like 10 minutes-- she was in VR, just remodeling the house and moving the furniture around. So I think it works. And I mean, she did some SolidWorks and Inventor in the past, but she never touched 3DS Max. So she was actually able to work with 3DS Max without touching 3DS Max. So I think it's pretty awesome.
So again, that's my core message for you today. This is our vision: we want to empower both of those people-- the architect and the interior designer, working and collaborating remotely, using VR, with the 3D vis specialist. Both of them could collaborate remotely on the same dataset. And hopefully, this guy will be happy, and this guy will be happy as well.
So this is my call to action for you guys. This is the slide you want to take a picture of. You can actually access this right now on the 3DS Max beta.
It's available. You can try it. You can break it.
Whether you love it or not, I would like you guys to let us know, because our goal is to make a better product for you. So again, it's not released; it's in beta. You can try it. And don't try to corner me tonight or tomorrow, because we're done.
JOSE ELIZARDO: We're done.
BRUNO LANDRY: Yeah, so this is kind of just a closing word. There was a lot of love and work in that presentation. So although I've spent a lot of time with you and I enjoyed it, I kind of want to go back to my family now.
And I can finally watch "Stranger Things 2" and play "Super Mario DC." So that's going to be awesome. And in fact, you can try this demo right now at the Steelcase booth in the Future of Making Things area.
And you can also go to the Answer Bar-- we have a 3D-to-VR booth-- so you can pick our brains. And that's it. Do you want to add something?