AU Class

Hit the Ground Running with Unity and Forge


Description

In this class, we will look at setting up a project in Unity to use the Forge platform. We will build a template together that can be reused as a base to let anyone jump in and get started with using Forge inside Unity.

Key Learnings

  • Learn about how Unity and Forge can be used together
  • Learn how to set up Forge for use inside Unity
  • Discover Forge and Unity best practices
  • Discover different use cases for Forge in a real-time engine

Speakers

  • Michael Beale
    Michael Beale has been a Senior Developer Consultant for the Autodesk Developer Network and Forge Development Partner Program since July 2017. Before joining the Forge Platform Team, Michael worked on Autodesk Homestyler, Cloud Rendering, the Stereo Panorama Service (pano.autodesk.com), and the A360 Interactive Cloud Renderer, before working on the Forge Viewer and Viewing APIs. Twitter: @micbeale Blog: https://aps.autodesk.com/author/michael-beale
      Transcript

      MICHAEL BEALE: All right. Welcome everybody. Thank you for coming this morning, it's a little bit early. Anyone here do the run, the 5K run this morning? Just out of curiosity.

      MIKE GEIG: What?

      MICHAEL BEALE: I know [? Huma ?] did. So that's cheating. That was totally for you. We're here today and we're presenting "Hit the Ground Running with Unity and Forge." My name's Michael Beale. I am with the Forge Developer Evangelist Team. I've been working with what is now Forge for quite a few years, and I joined the Forge team about 18 months ago. One of my jobs is to bring developers onto the Forge platform and get to know the technologies. We do things like sample code, proofs of concept, and sometimes presentations. Here with me to present some of the work we've been doing with Unity is Michael Geig. Mike.

      MIKE GEIG: Howdy. So my name's Mike Geig, but that's OK.

      MICHAEL BEALE: I'm sorry, Geig.

      MIKE GEIG: It's all right. My name's Mike Geig, I'm an evangelist at Unity Technologies, and I've worked there for five and a half years building learning tutorials, demos-- stuff like what we're going to show you today. It was actually an interesting opportunity to work with Michael on building this. We met a couple weeks ago, and he's like, I'm doing a presentation, and I said, I'm making a demo. So our powers combined, and here we are. And so hopefully we can show some cool stuff bringing data in, and maybe blow a few things up.

      MICHAEL BEALE: Yeah, I like the sound of that. So some of the things we're going to learn today: we're going to talk a little bit about AR/VR in the industry and try to get you guys interested in this area. And then we'll talk a little bit about the Forge AR/VR toolkit. We'll then go into how you can use this toolkit with Unity, and then we'll explore some of the Unity use cases.

      So, AR/VR in the industry. Last year we gave a presentation with Connect Tech-- a company that's doing structural engineering-- and we were demonstrating a way to use Apple's ARKit on the iPad to augment BIM 360 issues on a structure. In other words, you would take the actual model from Revit, and the BIM 360 issues, and you'd combine the two of them in this iPad app, essentially. One of the things that comes into play here is that Apple's ARKit has position tracking, which was new at the time.

      So we put it all together, and the way we did that was with Unity at the time. Some of the things that are interesting here: it's not just about visualizing that structure over the top of something else, it's also about contributing information back from the real world-- getting these XYZ positions-- so, like the edge of a table-- you capture that XYZ position and you send it back into your database. In this case, into BIM 360 issues. So that makes it a little bit easier than you pulling up an iPad and tumbling through a 3D model. And the other thing I should say is it's using functions like Siri a little bit more than just tapping on a keyboard-- being able to be a little more hands-free, to be able to use all the power of the technologies to come up with a better way to do things, better tools.

      Another example in the industry is HoloLive-- if you were down on the expo floor yesterday, they were also presenting their tech there. Essentially, if you've got a building like this, and you wanted to visualize, I wonder what that piping would look like on site-- how would it overlay, how would the new retrofit look in this room? HoloLive has some technology that takes Navisworks clash data and Navisworks data and overlays it on the physical world. So it's a way of having X-ray vision, if you like.

      And then another example is indoor guidance. Let's say you're on a factory floor and you want to direct somebody-- show them how to find a pathway through a large factory floor. An example is Insider Navigation-- they have a nice way of keeping track of the floor plan over very large distances. They use a QR code technique, and they place these markers around the factory floor at VW. And they can do things like overlaying a floor plan, or a path, as well as overlaying 3D data such as the robot arms, et cetera.

      And then the last example I'm going to talk about is collaboration-- VR collaboration. Another expo floor demo was NVIDIA's Holodeck, and they were demonstrating how VR can be used for collaboration. VR is not just a single-person experience. You can use VR in a collaborative way, like teleconferencing. I have experts in one part of one country, and they're trying to fix, or maybe review, something in another country. In this example here, we've got a car and a few people. Maybe the design team is located in, say, Munich, and the repair team, or the business analysts, are in another country. Or maybe the designers are just simply looking at the interior or exterior of a car and trying different car options.

      And then, pivoting a little bit from that VR experience, there's the augmented reality experience, or AR experience. Again, similar to what you saw with overlaying images-- it's kind of like Star Wars, where you see that hologram in front of you. This is Microsoft's Remote Expert. One of the things I really liked about this video was that it demonstrates how I can pull other people into a collaboration and solve a problem.

      And what's a little bit unique about this is that it provides sort of a spatial mock-up. I can do things like, hey, I can see there's a problem right here, in a spatial way. Complex wiring diagrams-- companies like Boeing and General Electric have shown that these augmented reality workflows can really speed up people's processes. Sort of new tools. So hopefully these are some of the experiences in the industry that inspire you guys to come up with something new.

      So then that begs the next question. There are push-button solutions out there, but there are going to be times when you need to build something that's very specific to your industry, your business. So the question then for you is, how do you build this very customized AR/VR experience? Building things like 3D widgets-- it's going to be different for different use cases.

      Maybe you're going to have a button that you put your hand through, because there's no feedback. Or maybe you have to loop around something-- something that's a little more complex than what you're familiar with in a 2D web page, or a Windows graphical user interface. And the second thing that's really important is having a community-- a developer community-- and that's kind of what the evangelists, Unity, and the Forge team are all about. We're here to help you guys when you get stuck trying to build these new and complex workflows. And that's why we're here-- Unity and Autodesk.

      But one of the pain points with game engines, and real-time interactivity like this, is performance, look and feel-- essentially what we call data preparation. And it's been shown-- we've got some statistics around this-- that it's been one of the pain points the community has been telling us about. So consider those traditional workflows: maybe you'd be working in Revit, or maybe in Inventor, and you'd create a robot.

      You'd probably convert it, or export it, into something like FBX. And then you'd go and import it into, say, 3D Studio Max. Then you'd suddenly find that all the animation, or all of the materials, had broken. And then you'd have to recreate those skeleton structures for the animation. You'd have to remap all of those materials before you could re-export that into something that's useful for something like the Unity engine. And then you could finally bring that into an AR/VR experience. So that's a lot of work.

      So this is where the AR/VR toolkit comes in. What is the AR/VR toolkit? Essentially, it's a way of doing that data preparation automatically. Now, it's not perfect, but it's automated enough to be useful. So the idea is, as your project-- a large project, maybe over a couple of years-- as that building is changing, the Forge toolkit will continuously bring in new data-- the 3D geometry, the materials, et cetera-- without that manual data preparation every time.

      So instead of having a snapshot of that FBX file that gets old after the first six months, where you need to re-import an FBX and data-prep it, this is a way of automating that process. So the new workflow is a lot simpler. You're literally going to be authoring in Revit, or Inventor. And as you save those files, that data will be translated with the Forge services, and then you can automatically pull that into Unity. And then your AR/VR experience is automatically updated.

      So this is all part of the data-at-the-center concept. That same data that you were using for issue tracking, or connecting with cost analysis, is the same thing that you can visualize in Unity. Whether it's properties or material information, it's the data at the center that we're trying to focus on. And another part of the Forge platform is that it can handle many different file formats. The Forge platform handles 60-plus file formats, and we'll be demonstrating a couple of those. Things like Revit-- and I've been mentioning Inventor-- we also do SolidWorks, CATIA, PRO-E, and a lot of others. And we use the Model Derivative service, if you guys are familiar with it.
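As a rough illustration of what happens on the Forge side before anything reaches Unity: an uploaded design is handed to the Model Derivative service as a translation job against its URN. This Python sketch shows the shape of that request as described in the Model Derivative documentation; the bucket/object URN and `token` are hypothetical placeholders, and the actual network call is left commented out.

```python
import base64

FORGE_BASE = "https://developer.api.autodesk.com"

def encode_urn(object_urn: str) -> str:
    # Model Derivative expects the object URN base64-encoded, without padding.
    return base64.urlsafe_b64encode(object_urn.encode()).decode().rstrip("=")

def translation_job(object_urn: str) -> dict:
    # Body for POST /modelderivative/v2/designdata/job: translate the
    # uploaded design to SVF, the format the viewer and the toolkit consume.
    return {
        "input": {"urn": encode_urn(object_urn)},
        "output": {"formats": [{"type": "svf", "views": ["2d", "3d"]}]},
    }

# Submitting the job needs an OAuth token (hypothetical `token` below):
# import requests
# requests.post(FORGE_BASE + "/modelderivative/v2/designdata/job",
#               headers={"Authorization": "Bearer " + token},
#               json=translation_job("urn:adsk.objects:os.object:bucket/model.rvt"))
```

Once the job finishes, the same base64 URN is what you paste into the toolkit so it can stream the translated geometry, materials, and properties.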

      Actually, a show of hands very quickly: who here has worked with the Forge platform? Excellent. And my next question: who here has been working with Unity? And then I guess one last question: anyone here working with 3D Studio Max? Awesome. OK.

      So what I'm going to show you now is sort of the getting started-- where you can get started with this package. There's a website we came up with. It's a side website to the main Forge website, and we call it forgetoolkit.com. This is where you can download the Unity package. You can also find API documentation, and it's also got some sample code. And we have samples for getting that data into Unity, and then deploying it to different devices-- HoloLens, for example, and Oculus.

      So I'm just going to quickly show you that web page. Looks a little bit like this-- any second. You can find tutorial information; this is the download section where you can download the package. And there's a showcase where you can find examples, if you want to see what's been going on with the toolkit. And then API references, and an introduction.

      So what I'm going to show you guys today-- one of these is the basic Hello World tutorial. I'm going to show you the first part of it, and then I'm going to hand it over to Mike, and we're going to see what we can do with Unity. So let's get started on what this Hello World example looks like. Just to make life a little simpler, we came up with this sample gallery site. This is going to save you time getting up to speed with the Forge side of things-- being able to, say, create a Forge app, and then creating a mini server, and then uploading files and converting. We've taken a little of that pain out so you can focus a little more on the Unity side of things.

      MIKE GEIG: All of these links will be posted on the last slide too, right? So you can take one succinct picture of all the links that we have.

      MICHAEL BEALE: Yeah, you can find these links-- you can take photos now, but we'll have a link at the very end with the GitHub repo as well as these web page links. So let's take a little look at that web page. Here's my-- I'm going to expand that a little, that's a little better. Let's make this a bit bigger. So I've got a couple of samples here. Let's take a look at this first one. This is a basic Revit scene. Just going to refresh this.

      MIKE GEIG: We are on conference Wi-Fi, so please bear with us on that.

      MICHAEL BEALE: So I've got a little series of thumbnails you can try out here. And these samples focus on a couple of different things. For the Revit scene, obviously, we're targeting architecture. But this one is also taking note of textures. So when we sample this in the Unity environment, we're going to make sure you notice things like, oh, textures are coming across.

      Another thing you'll also notice is properties. So I'm just going to click on the fridge here, just as a quick example. But then the other thing that you want to take a look at is the PBR materials-- PBR, physically based rendering materials. We pull over some of these materials-- we're still working on the mapping to make sure we get a one-to-one mapping between what we have in SVF and what we have in Unity's standard material. I'm not having much luck here. Let's try this one more time. There we go. This is definitely conference Wi-Fi.

      So the thing I want to point out here is, again, the properties. I'm just going to click on this robot head, and I just want you to take note of the properties that are coming through. We've got appearance, we've got volume, area, density. You can typically use these things in, say, the Forge services to create some sort of cost analysis report. We'll demonstrate that same data inside Unity, and it's going to be a way for you to build a similar experience-- a cost analysis report, if you wanted to-- inside Unity. Maybe you don't want to do something like that. It doesn't sound very exciting.
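To make the cost-analysis idea concrete, here is a minimal Python sketch of the kind of roll-up you could build from volume and density properties like the ones shown on the robot head. The property keys (`name`, `volume_m3`, `density_kg_m3`) and the unit cost are illustrative assumptions, not fields guaranteed by Forge.

```python
def cost_report(parts, unit_cost_per_kg):
    # Toy cost roll-up from Forge-style part properties: derive mass from
    # volume x density, then price it at a flat rate per kilogram.
    rows = []
    for p in parts:
        mass = p["volume_m3"] * p["density_kg_m3"]
        rows.append({"name": p["name"], "mass_kg": mass,
                     "cost": mass * unit_cost_per_kg})
    total = sum(r["cost"] for r in rows)
    return rows, total

rows, total = cost_report(
    [{"name": "robot head", "volume_m3": 0.002, "density_kg_m3": 7800.0}],
    unit_cost_per_kg=5.0)
# 0.002 m^3 of steel-density material -> about 15.6 kg, about 78.0 in cost
```

The same loop would work equally well on the server side via the Forge property endpoints or inside Unity once the metadata has been streamed in.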

      So to wrap things up-- oh, I forgot to mention one more thing. The important thing on this web page, actually, is that you can take these URNs-- the token and the scene-- which have been pre-created. You'll copy those into Unity, and that URN is what will track your design. As your design changes over time, your model will also change and update inside Unity. So, to demonstrate that, and a couple of other really cool workflows-- property panels. We're going to show some lighting stuff, maybe some explosions.

      MIKE GEIG: Maybe some explosion.

      MICHAEL BEALE: And obviously some VR. I've got Michael-- Mike-- to represent.

      MIKE GEIG: All right, let's switch over here. Good morning, everyone. Thank you all for waking up to be here. I know it's always a struggle. So here I have a Unity project, and I've got a few things in this Unity project, but specifically, right now, I have the Forge toolkit here in this folder named Forge. And I want to demonstrate just the base Forge functionality before digging any deeper into other use cases and stuff like that. So once I import the Forge toolkit from forgetoolkit.com, I'm going to have these sample scenes. And these sample scenes represent a few different ways you may want to use Forge within Unity.

      So I could have some data set loaded from startup using a URN, an access token, and a scene ID. I could do multiple data sets at once. But the one I'm going to use here today is this "load with two legged," which is load with two-legged authentication. And what that's going to enable me to do is basically treat this like a client access application, where I want to provide a client some ID, some secret password, and then an array of different data sets that they can pull in real time. And so if I look here, I can see I have this Forge authentication two-legged here, and this is where I'm going to store the OAuth keys-- my client ID and secret-- that I got from the Forge website.

      Now, to tell it what I actually want to load, I have this loader here, and I'm going to pass it the URN that Michael was showing there, from the Forge website that we spun up. I won't need an access token, because this particular setup is going to get an access token at runtime using that client ID and secret. I could use multiple of these loaders at the same time to load in multiple data sets simultaneously, but I'm just going to do the one for now.
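Under the hood, the two-legged (client-credentials) flow the loader uses amounts to one authenticated POST at startup. This Python sketch shows the shape of that request, assuming the v1 authentication endpoint that was current around the time of this talk; `MY_ID` and `MY_SECRET` are placeholders for the keys from the Forge website.

```python
def two_legged_auth_request(client_id: str, client_secret: str,
                            scope: str = "data:read viewables:read"):
    # Build the (url, form) pair for the two-legged client-credentials flow:
    # POST the app's ID and secret, get back a short-lived access token.
    url = "https://developer.api.autodesk.com/authentication/v1/authenticate"
    form = {
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "scope": scope,
    }
    return url, form

# At runtime you'd POST the form and read the token out of the JSON
# response, e.g. (hypothetical):
# import requests
# url, form = two_legged_auth_request("MY_ID", "MY_SECRET")
# token = requests.post(url, data=form).json()["access_token"]
```

This is also why no access token needs to be stored in the scene: only the ID and secret are configured, and the token is fetched fresh when the scene runs.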

      And I'm going to hit play, and what it's going to do is pull down this data in real time from the Forge site. So right now what it's doing is downloading. It's converting all the NURBS data into meshes. It's bringing in all those materials, converting them into Unity materials-- any textures that exist-- and all the properties are being streamed and parsed and then applied to all these objects. And so that's what we're seeing here.

      Now granted, again, conference Wi-Fi, so we'll give this a second here. It is going to be slower. This is being pulled directly from the internet, which is always super fun and risky to do live. But don't worry, my other data sets I'll have pulled down offline, so we won't have to go through this. We have two ways, obviously, we can work with this.

      Right now, we're streaming live. And this is great if a client wants to see every day: boom, what's new; boom, what's new. But if I know maybe my data set will change once a week or once a month, I can also pull data down for offline use. So it pulls down, it's stored locally, and that way I'm not going to have to go through this process each and every time I want to do something.

      Now, that looks a bit small, because it's a toy-- it's rendered to scale-- so I'm just going to cheat and make it a little bigger here. I'm just going to hop over into my scene view, where I can see things a little more easily. I'm going to grab this object here, and I'm going to scale it up a little.

      There we go. And now we have this unnamed robot that no one recognizes for the sake of lawsuit copyright protection. And so I'm going to go ahead and click on this character's head, and we're going to see again all this information that came from the Forge side, all this metadata that I'm able to parse and build some logic based on and whatever.

      And we're going to see that here in a little bit. And again, all of these nodes have this information. So I can click on these different meshes and figure out what's going on-- different body properties, and all that cool stuff. And this is, in a nutshell, the most basic use case of the Forge toolkit. We just bring data in-- we see it, and then we can look at it, and that's good. But we could do more with that.

      The idea with Unity is we want to take one data set and build out a whole bunch of different use cases-- a whole bunch of different things. And while this is called the Forge AR/VR toolkit, that's a little misleading, because we don't have to do AR/VR. We can do kind of anything we want. That's sort of the power of a real-time engine.

      And so I'm going to leave this scene here. Don't worry, I am going to show you some VR as well. But I'm going to leave this scene here, because I've set up a few other use case scenes. So one scene I have here I call the scene loader-- quite a classic.

      The idea behind this is maybe I want to build an application for a client that runs on, say an iPad, where they can load up the application, they can see a listing of all this data I've made available to them. And they can say, oh show me that one, and it will just load that one in real time. And they can see it and they can go back and say, well, now show me this one, or whatever.

      And so this is just a simple UI I've set up here. Let's say I want to see this coffeemaker. What's new with the coffeemaker? So I click on that, and it loads the coffeemaker, and there we see it. And again, all these materials-- all the color properties, the shininess-- all this stuff, that's not a thing I set up in Unity; that was streamed directly from Forge.

      The Forge toolkit did a conversion of those materials, turning them into Unity materials to work with our physically based rendering. And so if I say, all right well this metal knob right here, what is that metal knob? I can click on it, and I can see, in the actual application itself, a data display. This is using the built in Forge inspector, part of the Forge toolkit.

      If I look at-- actually, my model right here-- I can see that it has this Forge inspector component on it, just by default, just automatically added. And what that'll do is, when I interact with anything that's been loaded in, it's going to say, oh, you're interacting with this thing, you probably want to see some properties-- let's generate a property panel and pop it up with all this metadata in it. In this particular example, we're cutting it at 11 objects, or 11 pieces of data, but we could filter it, we could sort it, we could do all sorts of stuff with that.

      So this is one scene I also have. Let's see a football helmet, because why not. Let's see a black car, let's see some of this other apartment information, or whatever. And again, this is all data pulled directly from that site that Michael was just showing us.

      But we can do more with this too. This again is just a very simple way of interacting. This is a simple application idea. But we often have non-simple needs. And so let's look at some other ways we can use this. And so here I have just this black car scene as we were checking out earlier. Again some unnamed child's building block toys.

      And what we want to do is maybe expand this-- look at how it's constructed, see what all the various parts are. Basically what we want is an explode view. I promised explosions; we'll have some explosions.

      And so built into the Forge toolkit is this explode controller. Again, we can just drop this component onto a piece, and that's going to allow us to interact with it. Now, the explode controller itself won't do anything. It just enables the capability of explode views, and I have a lot of ways now that I can interact with this.

      Maybe in this particular instance, what I want to do is have a slider, so that I can just slide my finger across the screen and see this explode view be applied. In order to do that, I can very simply create a slider in Unity here. So I just go to Create, UI, and Slider, and I'm just going to go ahead and take that slider there and maybe make it a little bit bigger in my UI, just so there's something to actually grab onto.

      So there we go, giant slider. And I'm going to tell this slider: all right, when someone changes the value of the slider, I'm going to add an event. And again, no programming required here-- I'm just going to drag this black car on, to say, OK, you're going to manipulate the black car. You're going to go to this explode controller, and you're going to use this dynamic explode range. Now, after doing all that-- if I just hit play to actually simulate this-- I can now just drag that slider, and I can see an explode view that I can control here. So that's one simple way of handling that.
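The math behind a slider-driven explode view like this can be sketched in a few lines. This is a guess at the general technique (offset each part away from the assembly centroid by the slider value), not the toolkit's actual implementation:

```python
def explode_offsets(centers, factor):
    # Slide each part away from the assembly centroid. factor=0 leaves the
    # model assembled; factor=1 doubles every part's distance from the
    # centroid. `centers` is a list of (x, y, z) part centers.
    n = len(centers)
    centroid = tuple(sum(c[i] for c in centers) / n for i in range(3))
    return [tuple(c[i] + (c[i] - centroid[i]) * factor for i in range(3))
            for c in centers]

# Two parts on the x-axis, fully exploded:
print(explode_offsets([(0, 0, 0), (2, 0, 0)], factor=1.0))
# -> [(-1.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
```

Wiring this to a Unity slider is exactly the no-code step described above: the slider's value-changed event feeds the factor into the explode controller each frame.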

      MICHAEL BEALE: That's not a bad explode, but I don't know. It feels like it's not very cool. It's OK though.

      MIKE GEIG: Did that feel prepared?

      [LAUGHTER]

      I think that was organic.

      MICHAEL BEALE: A little bit, yeah.

      MIKE GEIG: I like that!

      MICHAEL BEALE: Yeah we need to work on that.

      MIKE GEIG: Yeah, so correct-- these aren't really the types of explosions I care about. One of the things about a real-time interactive engine-- a game engine like Unity-- is we can do more. And to be completely honest, what we've discovered, or what I've discovered-- I'm a game programmer, I'm a gamer, I come from a gaming background-- is that a lot of the difficulties that you have in your industries are trivial to us.

      And a lot of difficulties we have are trivial to you. And so the Forge toolkit is really about spanning that gap. Adding real-time interactivity, adding just anything we want to do-- powerful physics interactions, and rendering, and all that-- those are day-to-day for us. But then the volume of data that you deal with, and the level of accuracy that is needed-- that's something for us that's, oh wow, that's new, OK, let's figure out how we do this. And the Forge toolkit allows us to bring these dense data sets in so that we can have this power of doing a lot of cool things.

      So for instance, if I go to a scripts folder, there's a quick little script I wrote-- you can see it's just like five lines of code-- and it's called the way-more-satisfying-exploder. I'm just going to drop this on here, and I'll hit play again. And what the way-more-satisfying-exploder does is allow me to blow things up. All right, so let's say I'm testing the structural integrity of a black car, and now I can check this off as, hey, this is actually work that I'm doing here. And so I can still try to slide it-- I don't even know what that would do. OK. So maybe we can't put it back together again. OK. So we have that.
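The five-line script itself isn't shown on screen, so here is a speculative Python sketch of the physics it alludes to: a radial impulse with linear falloff, in the spirit of Unity's `Rigidbody.AddExplosionForce`. This is an illustrative stand-in, not the actual exploder.

```python
import math

def explosion_impulse(part_pos, blast_pos, force, radius):
    # Push a part directly away from the blast point. The impulse falls off
    # linearly with distance and is zero outside `radius`.
    d = tuple(p - b for p, b in zip(part_pos, blast_pos))
    dist = math.sqrt(sum(x * x for x in d))
    if dist >= radius:
        return (0.0, 0.0, 0.0)      # out of range: untouched
    if dist == 0.0:
        return (0.0, force, 0.0)    # at the epicenter: push straight up
    scale = force * (1.0 - dist / radius) / dist
    return tuple(x * scale for x in d)

print(explosion_impulse((2, 0, 0), (0, 0, 0), force=10.0, radius=4.0))
# -> (5.0, 0.0, 0.0)
```

In Unity this would be one call per rigidbody; the point is that this kind of physics is a one-liner in a game engine once the Forge geometry is in.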

      So obviously we're talking a lot about AR/VR, the Forge AR/VR toolkit, so let's start examining some of that. Let's look at what it requires to build an AR or VR application in Unity. Specifically I'm going to be doing VR, I have a VR headset hooked up here. And we're going to be using this data set here. And so this is the Barcelona apartment data set from Revit.

      MICHAEL BEALE: You can tell because when we try to walk through the doors they're a little on the narrow side.

      MIKE GEIG: Yes, we can actually examine that here momentarily. So we have this data set, and right now, if I look at my game view, my simulated view, we see this. And if I press play, there's nothing happening; it's a fixed view. And what I want to do is-- Unity has, like, a first-person character controller I can drop in and just walk around, but I don't want to do that. I want to do VR. And setting up a VR application is very, very simple. I'm going to go to my settings for my player, and I'm going to say, hey, you're a VR application, and that's it. I'm done; I now have a VR application. If I were to build this right now, it would work for Vive, Oculus, Windows MR, Oculus Go, whatever. That's literally a check box.

      And while I'm in the editor I can hit play, and with my VR headset-- and everything's spinning up here, takes a second. There we go. And I'm in VR. If you don't believe me, see for yourself. If anyone doubts me. All right. So we have that. But let's talk now about some locomotion. So I want to move around in VR as well and there's a few ways you can do this. And I could just make it so if I hit forward on a controller, or whatever, it moves me forward. But that makes people like super motion sick, so we definitely don't want to do that. So I'm just going to use a script that's going to basically point at the ground and say, hey if I click while I'm pointing at that, just move me there, teleport me there-- that teleportation mechanic.

      There's a lot of ways I can decide, well, where can I teleport to? Should I be able to get on top of the fridge? Should I be able to-- what can I do with that? And again, that is a solution that's built into Unity. So I'm going to click on my floor here, and I'm going to say to my floor-- if my zoom will work-- come on, zoom, you can do it. Well, all right, my zoom is doing something weird-- but I'm going to say, hey floor, you are navigation static. And what that means is, if I go to my navigation tools in Unity and I say bake, what that's going to do is automatically create what's called a NavMesh, identifying everywhere the player can stand. And it's accounting for the player settings that I've set up here, to not allow them to get too close to walls. So now we can see everywhere we can walk.
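The point-and-teleport mechanic boils down to intersecting the controller's pointing ray with the walkable surface and moving the player there. A simplified Python sketch of that idea, with a flat floor plane standing in for the baked NavMesh (in Unity you would raycast and then validate the hit against the NavMesh instead):

```python
def teleport_target(origin, direction, floor_y=0.0):
    # Intersect the controller ray (origin + t * direction) with the
    # horizontal plane y = floor_y and return the landing point, or None
    # when the ray points level or upward and can never hit the floor.
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:
        return None
    t = (floor_y - oy) / dy
    return (ox + dx * t, floor_y, oz + dz * t)

# Standing 2 m up, pointing down at 45 degrees:
print(teleport_target(origin=(0.0, 2.0, 0.0), direction=(1.0, -1.0, 0.0)))
# -> (2.0, 0.0, 0.0)
```

Snapping the result to the nearest NavMesh point is what enforces the "can't get too close to walls" behavior described above.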

      And I'm going to go ahead-- I have some prefabricated objects that are just some models of generic VR controllers-- and I'm going to grab these and drop them onto my camera, so they exist. I'm going to hit play again. And now what's going to happen-- I'm actually wearing my headset this time, for accuracy-- is these will track. Once they're on. Did they go to sleep? I think they went to sleep, even though the lights are on. What happens? They're tracking now? OK. Takes a second. Don't worry, I can actually see them in VR. And so now I get this-- you know, hey, I want to teleport around, but I want to go into this room and there's a door here.

      So the next thing I want to do-- obviously-- is I want to make this door open up. So how do I do that? Again, fairly simple thing. I'm going to leave play mode. I'm going to find that door, which is this thing right here. And I have this prefabricated object again, called door, and I'm going to click and drag that into that doorway. And what door is, is just two sort of empty objects here that set up a box collider and a hinge.

      Because the actual door mesh itself-- whenever it was designed in whatever software, probably Revit-- actually has a pivot way over here. So if I actually try to open the door, it's not going to work the way I want. So instead, I have just this door and this hinge, and what this allows me to do is, very simply, take that door mesh right there and just drop it on that hinge. And so now as the hinge rotates my door just opens.

      Now, what can I do with that? Well, I can go to my animation window and say, you know what, let's create an animation here. I have some things in here already; I'll just delete those. So I want to create an animation. I'm just going to hit record. I'm going to move the scrubber to, say, frame 20 there-- so a third of a second. I'll say, at a third of a second, I want my door to be there.

      That's it. Now I'll leave record mode, and there-- I have an opening door. Now what I can do, again, is just hit play, and I should be able to go through this door. And there's the joke we were making about the Barcelona scene with the narrow doors. Oh, I animated it to be open to start-- my bad. Anyway, I'll show you a different one, so I don't have to go back. These were already set up. So there we go.
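The recorded door clip is just two keyframes interpolated over time. This small sketch samples that curve, including the looping behavior that causes the door to keep opening and closing in the demo; the duration and angle values are illustrative, not taken from the actual clip.

```python
def hinge_angle(t, duration=1.0 / 3, open_angle=90.0, loop=False):
    # Sample a two-keyframe animation: 0 degrees at t=0, open_angle at
    # t=duration. With loop=True the clip wraps around, which is the
    # default-loop bug mentioned a moment later in the talk.
    if loop:
        t = t % duration
    t = min(max(t, 0.0), duration)   # clamp when not looping
    return open_angle * (t / duration)

print(hinge_angle(1.0))             # past the last keyframe, clamped open -> 90.0
print(hinge_angle(0.5, loop=True))  # wrapped around, door partway open again
```

Because the hinge is an empty parent at the door's edge, rotating it by this sampled angle swings the mesh correctly even though the mesh's own pivot is far away.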

      MICHAEL BEALE: Wow, that's a really narrow doorway.

MIKE GEIG: Yeah, and I also forgot that doors, or animations by default, are set to loop, which is why the door just kept going open and closed. I should tell it not to loop, but anyway. We'll do it live. So at any rate, let me fix that animation real quick. Cool. So there's that, and I can reduce down the size of the colliders on that.

It's just behaving weirdly right now. But one other thing we want to do-- so all this stuff, this apartment thing and the teleporting around like that-- nothing about that really screams, hey, I needed to use Forge and Revit and all this. I could do that with an FBX, I could do that with blocks, you can do that with anything. The reason we want to use this is we want that data. We want all this other stuff that's not just meshes, that's not just physical boundaries.

And so let's say in this example, I want to be able to look at this refrigerator and see all the important information about it. Maybe I'm doing a walkthrough of the home and I want to inspect the appliances to make sure it's to my liking, or whatever. And so again, that becomes fairly simple. I showed you before that there is this Forge properties panel, and all you really need to do is just-- I just have a different version of that-- I'm just going to drop it right on there.

I don't need to hook anything up. I don't need to do any bit of coding. The Forge property panel is automatically going to recognize, hey, I'm on this thing and it has some properties, let's display those properties. And so, when I press play, it's automatically going to parse the data that exists on that fridge. We'll ignore those doors there.

And if I look down here, I can see that it's going to fill this property panel up with that information. So I'm not going to bother with the headset there, but we would be able to just walk through and say, OK, what's the information on this, what's the information on that, or whatever. I set this up to just be there, but I could make it so if you point at it and say, you, what's your information? The thing pops up with the information and it closes, or whatever. There's a lot of ways that we can interact with this.

All right, so the last, or one of the last, things that I want to talk about here is-- you can hear my fans kicking off there-- time-of-day lighting. Now, this is a bit of a cheat here, because this house has no ceiling. So the lighting isn't going to be particularly accurate. However, we could maybe say we're building a house that is entirely sun-roofed, and then we'll just make that the plan. Or it's entirely a Plexiglas roof. That seems like it wouldn't work.

So let's go ahead and see how we can do that. So again, I can hop in here in VR. In a lot of things, as we're designing a space, we're looking for the accuracy of light. Where's the light going to come in? What's it going to do? What's it going to look like? That sort of thing. So I could create, like, a clock on the wall that I could change. I could just make it so the thumbsticks on this advance time.

I could make some buttons. I could easily create an animation that makes the sun go across the sky, so I can see what that's going to do day and night. However, being a game person-- and my wife often says I have a god complex-- I'd rather just grab the sun and move it. That way I can say, all right, so what does the sun look like here through these windows? And as the sun sets, we can maybe just have some starry skies.

So we can see that a house without the roof actually wouldn't be too bad, assuming it never rains. So I'm going to put this in Southern California. And just move this thing around to pick some lighting. Especially since I can teleport around while I'm holding the sun, because why not? And say, OK, well, what do we like about this?

And again, it's as simple in VR as using your tactile inputs, just your hands. Grab it, move it. Click things, drag them around. If we wanted to network this to turn it into a sort of multiplayer experience-- while we're designing a house, changing materials, inspecting properties-- it's very simple. It's what Unity does: that real-time interactivity, networking, multi-user stuff like that.

      Now just to end real quick here with one last thing that is again always my favorite. I'm going to go to my apartment here. I am going to drag on my way-more-satisfying exploder. And we'll just bring my camera on here, because I want to be at the center of this explosion. And then we'll hop into VR, and we'll blow everything up. Oh!

      My mouse cursor was over the place button. So I left the play-view. All right, there we go. So don't build this in a tornado zone. Then there we go.

      MICHAEL BEALE: Very cool. Thank you very much, Mike. That was awesome. Thank you. So let's wrap things up at this point. You guys have seen a really good overview of pulling data into Unity, and some of the things you can do with Unity, and configuring those things to something that's more customized for what you are building for your business.

So going forward. What we're trying to do with the AR/VR toolkit-- and I know there are people in this room right now who are probably sweating bullets that I'm saying these things. We have some glTF support planned. We're also looking at improving the speed of the download and streaming for this plugin. You might have noticed that there was a recent update, if you guys have already tried it.

Actually, who's already tried the toolkit? Already a couple of people. Yeah, you probably noticed that it's been a little on the slow side-- a lot on the slow side. So we've recently sped that up, and if you download the package today, you'll notice quite a nice speedup.

      We're also obviously looking into improving that data preparation pipeline. We've got a lot of tools internally from Autodesk. So things like level of detail. We'll be able to try to provide multiple different streams of LOD.

      And then you'll want to be doing things like querying for metadata, for example. I don't want to necessarily pull down an entire Navisworks scene, perhaps I just want to pull down the piping, or the plumbing. Or maybe just the structural information.

So to be able to query, and ask a service to just bring me down the geometry and the metadata for just property information. So being able to do property queries. In addition, you probably also want to do spatial queries. Devices like the HoloLens or Oculus Go are kind of limited in the number of triangles they can handle.

And so you probably want to say, I'm walking around just on the second floor, just bring me all of the geometry and materials for just the second floor, based on this bounding box. So, a spatial query. And then lastly, we're also cleaning up things like materials for PBR support.

      And that wraps things up. You can find more information on the Forge toolkit website that I mentioned. And also the demo for the Hello World points to the second web page, which is toolkit server V2. And you can also find the repos for both the Forge toolkit and the samples that Mike's just shown today. You can contact me on Twitter @micbeale, and you can also contact Mike as well.

      MIKE GEIG: Just so we're clear, all of these links they are seeing here, they will also be able to find from forgetoolkit.com. So in theory they only need to remember the one--

      MICHAEL BEALE: Yep. That's true.

      MIKE GEIG: Yep.

      MICHAEL BEALE: Sort of. Forgetoolkit.com has got links to all of these other ones, yes.

      MIKE GEIG: But you can have all of the links if you want them.

      MICHAEL BEALE: Yeah, you can take a photo go for it. And with that, I would like you to thank Mike for presenting today.

      MIKE GEIG: Thank you very much.

      MICHAEL BEALE: I appreciate you coming out Mike.

      [APPLAUSE]

      MIKE GEIG: Who's got questions?

      AUDIENCE: So when you make a custom query, is there any way to query deltas in the models, when you're only pulling down changes versus entire models [INAUDIBLE].

MICHAEL BEALE: Yes, so we'll be basing it on-- what's the word-- the streaming format, I guess? Which is the same as what's used in the browser. We have this thing called BIM 360 Design Collaboration, and it's based on this OTG technique-- OTG file format-- and it's optimized for being able to take deltas between two versions. So if you have building version one, building version two, just like in Git, you can do a diff, and it will just return you the geometry from the differences. That's very fast. And that same thing will be available inside Unity.
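The OTG format itself isn't public, but the Git-style diff idea described here is easy to sketch: treat each model version as a map from object ID to a geometry hash, and return only what changed. This is a hypothetical illustration of the concept, not the actual file format:

```python
def diff_versions(v1, v2):
    """Given {object_id: geometry_hash} maps for two model versions,
    return only what changed -- the idea behind delta streaming."""
    added = {k: v2[k] for k in v2.keys() - v1.keys()}
    removed = v1.keys() - v2.keys()
    modified = {k: v2[k] for k in v1.keys() & v2.keys() if v1[k] != v2[k]}
    return added, removed, modified

# Two versions of a building: one object untouched, one edited,
# one deleted, one new.
v1 = {"wall-1": "aaa", "door-7": "bbb", "pipe-3": "ccc"}
v2 = {"wall-1": "aaa", "door-7": "ddd", "beam-9": "eee"}
added, removed, modified = diff_versions(v1, v2)
# added: beam-9, removed: pipe-3, modified: door-7
```

A client that already has version one then only needs to download the geometry for `beam-9` and `door-7`, which is why the diff approach is fast.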

      AUDIENCE: In the next version.

MICHAEL BEALE: Yeah. Not currently, but it will be. We're basically porting a lot of the tech that we've done in the Forge stack onto Unity.

      AUDIENCE: What tips do you guys have for handling an AR [INAUDIBLE]

MIKE GEIG: So the question is about AR scaling, and I assume in Unity-- I can speak to that; if you mean some other technology, feel free to say-- but you have a large data set, maybe a large factory or whatever, and it always boils down to scale.

So if we have a tracker the size of an index card that we're putting on a table, and we're trying to show something at a scale that would be the size of two tables, we're going to have some tracking issues, depending on what technology we're using. Now that being said, there's a lot that can be done with a lot of the new AR kits.

So maybe the ability to have a singular tracking target, and then a large plane space, and then have that start to understand not only the centralized tracking location, but also the plane as it relates to the environment. So that your tracking becomes much more accurate, even when it can no longer see that initial target.

This is something that was started and kind of built by Vuforia some time ago, and now most of the AR platforms have it. Where, as I'm looking at the target, it's learning about what's around the target. And as I move away from the target, it's learning about what's around that. And so if you don't go really crazy really fast right off the bat, it has the ability to start saying, OK, well, I know this podium is here and this thing is here, and so I know where that is in relation to this tracker. And the accuracy starts to get much greater.

That being said, if the ratio of your original marker's scale to the scale you're trying to perceive is massively different, then you're going to have some tracking issues. So it's all about providing an environment rich enough in diverse imagery that the AR system can figure out these points in space, and build yourself some accurate tracking.

And it's gotten a ton better. We have one down at the Unity booth you can check out later today. We put a data set on a piece on the floor and then, all of a sudden, we say, OK, scale it up to real scale. So it's huge, and we're walking around in it. As long as you go slow enough, it'll figure out, using the IMUs in the host device-- the motion sensors in the host device-- where you are in relation to that original target. So it's gotten a lot better. But there's still something--

      AUDIENCE: Was that built in Unity or--

      MIKE GEIG: What's that now?

      AUDIENCE: Was that built into Unity or--

MIKE GEIG: That's built into Unity's AR Foundation, but it's also part of Vuforia, ARCore, and ARKit, if you happen to not be using Unity to build those apps.

      MICHAEL BEALE: Yeah?

AUDIENCE: So with the modeling, a little more live, is that something that is built into the actual build? Or is it something that, every time there's a change, you have to rebuild it and send it off again?

MIKE GEIG: Yes, and I actually didn't show you guys the build process. You go to File and Build and you've got a build. It's really simple, but--

      MICHAEL BEALE: I guess you could show them.

MIKE GEIG: Yeah, so the setup that I was using here with the client ID-- let me take a step back. So if we were to build this using something like this load-at-startup scene, where I am providing a URN and an access token, that access token will expire eventually. And so I would need to keep rebuilding it.

You wouldn't go about it this way. So instead, if I do, like, the load with two-legged authentication, where I'm providing a client ID and a secret, and just a URN, and I want to build that, then you wouldn't need to rebuild your app. Every time you run the app, it'll say, oh, I know the client ID and secret, I know the URN. And I'm going to go out and get these credentials. I'm going to pull those down, and I'm going to have it.
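The reason this avoids rebuilds is that the app holds long-lived credentials (client ID and secret) and trades them for short-lived access tokens at runtime. The real exchange is an HTTP call to the Forge authentication service; this sketch only shows the caching and expiry logic, with a stand-in for the network call (all names here are illustrative):

```python
import time

class TokenCache:
    """Sketch of runtime credential handling: keep an access token,
    and refresh it from the client ID/secret when it nears expiry."""

    def __init__(self, fetch_token, margin=60):
        self.fetch_token = fetch_token  # callable returning (token, expires_in_seconds)
        self.margin = margin            # refresh this many seconds early
        self.token = None
        self.expires_at = 0.0

    def get(self):
        # Refresh if we have no token yet, or it's about to expire.
        if self.token is None or time.time() >= self.expires_at - self.margin:
            self.token, expires_in = self.fetch_token()
            self.expires_at = time.time() + expires_in
        return self.token

# A stand-in for the real POST to the auth endpoint:
cache = TokenCache(lambda: ("token-abc", 3600))
print(cache.get())  # fetches once, then reuses until near expiry
```

Baking the token into the build, by contrast, is like hard-coding `token` above: it works until the first expiry, then requires a rebuild.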

      MICHAEL BEALE: You should mention that.

      MIKE GEIG: Oh, yeah. And so-- wait, which part do you want?

      MICHAEL BEALE: That Forge import.

MIKE GEIG: Oh, yeah. So I talked a little about how you don't have to pull it down live every time. Let's say your data is not going to change for the next period of time. You can do an import scene here, where you're going to again put in a project name, a scene URN, and an access token, and you can actually import this as a resource into your project that then would be a part of the build that you hand out. So you can either stream it live or you can bake it in.

      MICHAEL BEALE: And that was what Mike was showing with those four samples when he had the four screenshots of the images.

      MIKE GEIG: Yeah.

      MICHAEL BEALE: So he used that to pre-import to pre-bake the scene--

MIKE GEIG: If you end up checking out the sample project, my scenes folder has all live scenes. And then I created a folder called offline scenes, where I just have copies of them with offline data that I've pulled down, so you can either see it streaming live or see it as an offline resource that's just part of the project.

And you can really go about it either way. But regardless, if you do a build-- whether you're loading it live with a client ID, secret, and a URN, or it's baked into your project-- you obviously won't need to do a build. I'm sorry, let me take that back. If you're loading it live, you won't have to rebuild every time you change anything. If it's downloaded offline and part of your application and you make a change, then you would need to rebuild, because then it's offline data inside your project.

      MICHAEL BEALE: Yeah, did you have a follow up?

      AUDIENCE: I did.

      MICHAEL BEALE: Yeah.

      AUDIENCE: So, what are your limitations on scale? The reason why I ask is that I work for a construction company, [INAUDIBLE] but if we had a large, say, high rise, and it was an architectural [INAUDIBLE] we wanted to load every time this thing ran, would that be possible or is that too much geometry?

      MIKE GEIG: Well, all things are possible. But it would be a significant download time.

MICHAEL BEALE: It'd take you a long time to stream it down, essentially. And that's kind of where the spatial query comes in, or the properties query comes in. You probably don't want to download every single little thing. Maybe you would want to say something like, during the spatial query, only load the largest AABBs.

So load the largest objects, don't load the nuts and bolts, because I'm just not going to see them. Or maybe you're in a walkthrough mode and it's the opposite. It's load in everything in this sphere, or on this floor. And so you're only streaming in a small amount of data specific to your task. So you've got options. You could download the whole thing, but it may not be practical.
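A spatial query of this kind boils down to intersecting each object's axis-aligned bounding box (AABB) with a query region, optionally skipping anything below a size threshold so the nuts and bolts never come down the wire. A hypothetical sketch of that filter (object names and the volume cutoff are made up for illustration):

```python
def aabb_intersects(a, b):
    # a, b: ((min_x, min_y, min_z), (max_x, max_y, max_z))
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))

def volume(box):
    return ((box[1][0] - box[0][0]) *
            (box[1][1] - box[0][1]) *
            (box[1][2] - box[0][2]))

def spatial_query(objects, region, min_volume=0.0):
    """Keep objects whose bounding box overlaps the region and is
    big enough to matter -- skip the nuts and bolts."""
    return [name for name, box in objects.items()
            if volume(box) >= min_volume and aabb_intersects(box, region)]

objects = {
    "wall":  ((0, 0, 0), (10, 3, 0.2)),
    "bolt":  ((1, 1, 0), (1.01, 1.01, 0.01)),   # tiny, filtered out
    "chair": ((20, 0, 0), (21, 1, 1)),          # outside the region
}
second_floor = ((0, 0, 0), (15, 5, 5))
print(spatial_query(objects, second_floor, min_volume=0.001))  # ['wall']
```

A walkthrough mode would just flip the emphasis: shrink the region to a sphere around the user and drop the volume cutoff.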

      AUDIENCE: [INAUDIBLE] technique [INAUDIBLE]

      MICHAEL BEALE: Yeah.

      AUDIENCE: Causing it to load later depending if your position [INAUDIBLE].

MICHAEL BEALE: Yeah, and then obviously you would continuously query to say, I've just moved to the second floor. Or, I'm about to move to the second floor, start streaming in the second floor. But you have to provide that business logic. You have to start-- there's no simple way to solve every single situation. So we would just provide the APIs, and then it's kind of up to you to figure out a strategy.

MIKE GEIG: That being said, asynchronous level loading for seamless game levels is a very common thing within Unity. This is effectively that. And so we already have a bunch of frameworks and assets that exist that handle all the logic for when to load the next chunk. And the loading of the next chunk could be from an asset bundle, from a project, or from the Forge toolkit. The logic exists and the framework exists, because that's not an uncommon problem in video games.
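The decision logic behind that kind of streaming is simple to sketch: given the player's position, compute which grid chunks should be resident, and diff that against what's already loaded. This is a generic illustration (the grid layout, chunk size, and radius are assumptions, not anything from the toolkit):

```python
def chunks_to_stream(position, loaded, chunk_size=10.0, radius=1):
    """Given a player position on an (x, z) grid, return which chunks
    to load and which to unload -- the core of seamless level streaming."""
    cx = int(position[0] // chunk_size)
    cz = int(position[1] // chunk_size)
    wanted = {(cx + dx, cz + dz)
              for dx in range(-radius, radius + 1)
              for dz in range(-radius, radius + 1)}
    return wanted - loaded, loaded - wanted  # (to_load, to_unload)

# Player walks from near the origin into chunk (2, 0):
to_load, to_unload = chunks_to_stream((25.0, 5.0), loaded={(0, 0), (1, 0)})
# chunk (2, 0) and its neighbors get requested; chunk (0, 0) is released
```

Whether "load" then means pulling an asset bundle or issuing a Forge spatial query is just a matter of what sits behind that call.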

MICHAEL BEALE: And being sort of, like, the Forge evangelist, we do provide samples. So you know, it's not like-- we try to provide the most common cases. There'll be a walkthrough example of streaming. There'll be another case of, I just need to see the piping, or just the infrastructure piece. So yeah, but also it's evolving. If you've got something that is a common use case, then maybe we can try to add something like that to it as well.

      AUDIENCE: So let's say I got the model in Unity and I have everything I want shared with the customer. Do they need Unity on their machine, or is there a way to export?

MICHAEL BEALE: So the question is that you have a Unity project, and you want to share it with a customer-- do they need to have Unity as well? And the answer is really, if I share a Unity project, then yeah, they need Unity to open it. But if I'm doing a build, like for PC, Mac, Linux, iOS, Android, whatever-- I give them the build and they just need that platform.

      AUDIENCE: He has this [INAUDIBLE]

MIKE GEIG: Absolutely. So if I were-- so here, I've got this-- now I've turned a bunch of this stuff off at this point, and I have changed scenes-- but at any point, if I were to go to my build settings here, I can see I have a bunch of scenes set up in my build. I could literally just hit build right now.

It's going to create a build, and in this case I'm set up to be building for PC, Mac, and Linux standalone. I'm building a Windows x86_64 app. So this is going to build me an .exe. But I could say, you know what, I didn't install the Mac and Linux modules, but I could build for Mac and Linux. Or, you know what, maybe I want to build for iOS-- obviously I didn't install that module. I don't think I have any of these modules installed-- they're part of the installer, I can just add them. tvOS, Android-- hell, Xbox One if you really care-- PlayStation 4, Universal Windows Platform for Windows Phone or HoloLens, WebGL, Facebook-- if I want to build a Revit-based game for Messenger or whatever.

But it's really just as simple as you say, OK, I want this thing, build it for me. We also have-- one of our services is our cloud build service-- where you can basically say, OK, Cloud Build, this is the repository of my Unity application. I want you to do builds for me every hour and just email them to me. So I don't have to tie up a machine doing builds, I have a build server that is just continuously kicking out builds for all these platforms that I can distribute to clients or team members to review and stuff like that. Yeah, so-- right now I have VR turned on, because you saw me hit that check box-- if I build this right now, it's going to produce an .exe. If I were to then run it on my system, I would be in VR with my headset, with all that stuff automatically working, because it's just a click of a button.

      AUDIENCE: [INAUDIBLE]

MICHAEL BEALE: Yeah. So if you're working with, say, BIM 360 projects, there's an example of how you can-- the API, I should say, supports three-legged-- sorry, it supports BIM 360. Three-legged authentication is a little more complicated. You do need to set up a proxy server to handle, essentially, the transfer of an access token to the device. It's a little more complicated, but yeah, you can do it.

An example would be if you need to log in somehow and we haven't provided a UI for you to log in-- say you're in HoloLens and you need to log into a BIM 360 project. It's kind of fiddly if you're tapping and clicking on the keys. So one example of what we did was we used a QR code. So you would log in on a screen, and that would provide a QR code; you'd look at the QR code, and that would essentially transfer the authentication to the device.

      AUDIENCE: What would be the [INAUDIBLE] AR builds [INAUDIBLE]

MIKE GEIG: PC-- oh, with the AR-- I'm sorry, I thought you were talking about ARKit. So with the AR/VR toolkit, I don't know that there's necessarily a minimum specification. I believe-- man, it's on our website-- I believe now-- what's that?

      AUDIENCE: Like memory?

MIKE GEIG: I mean, you're looking at min specs. So basically, Unity is meant to run on everything. And then the more stuff you add to a Unity project-- the more stuff you want it to do-- obviously that brings your requirements up. But I mean, I think our min spec is, like, Windows Vista and, like, two gigs of RAM, maybe. I mean, it's a min spec. It's meant to run on as many platforms as possible. And I may be slightly off on that, but it's low. But as you load a larger and larger data set, you're going to require more memory and more CPU. If you're doing high-definition rendering, you're going to require a GPU set up for that. So what the minimum spec is, is really up to you, not really up to us. Not Windows 98. I know for sure about that.

      MICHAEL BEALE: You haven't tried Windows 95?

MIKE GEIG: Maybe Windows NT. Windows for Workgroups. Other questions? All right.

      MICHAEL BEALE: Nope, got one more. Yes?

      AUDIENCE: Sorry, is there any limitations with [? DOS ?] or [INAUDIBLE] or anything like that? [INAUDIBLE] something [INAUDIBLE]

      MICHAEL BEALE: Yeah, so that's I guess related to the authenticate-- the three legged authentication question.

      AUDIENCE: Oh, sorry.

MICHAEL BEALE: No, that's OK. So there isn't a straightforward way of doing the login component. I haven't seen anyone do it-- actually, no, that's not true. The Hololive guys-- I think I can say they've done that-- they will present a login box inside, let's say, AR or inside VR, to log into your project and then perform the authentication. The alternative way we've been demoing it is performing the authentication to BIM 360, or Team, on a laptop, and then displaying a QR code, which your device then would scan. That's for an AR device. So for VR, you'd probably have to pull up a login box.

      MIKE GEIG: Let's see. I'm just looking to see what sort of gesturing and login applications already exist on the asset store.

      AUDIENCE: For Forge you need actually the web browser page.

      MIKE GEIG: Oh,

      MICHAEL BEALE: Yeah.

      MIKE GEIG: You can do http.

      AUDIENCE: The [? code bank ?] has to come back.

      MIKE GEIG: Oh, OK. All right.

      MICHAEL BEALE: Well, thank you very much!

      MIKE GEIG: Thanks, everyone.

      [APPLAUSE]
