AU Class

How to Wow – Extending BIM into Virtual Reality for every project

Description

Do you want to extend your workflows and use BIM models to create VR experiences that will wow your clients and project stakeholders?

This class will cover how Mott MacDonald, working with Autodesk Consulting, has been developing its own VR capability using Autodesk technologies.

This class is aimed at architects and engineers with a BIM background but little or no VR experience. We believe VR can add value to every project type and we will give you an in-depth look at the processes and workflows we have used to create VR experiences for Buildings, Transportation, Energy and Water projects.



We will start with the basics: VR hardware and how to prepare BIM models for VR using Stingray, Revit Live and 3ds Max. Fitting for Las Vegas, we will be 'upping the ante' with live demonstrations so we can share with you the variety of ways we have brought our designs to life using video game technology, and how this is all possible without writing a single line of code.

Key Learnings

  • Explain how VR can wow stakeholders, bring design to life, tell compelling narratives and add value to all project types
  • Use Autodesk Stingray, Revit Live, 3ds Max and InfraWorks 360 to create your own VR experiences
  • Follow some basic tips and tricks to optimize your BIM models for use downstream in VR
  • Understand key VR concepts including: hardware selection, user interface, locomotion and performance

Speakers

  • Tom Hughes
    I am a Civil Engineer who followed a passion for using technology to do my own job more efficiently into a career of helping others to do the same. After working on a variety of infrastructure projects during the early days of Mott MacDonald's BIM strategy and UK BIM Level 2, I joined a small team whose primary focus was the application of leading technology on engineering projects. As the small team has grown into a global network, I have the privilege to work with project teams and digital leaders from around Mott MacDonald. As part of my role I regularly get the opportunity to work closely with both our Autodesk account team and the Autodesk product teams. I am a member of the Autodesk Civil Infrastructure Futures Community and the BIM 360 Customer Council, and I coordinate the Mott MacDonald Customer Success Plan. Alongside my role at Mott MacDonald, I am the delivery lead for the Centre for Digital Built Britain Digital Twin Hub (https://digitaltwinhub.co.uk). The DT Hub is an online community for people who are working to deliver digital twins and share a vision of an ecosystem of interconnected digital twins at national scale. Away from work I like to surf, bike, Xbox, and binge watch comedy.
  • Lionel Graf
    Lionel Graf is an implementation consultant with the Automotive Consulting Team at Autodesk, Inc. He has been working for over 14 years in the rail industry as a creative design production manager. He specializes in creative design visualization and communication, with a deep knowledge of real-time technologies and processes to achieve high-end visual quality for aesthetic design communication and real-time design reviews using VRED, Alias, Maya, 3ds Max, and creative field-standard software for image creation.
  • Toby Foot
    Toby Foot is a digital project specialist for Mott MacDonald as part of their global digital team. With over 10 years' experience as a BIM technician specialising in Revit and Civil 3D, he has been at the forefront of delivering intelligent 3D models on high-profile engineering projects worldwide. In recent years Toby has led the development of immersive technology experiences from BIM models, particularly VR, through Autodesk's game engine, Stingray.
Transcript

TOM HUGHES: The key word for today's presentation is storytelling. I love this graphic. The complexity of a good story is really depicted here.

When I was training to be a civil engineer, words like storytelling and design narrative were not part of the agenda. Emerging technology, virtual reality, and Max Interactive allow us to tell more compelling stories.

So I'm Tom Hughes. I work for Mott MacDonald. I'm a senior consultant with the VR digital leadership team. I've 10 years' experience. And really, it's all about helping projects to be successful.

I'm an infrastructure BIM guy, so Civil 3D and InfraWorks. I focus on the collaborative management of information. The last year or so, I've been working with Toby and Lionel in extending some of those workflows and using BIM to wow our clients.

A little fact about myself, I once cooked a crawfish boil for 800 people, which was slightly less scary than this.

[LAUGHTER]

And I've got to say, I was so impressed by the presentation Skanska gave on the main stage during the technology keynote yesterday. I'm maybe a little bit jealous, but relieved I wasn't on that stage. But also, great minds think alike. We're not going to be the only company doing VR. And we've got the opportunity to show you some of our experiences live in this forum, so you'll see actually what we've been doing, not only how we've been delivering them.

I don't know if any of the University of Amsterdam students are in here today, but I met them at lunch yesterday and I was really impressed with their enthusiasm for this kind of technology. And at such a young age, they were already fully aware of what VR could be doing for construction. So I think a real sign of the future.

TOBY FOOT: Hi, I'm Toby, project specialist at Mott MacDonald. Like Tom, I've also got 10 years' industry experience. So before becoming a project specialist early this year, I had over 10 years' experience as a BIM technician specializing in Revit and Civil 3D.

And now part of my role as a project specialist is leading the development of our immersive technology. So that's the gamification of BIM models and VR like we're going to show you today. And a little fact about myself is, I'm a keen gamer, hence all this stuff. But since becoming a dad last year and the pressures of the new job, I don't have much chance to play the massive amount of games I've accrued over the years. Lionel.

LIONEL GRAF: Hi, I'm Lionel Graf. I'm part of visualization consulting, so I helped Toby and Tom on this particular project. At Autodesk Consulting, I'm leading a small team of visualization specialists in Europe.

So basically, game engines-- 3ds Max Interactive and Stingray, it's the same thing-- 3ds Max, Maya, VRED, a lot of solutions. Our mission is basically to help our customers get their heads around all the technology we have at Autodesk, and how we can leverage that to get the most out of it. Find the right value for the right purpose, and support the challenges that our customers like Tom and Toby may face.

TOM HUGHES: So Lionel is going to be the one that answers all of your questions later. So this class is titled How to Wow. We've already talked about using virtual reality to extend traditional BIM workflows. Linking back to the key word for this presentation, it's really a story about how we've done all the work with Autodesk to develop a repeatable process to allow us to use virtual reality on every project.

One of the things that we were very aware of with virtual reality was, we didn't want something that was just off the peg. And we didn't want something that was just a one-use solution. So it's one of the main reasons why we work with Autodesk to think about how we could do something that was a continually-developing template. We're going to show a bit of that as we go through today.

So a little bit about Mott MacDonald, only one slide. We're an employee-owned engineering, management and development consultancy. We focus our expertise through global business sectors to make a difference on big issues. You can see the scale of our business there. So about 16,000 employees globally.

Let's cover our class learning objectives, because that's why you're here. Objective one is to explain how VR can wow stakeholders, bring design to life, and tell compelling narratives and add value to all project types. Objective two is to use Autodesk Max Interactive, Revit Live, 3ds Max, and InfraWorks 360 to create your own virtual reality experience. Objective three, if you're going to do objective two, is to follow some basic tips and tricks to optimize your models for use downstream in VR. Finally, objective four, to understand the key VR concepts including hardware selection, user interface, locomotion, and model performance.

So let's look at the agenda. Number one, we're going to look at our first steps using Stingray. Number two, we're going to look at the challenges that we faced. Number three, we're going to look at the success that we've had. And finally, four, we're going to give you some tips, tricks, and future plans of what we might be working on with Lionel next year.

So if I was an Autodesk employee, I'd be talking about a safe harbor statement here. There's absolutely nothing in this presentation that you can't share with your colleagues, and we really encourage you to do so. But CYA-- covering our ass.

So we are going to be doing live VR models. They may not work. They definitely will, because Lionel's been working tirelessly this week to get them up to scratch.

Also, when we first started this journey, Max Interactive was called Stingray. And quite often, we use the terms interchangeably. Don't feel confused when we say Stingray. What we really mean is Max Interactive.

Finally, the only other caveat about this presentation is the 90 frames per second that Toby will receive in his head-mounted display is not the same as we can mirror onto the screen. So don't be put off by any lag in the experience here, because what Toby's experiencing is much smoother, much more intuitive.

So let's start with part one, our first steps using Autodesk Stingray. Our journey with Lionel actually started two years ago at Autodesk University 2015. I attended a small class about something called Project Expo, which later became Revit Live.

So transportation, how people get from A to B, is of critical importance and a key issue within society. One of the projects that Mott MacDonald are currently working on is the Crossrail line in London. As part of the Crossrail strategy, all of the stations along the route are being upgraded so they're access-for-all stations providing step-free access from curb to platform.

Oh. Went too far. Sorry.

So one of the reasons that we wanted to look at a project such as one of these stations was that it had a lift in it, which is an interesting asset from a game design perspective. We've got something that we've designed as a fixed solid in Revit, but we want to make it move and we want to have some interaction with it-- some buttons.

I'm going to let Toby give you a demo about what we did with Stingray now. This isn't using the VR. It's just with a simple Xbox controller. But you'll start to see some of the key concepts that we'll build into later in the presentation.

TOBY FOOT: Can I?

TOM HUGHES: Yeah.

TOBY FOOT: So for this experience, we used an Xbox controller. I think it's not open. Oh, no, there it is. Sorry.

TOM HUGHES: First live demo, going well.

TOBY FOOT: Yeah. So I just light it up here?

TOM HUGHES: Yeah.

TOBY FOOT: Nearly there. Is that coming across all right on the screen? Oh, the PowerPoint's still running.

TOM HUGHES: Yeah.

TOBY FOOT: So.

TOM HUGHES: OK.

TOBY FOOT: Huh. Are we there now?

TOM HUGHES: No, we've still got the PowerPoint slide on the screen.

TOBY FOOT: That's weird.

TOM HUGHES: Is it open twice?

TOBY FOOT: Oh, yeah. And we're back to the desktop now. Good. Sorry about that. Cool.

TOM HUGHES: OK, success.

TOBY FOOT: So here we go. I'll maximize that so you can see it better. So this is our station model. And you can see we've added some context that was just a basic InfraWorks model.

And I can change between different modes, which you can see down at the bottom left. So I'm now in the fly mode. So you can fly down, fly around wherever I want to go.

And here's the lift, the access-for-all lift. So I'll pop into first person mode. And you can see our guy walking here. Approach the lift.

And now we've got this toggle here where you can change the height of the player, and at low height, it changes into a wheelchair user. And so we can-- I'll zoom in a bit there into the lift. And you can see the graphical fidelity in Stingray is really good. Nice reflections and bump maps on the materials.

Wait for the lift. And you can see that the lights even work on the lift. There we go. We've made it to the platform.

TOM HUGHES: And we've made it through our first live demo.

TOBY FOOT: Yeah.

TOM HUGHES: Thanks, Toby.

TOBY FOOT: OK.

TOM HUGHES: I will just change the clicker as well.

TOBY FOOT: Oh, yeah.

TOM HUGHES: Back where we started. Great.

TOBY FOOT: Good.

TOM HUGHES: Do you want to talk about how we built the lift?

TOBY FOOT: Oh, yeah.

TOM HUGHES: Yeah.

TOBY FOOT: So as you saw in that demo there, we had an animated lift. So this scenario with the train station, the access-for-all scheme gave us a good opportunity to try out some animated assets. So we created the lift in 3ds Max, and created the animations there as well.

And actually, it was all one long animation. And then we can chop it up when we bring it into Stingray, which is what you see down the bottom there. And then we create the logic for how the lift works, which is what you can see here. So that's like, lift goes up, open doors, pause. And how it works with calling the lift when it's at the bottom or top, all that logic.

And there's also-- you can't see it, but there's a trigger box in front of the door. So when you approach, it'll detect that you're there, like a real lift has sensors. And you can see that we even had working lights included in the Max model there.

So I'm not a coder at all. I don't understand any of that side of things. But that doesn't mean I can't code in Stingray-- 3ds Max Interactive-- because it uses a visual language called Flow, which is based on Lua behind the scenes.

But this is how it looks: you've got nodes, and you string them all together to create the logic you want, to get the functionality. So these are the Flow nodes that control the lift. And you can see, we've got boxes around groups of nodes.

And that doesn't actually affect how it works, how it functions. It's just really good practice when you're doing these Flow nodes to remind yourself which bit does what. So otherwise, when it gets complex, there will be hundreds of these things. It's a big-- lot of spaghetti.
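The lift behaviour Toby describes-- call the lift, move it, open the doors, pause, and a trigger box that detects an approaching player-- is essentially a small state machine. As a rough illustration only, here is that logic sketched in plain Python rather than Flow nodes or Lua; the class name, states, and method names are hypothetical, not taken from the actual project.

```python
# Illustrative sketch of the lift logic described above: call the lift,
# move it between levels, open the doors, pause. A trigger box in front of
# the door would simply invoke call() for the player's current level.
# All names here are hypothetical stand-ins for the Flow node graph.

class Lift:
    def __init__(self):
        self.floor = 0          # 0 = street level, 1 = platform level
        self.doors_open = False

    def call(self, target_floor):
        """Invoked by a lift button, or by the trigger box sensor."""
        if self.doors_open:
            self._close_doors()
        if self.floor != target_floor:
            self._move_to(target_floor)
        self._open_doors()

    def _move_to(self, floor):
        print(f"lift moving {self.floor} -> {floor}")
        self.floor = floor

    def _open_doors(self):
        self.doors_open = True
        print("doors open, pausing for passengers")

    def _close_doors(self):
        self.doors_open = False
        print("doors closed")


lift = Lift()
lift.call(1)   # passenger presses the platform-level call button
lift.call(0)   # trigger box at street level detects a waiting player
```

In Flow, each of those methods would be a boxed group of nodes, with the "pause" handled by a delay node between the door-open and door-close events.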

TOM HUGHES: So I want to have a really quick look at our basic user interface within our early Stingray projects. So to begin with, we obviously needed to include some of our Mott MacDonald branding. And you can see that from 2015 to 2017, we've actually changed our brand logo.

And we've obviously got some product information. We've got, effectively, our player selection. We've got details about the controls. So not only can you use the Stingray model with the Xbox controller, you can also use a basic WASD interface on a keyboard.

Then we've got the locomotion modes, so the fly-through and the walking mode. And we also introduced a bit of functionality to allow you to time how long it takes to get from curb to platform, to give a better idea of how long it would take wheelchair users to get around the station. So thinking about some of the design challenges, and introducing those into our game UI.

This is really early work, so you kinda look back now two years later and go, ugh. Looks ugly. But at the time, we were working quite hard, and I think we did a reasonably good job.

So what about the outcomes of that very first engagement working with Lionel and Autodesk Consulting? Well, eagle-eyed people among you will notice that the rendered image that we had in the original slide that we repeated here is a slightly different configuration than the game that Toby was playing. The model that we put in our live workflow is depicted by the schematic on top, and we had a 90-degree entrance and exit with the doors.

One of the insights of working in Live and experiencing the design as a wheelchair user was how difficult it was to negotiate that 90-degree turn inside the confines of the elevator. Working with the designers, that allowed the lift configuration to be changed to a 180-degree through-and-through arrangement, which meant extending the canopy but giving a much better experience for the wheelchair users we're trying to help.

So in terms of those first steps, what were the lessons that we learned? Well, that Max Interactive extends the potential of BIM models. I think, undoubtedly, we can see that's true.

It's great for communicating design intent to people who might not have ever used BIM before. We got great feedback from our client. We learned a new set of skills. We had a reusable template, so when we switch out the Revit model behind that, we can use the same functionality in any of our models. We also started to get to grips with things like user interface, and how we can put company identity into new types of products.

So then what came next? Well, we were challenged with producing six experiences that would showcase VR technology at our 2017 company AGM. As I mentioned when I explained about Mott MacDonald, we're an employee-owned company, so our AGM is actually all of our most senior leaders within the business.

For someone like Toby, the opportunity to do six models in a very short period of time was really exciting. For someone like me, it was really worrying. I knew that if we didn't get it right, we were going to look like idiots, which is why we engaged with Lionel again, and worked with Autodesk Consulting to produce the experiences.

We were also battling the fact that during 2017, the hype around VR was perhaps as big as it had ever been. Everyone wanted to see everything happening all at once. And we really had to focus on what was most effective.

So thinking about that, what sort of challenges were we going to try and deal with? We also needed simplicity. So while there were many, many challenges on the project, what we needed for the user experience for our board of directors was a simple, easy-to-use solution.

But we didn't want to have something that was a one-off. We wanted something that we could build on and use in the future. So there were some key issues there.

So one of the biggest challenges early on-- although we'd been asked to produce these models for the AGM-- was actually getting buy-in on the importance of VR at senior level. It can be perceived as being expensive, time-consuming, and needing key skilled expertise. By working with Autodesk, we were able to do, we think, a very good job of producing high-quality VR experiences without having to hire anyone else, without taking too much time, and not at high cost when you consider the outputs that we've had.

One of the things we worked on with Lionel was focusing on the key areas, and what differentiated our approach to virtual reality. So we can see here, across this use-case mapping with external and internal use cases, we've got some key areas of focus.

So the first area that we looked at with Lionel was showcasing the technology to important stakeholders within the business. The medium-term use case benefits from opportunities to use VR within marketing at an industry event. And then longer term, we were really thinking about collaboration, and how we could use VR to connect teams across the globe.

So our approach to buy-in was to prioritize the areas of highest value, to define success, and to identify suitable project data that we could use quickly and efficiently. By doing that, we were able to pilot quite quickly, and apply lessons learned continuously. So Toby and Lionel worked very closely to iterate their solutions. After we'd done the buy-in part, we had to think about some of the things that Toby really loves-- the cool VR and the hardware.

TOBY FOOT: Mm-hm. So we had a choice when we started out on this journey between two headsets-- yeah, you can go. So let's talk about the hardware first.

So the hardware that we recommended. Most of that doesn't really matter too much. Any modern PC has everything you need except for the graphics card. So that's the most important part to consider.

So a GTX 1080 we found to be ideal. It'll handle everything we throw at it with ease. Obviously, anything above that would be even better, but costs rise quite quickly. And 1060s and 1070s are OK. It was a 1070 running on here, and you'll see it's nice and smooth. But 1050s and below are right out for VR. OK.

So yeah, we had a choice between two headsets when we started out. And we purchased both of them, the Oculus, and the HTC VIVE. But at the time, the Oculus didn't have room scale, like you can see here, and didn't have the motion controllers. If any of you are not familiar with room scale, it means that I can walk around this space and it'll track the headset fully in that space, rather than just having to sit there in front of one sensor.

And another reason for picking the VIVE is the precision of the tracking. The controllers here track to submillimeter accuracy, as they use a laser grid from the base stations shining down to track them. They're incredibly accurate, which is ideal for engineering projects. And it was so successful, from our one VIVE, we now have 13 across the company. And it's been recognized as the standard platform that we'll use going forward for VR, at least for this first iteration.

So as you can see here with the Alienware laptop, we also wanted to be able to take the setup around. We've already used it at trade shows, at conferences like today. And so we found the Alienware 15 that has the GTX 1070 is absolutely perfect for performance. It's a little bit big to lug around, but it's got the oomph.

So as well as the mobile installations and the PCs people might develop on at their desks, like I showed you earlier, we've also started to set up permanent installations in offices. So the intent of these is that visitors coming and sitting in the waiting area can just pick up the headset, try VR, see some of our models, get a good impression of the company.

And the intent there is that this isn't a guided experience. People coming in can just pick it up, put it on. And the interface is intuitive, so they immediately know how to navigate, and they don't need anyone to tell them, oh, you need to push this button, do that. OK.

So I'm going to talk to you a bit about our workflows. So a key bit of information is that we-- and I'm sure most of you-- already have a lot of excellent VR-ready content, particularly focusing on InfraWorks, and Revit with the Revit Live plug-in.

So there's not a lot you normally have to do, especially with Revit Live, to make your models ready for VR. But if you've got a massive model in Revit, simply hiding elements and chopping out data you don't need will really improve the performance of your model. Have you got any other tips for improving performance?

LIONEL GRAF: Yeah. So what you need to know, basically, is that from a technical perspective, everything needs to live inside the graphics card. So you will find, probably, three main things you need to look into for improving your VR performance.

The number of objects and their complexity will affect performance. The size of textures can be a bottleneck as well. And everything that needs to be calculated by the graphics card at runtime, for each frame, will be a challenge too.

So things like real-time shadows and particle effects-- it's cool to have that kind of thing in a design, but it costs a lot graphically. So basically, for design complexity, as Toby said, you can choose the right data to display. Most of the time, you don't need to have everything.

For a particular VR experience, you will probably need to take some decisions. And those decisions will be helped by showing some objects, not all the objects. So choosing the right level of complexity is a key thing.

And I would say, be aware of the weight of the objects you put in VR. You can have a very complex chair, for example. If you have 1,000 of those in your scene, you can imagine that it will affect the performance.

So be aware and conscious of what you put in VR. The same goes for textures. For example, you may have one texture with a very big image, very high resolution. If it doesn't add value to the experience, prefer to reduce its size.
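Lionel's point about texture size can be made concrete with a back-of-the-envelope calculation: an uncompressed RGBA texture costs width x height x 4 bytes, and a full mipmap chain adds roughly a third on top. The following Python sketch is illustrative only-- real engines use compressed texture formats, and the 4 GB budget and the example scene contents are assumptions, not figures from the project.

```python
# Rough VRAM cost of uncompressed RGBA textures, as a sanity check before
# loading a scene. Treat these numbers as an upper bound: real engines use
# compressed formats. The 4 GB budget is an illustrative assumption.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mipmap chain adds about one third on top of the base level.
    return int(base * 4 / 3) if mipmaps else base

# A hypothetical scene: one hero texture plus ~200 material textures.
scene_textures = [(4096, 4096)] + [(1024, 1024)] * 201

total = sum(texture_bytes(w, h) for w, h in scene_textures)
budget = 4 * 1024**3  # e.g. a card with 4 GB of VRAM
print(f"{total / 1024**2:.0f} MiB of {budget / 1024**2:.0f} MiB budget")
```

Running a check like this over a model's material list is one quick way to see whether texture memory, rather than polygon count, is the bottleneck Lionel describes.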

TOM HUGHES: Thanks. So this section is about workflow, so let's look at some of the workflows we've used to take this kind of content and produce the kind of experiences we'll demo later.

TOBY FOOT: So first of all, I'll talk you through the workflow for InfraWorks models. The best way to get your data out of InfraWorks is to use the built-in FBX exporter. And if you're happy with your model, you can just push it straight through to 3ds Max Interactive.

But if you want to add some polish and make it a better experience, you can push it into 3ds Max where you can maybe upgrade the materials, or add in some more assets-- maybe like cars, people-- and maybe upgrade those assets too to better-looking and better-suited models for VR.

And if you've got that data in 3ds Max, then it's a really simple process to push it straight into 3ds Max Interactive. There's a built-in tool, and it'll even live-link your view, so whatever object you're looking at in 3ds Max, you'll see the same one in 3ds Max Interactive, and see how it looks in the game engine. It's a really neat feature. OK.

So I'll talk to you next about the Revit workflow. And this is an even easier task than the InfraWorks workflow thanks to the Revit Live plug-in. I'll show you a live demonstration-- hopefully it'll work-- of how that functions.

But basically, in the add-ins in Revit, there's a button that says Go Live. You click that. It sends your model up to the cloud, with a short amount of processing time. And I've been speaking with some of the guys who work on Revit Live over the past few days, and I think they've brought the time down by about 10 times, so it's a really quick process now.

Once it's back from the cloud, download the file down to your PC. And then, as a nice intermediary step, you can open up that model in the live viewer and have a little play with it, see it in VR, get a preview before you actually take it to 3ds Max Interactive to create your more curated experience.

So Live does all the heavy lifting. It's just working in the cloud in the background. It doesn't tie up the resources of your PC. And a really neat feature is, it's not actually taking the geometry from Revit like you'd have with InfraWorks or any other source. It's replacing those assets with assets that Autodesk have made that are better suited to VR.

So an easy one to spot is the trees. In Revit, they're quite plain-looking, with quite low-res textures. It swaps these out for animated trees where the leaves blow in the wind and so on. And also, those assets have levels of detail.

So if you don't know what that is: there are several different models. And when a tree, for instance, is far away, it'll have far fewer polygons. And as you approach it, the polygon count will go up. You don't see this-- it's a seamless transition-- but it really reduces the load, as Lionel was saying, on the graphics card.
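The level-of-detail mechanism Toby describes boils down to picking one of several models for an asset based on its distance from the camera. Here is a minimal Python sketch of that selection step; the distance thresholds and model names are illustrative assumptions, not values from Revit Live or Max Interactive.

```python
# Level-of-detail selection: each asset has several models at decreasing
# polygon counts, and the engine picks one per frame based on distance to
# the camera. Thresholds and model names here are hypothetical.

import math

# (max_distance_in_metres, model_name) -- nearest matching entry wins.
TREE_LODS = [
    (20.0,  "tree_lod0"),   # full detail, animated leaves
    (80.0,  "tree_lod1"),   # reduced polygon count
    (250.0, "tree_lod2"),   # billboard-like, very cheap
]

def pick_lod(camera_pos, asset_pos, lods=TREE_LODS):
    distance = math.dist(camera_pos, asset_pos)
    for max_dist, model in lods:
        if distance <= max_dist:
            return model
    return lods[-1][1]  # beyond the last threshold, keep the cheapest model

print(pick_lod((0, 0, 0), (5, 0, 0)))    # nearby tree: full-detail model
print(pick_lod((0, 0, 0), (300, 0, 0)))  # distant tree: cheapest model
```

An engine additionally blends or snaps between the models as the thresholds are crossed, which is why the switch is invisible to the player.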

And then the final step is, you copy the model that's been pushed out to Live and open it straight up in 3ds Max Interactive. And then you'll have that same data set that you had in the Live editor, straight in Stingray, ready for developing your experience. And so you can combine these two workflows, like you saw in the demo with the rail project earlier, to mix your InfraWorks models, your Revit models, and any other sources that you might want to grab an FBX from.

And so we've created a branded VR template that's based on the Max Interactive HTC VIVE template. So when you first load up Max Interactive, you've got a number of templates. Quite a lot of them are gaming-related, so maybe a first-person shooter or something. But there's an HTC VIVE one, and that has all the functionality you need to get straight into VR.

And so we've added some extra features on top of what you get in the vanilla Revit Live experience, the main one being the Mott MacDonald-branded user interface. And we have this, which you might have seen earlier, the meeting room. For this demonstration, we've branded it with Autodesk branding for the occasion, but it would normally be a Mott MacDonald-branded environment.

And that's really nice, because when the user first jumps in, that's the first thing they see. And they're in familiar surroundings. They're not just dropped straight into somewhere like the top of a skyscraper or something. So you get an impression of the model before you actually jump into it.

And so you've got a table with an architect-style miniature model there. And we have little map pins like you might see on Google Maps that allow you to jump into the model. So you can have a look around, go, oh, I want to see that bit. I'll jump in there.

And then we've also added information flags, which you can toggle on and off. And you can use these to highlight key parts of your project and draw people's attention to those. And we've also added a markup tool, and this is in early development at the moment. But the idea is that, when we've got collaboration particularly, you can meet with other people, discuss the model, and tag elements that you might want to discuss later.

TOM HUGHES: So this is the combined workflow, but we thought we'd like to show you a live demo within Revit of how easy it is to push model data up to Revit Live. So Toby's just going to start switching over so we can bring Revit up on screen and show you the user interface, and then show you something opening up.

TOBY FOOT: OK. Yeah.

AUDIENCE: Isn't it faster to limit how big of a model you can push?

TOM HUGHES: That's a very good question. We're coming to that later, but yeah.

TOBY FOOT: So here's our Revit model. And here's the Go Live button. I shouldn't have just tried to move that. So you can see here, as I said earlier, that the trees aren't very complex at all. They're just two images like this.

So you hit Go Live-- whoa, whoa, whoa. This is not liking this table. And the nice feature is, it tells you here whether your model's ready, so it'll look at a number of things. Key things are, if you've got any section planes or a section box switched on, or you've got the wrong visual style switched on, things like that-- it'll just give you a nice flag telling you to change those things about your model. It'll also tell you if there are missing textures, things like that.

And then I recommend that you expand Advanced Options and untick this 'Extend terrain to horizon'. What this does is take the edge of your terrain here and extend it out to the horizon with mountains and so on. It's quite a nice look for the model, but for what you're going to be doing with it, taking it into 3ds Max Interactive later, it's data you don't need.

So you'd click go, and I won't actually wait. Here's one I made earlier. But you can see it's starting to upload there. Not going to rely on the Wi-Fi here.

And then what you get out of that is this. And you can see, as I said, the detail of the trees there is really improved. And compared to Revit, the graphical fidelity is excellent.

And you can see down in the bottom, there's your VR button. And this is where you can just jump in and have that quick preview. There's lots of nice little features in here, like you can change time of day. It's quite nice.

And it takes in your viewpoints that you would have in Revit, your camera points. And the BIM data's in there as well. Yeah. That's probably enough of that.

TOM HUGHES: Cool.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Yes. Yes. So if you do limit the size of your model, that is what will get processed.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Exactly, yeah.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Yeah, that's a key technique that we used with really large projects to make the performance better.

TOM HUGHES: [INAUDIBLE]

TOBY FOOT: Oh, sorry.

TOM HUGHES: That's OK. Unfortunately, the Alienware laptops, while they're really powerful and have a great graphics card, do not have many USB sockets. So we're having to switch between the mouse and the clicker.

So, to the question earlier about the size of models. We were using this in the early days of Expo. And then as we moved into Revit Live, and particularly for the AGM projects, one of the things we found producing models from other people's projects is that we can't always control the models we get.

In an AEC business like ours, we design some really big things. Stadia, airports, hospitals tend to be fairly big. But these big things have a tendency to cause Revit Live some problems-- or at least they did when we first started.

Actually, we're known by Nicolas Fonta, who's the product manager for Live, because we broke his server by uploading 8 gigabytes of data three times in one day. So we got known in a fairly intimidating way. Sorry, the camera caught me off-guard there. Where are we?

TOBY FOOT: You're still-- back one.

TOM HUGHES: Back one more. Oh, yeah. Cool, OK. So you've just seen the solution. I'll just go back to the model.

So yeah. Basically, one of our projects-- we're not going to have time to show it as a live demo-- was a stadium. And that stadium was 8 gigabytes of data that we were trying to get into Live.

At that time, we just couldn't get it to work. We broke Nick's server. He didn't get annoyed at us; he was really interested in what we were doing, because he needs to know what kind of models are going in. And ours aren't the typical architect's models the size of a house. We're the size of a stadium, so there we go.

We do have a simple workaround for when we get users with large models and need to find a way through. It's not rocket science. We can either subdivide the model and upload it as two or three pieces to Revit Live, or we can go direct into 3ds Max and then into 3ds Max Interactive that way.

The drawback of that route is that you don't get the processing Revit Live does. So the automatic improvements Toby talked about-- you lose out on those a little bit. But if you're dealing with something like a large stadium, you can bulk upload the big piece, then think about the experience for a user, and do some small submodels for those parts of the level design.
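
The subdivide-and-upload workaround described here can be sketched as a simple size-budget split. This is an illustrative Python sketch, not part of the actual workflow; the element names, sizes, and 4 GB budget are all assumed for the example.

```python
# Hypothetical sketch: greedily split a list of (element_group, size_mb)
# pairs into upload chunks that each stay under a size budget, mimicking
# the "subdivide and upload in two or three pieces" workaround.
def split_into_chunks(elements, budget_mb):
    chunks, current, current_size = [], [], 0.0
    for name, size in elements:
        # Start a new chunk if adding this group would blow the budget
        if current and current_size + size > budget_mb:
            chunks.append(current)
            current, current_size = [], 0.0
        current.append(name)
        current_size += size
    if current:
        chunks.append(current)
    return chunks

# Invented breakdown of an 8 GB stadium model
model = [("structure", 2600.0), ("facade", 1900.0),
         ("seating", 2200.0), ("mep", 1300.0)]
print(split_into_chunks(model, 4000.0))
# → [['structure'], ['facade'], ['seating', 'mep']]
```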

TOBY FOOT: So we often get IFC files from architects. And a problem with IFCs in Revit is they just come through with no materials, just a big gray mass. And this isn't what we want to be presenting to our clients in the Max experience. So--

TOM HUGHES: There they are. How do we solve the IFC problem?

LIONEL GRAF: One of the workarounds we can put in place to overcome those kinds of challenges is, first, to bind your IFC link file into your Revit project. The thing is that Live will not be able to leverage all the information contained in the IFC if it's only a link file somewhere-- an element added alongside your Revit project. But if you bind it into your Revit project, Live gets access to all the metadata contained in the IFC file.

The trickiest thing is the materials. Right now, we don't really have a solution for the materials. Again, if the IFC file is bound to the Revit model, you will have access to the graphic definition and properties of the objects. But Live needs to leverage what is in the Appearance tab in Revit-- basically, what you'd get if you rendered the Revit model using the cloud, for example.

And that's not something that comes through with the IFC. So for materials, for the moment, you'll need to leverage something like 3ds Max, or 3ds Max Interactive directly, to tweak those materials afterwards once you get the data into your VR environment.

TOM HUGHES: So I believe the model that we're live demoing today is one where a large portion of the content came to us via IFC. So you get to see how we've worked with IFC, and how we're delivering VR from an IFC base.

TOBY FOOT: Yeah. The concrete is Revit, and then I think everything-- oh, and steelwork. Everything else is IFC.

Right. So a really important factor to consider when you're making a VR experience is locomotion, i.e., how you're going to get around. And also performance. A good locomotion experience is absolutely critical in VR.

And bad locomotion can have really bad side effects, make people nauseous, and put them off VR. And it's something I get a lot when I'm trying to demo VR to people. They'll say, oh, no, I don't want to try it. I had a bad experience before.

So we tried several techniques for locomotion. And one of the main tips we have is-- even though you'll actually see it in games-- it's not a nice experience to have rotation that's not controlled by the headset. If I'm looking over here, I don't want a button press to change my view without me moving my head, because it really disrupts your sense of direction.

So there's a few methods that we recommend for being best practice. Teleportation is the main one. And this is something that's used, again, in a lot of games, where-- you'll see it later-- you push a button on the controller, you'll get an arc coming out of the end of your controller like you're throwing a ball. And where that arc ends is where you'll jump to. It's really intuitive. As soon as someone presses that button, it's pretty obvious what's going to happen.
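
The teleport arc described here, where a ball-throw arc comes out of the controller and its landing point becomes the jump target, is essentially projectile motion. A rough sketch, not engine code, assuming a flat floor at y = 0 and an invented launch speed:

```python
import math

# Rough sketch: find where a teleport arc lands on a flat floor at
# y = 0, treating the arc as simple projectile motion launched from the
# controller's position along its (normalized) aim direction.
def teleport_target(origin, direction, speed=8.0, g=9.81):
    ox, oy, oz = origin
    dx, dy, dz = direction
    vy = speed * dy
    # Solve oy + vy*t - 0.5*g*t^2 = 0 for the positive root
    disc = vy * vy + 2.0 * g * oy
    t = (vy + math.sqrt(disc)) / g
    return (ox + speed * dx * t, 0.0, oz + speed * dz * t)

# Controller held at 1.2 m, aimed 45 degrees up along +x
s = math.sqrt(0.5)
x, y, z = teleport_target((0.0, 1.2, 0.0), (s, s, 0.0))
print(round(x, 2), y, z)  # lands about 7.56 m ahead on the floor
```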

The next one is what I talked about earlier with the map pins. So points that you want to highlight, points you want to say, this is a good place, this is a nice view, this is where you'll see the part of the model we want you to see. So place those points at nice points of interest around your model.

And you can leverage that two ways. You can either have it so you're jumping around when you're actually in the model, or-- like I was talking about earlier, when the model's on the table-- use it to jump from the table scale, where you're looking over the model, to the human scale, when you're inside the model.

And then the last type of locomotion-- which, to be honest, is probably used less-- is using some sort of vehicle. So you might be driving in a car. You might be on a Segway or something. Something that makes the user feel that the locomotion's OK. They're not just being dragged along. There's something to ground their sense of movement.

Things we really don't recommend: free movement-- like you'd have in a normal first-person shooter, pushing a direction to move without the player actually moving-- and flying. Taking off and flying around the model in first person can be a really nauseating experience.

And of course, I forgot one form of locomotion that's probably the most obvious. Walking around is the absolute best way to move around your model. You're obviously limited. Here, we've only got a 2-by-2-meter space for room scale-- what's that, about 6 feet or so.

But it can go up to 5-by-5 meters, and that's actually a pretty decent space. And you can really play with that a lot. We made that Mott MacDonald-branded room the same size, 5 by 5 meters. So when you've got that maximum play space, you never need to teleport. You can just walk around absolutely freely. OK.

TOM HUGHES: And we're doing a really quick demo, aren't we, on some of the locomotion just so you can see. It's often hard to conceptualize some of those things just looking at a screen, talking about them. It's easy to start to see--

TOBY FOOT: So you're going to narrate with me.

TOM HUGHES: Yeah. We good to go?

TOBY FOOT: Should be. Can you hold the mic?

TOM HUGHES: Yeah, of course. It's the bit you've all been waiting for, the live demo.

TOBY FOOT: Well, I hope I've set the boundaries up right. Otherwise, I'm going to be walking into stuff.

TOM HUGHES: Cool.

TOBY FOOT: The other one way around. OK. So here, you can see the branded room, as we talked about, and the model that's on the table. So I can show you a bit about our UI here.

TOM HUGHES: Actually, I think we cover that later.

TOBY FOOT: We do. But I need to use it to access the viewpoints. So this is the POI, the Point of Interest viewpoints I was talking about. So you have a laser on your controller. You select the one you want to go to. There we go. And yeah, you'll jump in.

And the next form of locomotion I mentioned to you is teleportation. So here you go. Can you see the arc? And you might be able to see the white square as well.

And if I walk around, you'll see that I'm walking around in that square. So that shows you your actual room that you're standing in. So if I stick that there, I'll be walking out over the edge. So there we go.

And you've also got the map pins, like you had at the table scale, to jump around. And now I'm outside. OK.

TOM HUGHES: Really cool. Yeah, thanks. We'll come back to the model in a little bit later. We'll show you all the functions that we've been building with Lionel once we've gone through some of the challenges that we faced early on with our approach.

TOBY FOOT: OK. So as I mentioned earlier, another factor that's really important to get right is performance. It can affect users in the same way bad locomotion can-- make you nauseous, make for a bad impression.

So the recommended minimum frame rate that we want to hit is 90 frames per second. That's the refresh rate of the VIVE, so matching it will give you a nice, smooth experience. Like Lionel talked about earlier, there are a number of ways you can get a good frame rate.
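
That 90 fps target translates directly into a per-frame time budget, which is worth keeping in mind when weighing up the optimizations that follow:

```python
# At a fixed target frame rate, every frame (rendering plus game logic)
# has to fit inside a hard time budget.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

print(round(frame_budget_ms(90), 2))  # → 11.11 ms per frame
print(round(frame_budget_ms(60), 2))  # → 16.67 ms for a desktop target
```

Missing that ~11 ms budget even occasionally means dropped frames, which in VR is felt immediately as judder.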

And 3ds Max Interactive has a lot of these functions built in. One is baking lights. If you don't know game development-- you'll see here where the lights shine on the wall.

If that was a real-time light, you'd get shadows, and it would actually be calculating the lighting in real time. But if we're never going to need to cast a shadow there, we can just create that texture with the light already shown on it. That's what baking the lights is.

And that really reduces the performance hit. Especially in a large model like that stadium model, we had thousands and thousands of lights in that model. And there was just absolutely no way even out of VR that you could get the kind of performance you need.

And like we talked about earlier, geometry is also a big factor. So like Lionel said, in that stadium model, we had thousands and thousands of seats. And if that's a high-polygon model, you're just creating massive amounts of complexity.

And often, there's no real need. You can get away with something much less. And things that are hidden, like ductwork, it's easy just to switch those off. OK.
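
A quick back-of-envelope shows why repeated high-polygon elements like stadium seats matter so much. The counts below are illustrative, not taken from the actual stadium model:

```python
# Illustrative numbers only: total triangle count for a repeated
# element at two levels of detail. Repetition multiplies any per-item
# complexity by the instance count.
def total_triangles(seat_count, tris_per_seat):
    return seat_count * tris_per_seat

high = total_triangles(40_000, 5_000)  # detailed seat mesh
low = total_triangles(40_000, 150)     # simplified proxy mesh
print(f"{high:,} vs {low:,} triangles")
# → 200,000,000 vs 6,000,000 triangles
```

At a distance the simplified seats are visually indistinguishable, but the renderer processes a tiny fraction of the geometry.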

TOM HUGHES: Cool. So one of the things we wanted to talk about was brand identity. Clearly, when we looked at the user interface of our first Stingray project, we'd got a fixed UI, so that Mott MacDonald brand was evident all over the place.

I suppose you have to go back and ask why brand identity is important. And for me, it's how your customer-- and also your customer's customer-- knows who you are. There's a few ways to think about brand identity within VR.

We trialed some early approaches-- a floating UI with Mott MacDonald visible within the model, or Mott MacDonald-branded assets placed within the client's model space. We felt that was a bit of an intrusion into the client's experience.

So we worked with Mott to come up with some concepts for how we could do this differently. You can see the Mott MacDonald space there, and as Toby said, for today's demo we've branded that space for Autodesk. When we're working in joint venture, or for a specific client, because this is just a VR space, we can simply change out the identity. It's the concept of a room-scale model with the project on a table.

You start your experience in that room, so you know who you're working with and what the project context is. You can also use that room as an opportunity to present key project facts on a wall. So there's lots you can do in that room space before you enter the model and actually experience things at human scale.

So we've used this. And we're finding that this is a really, really good way of thinking about setting the scene of your experience before being in your experience, which is a really important point when it comes to user experience within virtual reality.

AUDIENCE: When did you set that room experience in each model? When was this--

TOM HUGHES: So that's its own model within the level design. So we start the experience there, and we allow the users to then jump into a new level to start their experience at human scale.

AUDIENCE: Thank you.

TOM HUGHES: So Lionel's going to give a little bit of an introduction in terms of how we approach the actual in-model user interface, and how we dealt with some of the challenges around that.

LIONEL GRAF: So as Tom said, one of the key tasks in this AGM project was to showcase the brand. And this VR room with the branded [INAUDIBLE] on the wall is a good introduction. But as soon as you jump into the human-scale level, all of that is hidden.

So the VR menu-- we developed a kind of UI that is attached to the controller. You saw that briefly with Toby before. Basically, you're holding all the functions with you. As soon as you press a button, a menu shows up that gives you access to the functions. They can be basic functions, like toggling the viewpoints, or things more complicated, like taking measurements or--

TOM HUGHES: So when we do the live demo, we hope we get to show all of the functions that we built in our [INAUDIBLE].

LIONEL GRAF: But it was quite an interesting way to carry the Mott MacDonald brand as well.

TOM HUGHES: So we have one more slide before we're going to do our full VR demo. So Toby's just going to talk for a second about the human scale experience. So that's when you are within the model at the full scale of the asset.

TOBY FOOT: So as Tom was saying, the human-scale level is about actually experiencing the project from first person, and trying to give the user as authentic an experience as possible of what it's like to be in that place. So we don't want any branding in that space. We want it to be pure-- just the actual Revit model, or whatever model it is. Because in this space, it's about the client's project. It's not about us.

And a nice feature we have in here-- you saw those pins. When you place them on the table, they automatically link to the same locations at human scale. That's just a little bit of Flow code to link the two together. So you get a nice, seamless experience: you can jump around to the same points, and you don't have to manually line them up for consistency between the two.
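
The idea behind that Flow code, mapping a pin on the table-scale model to the same spot at human scale, comes down to a uniform scale about the table model's origin. A hedged sketch; the 1:200 factor and coordinates are invented for the example, not taken from the actual Flow graph:

```python
# Sketch: a pin placed on the miniature table model maps to the
# corresponding human-scale position via the table model's origin and
# a uniform scale factor (here an assumed 1:200 table model).
def table_to_human(pin, table_origin, scale=200.0):
    return tuple((p - o) * scale for p, o in zip(pin, table_origin))

# A pin 10 cm along x from the table model's origin lands 20 m into
# the full-scale model
pos = table_to_human((0.95, 1.1, 0.42), (0.85, 1.1, 0.42))
print(tuple(round(v, 2) for v in pos))  # → (20.0, 0.0, 0.0)
```

Because both pin sets derive from the same source coordinates, the table pins and human-scale pins stay consistent by construction.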

TOM HUGHES: OK. So we're going to try to show you as many of these experiences as we can. Obviously, it's taking a bit of time to change models and things, so we're running short on time. But Toby's going to quickly jump into VR, and we're going to start to see what we've built with Lionel over the last eight or so months-- not continuous work, but as we've identified the need for different features within different models.

One of the things that's really key to understand about these features is that, although we build them into a specific experience for a specific requirement, we're actually building a library of functions. So when we start a new project with a new client, we know we've got these functions, and we can bring them in, because they're totally reusable bits of code within the Flow diagrams.

So it's not that every time we do this, we're building these things from scratch. They're like modules-- here's a function that sits within the template. That was weird.

[CHUCKLING]

So here we are in our model. Toby's currently in the room scale, and just able to look around. One of the first things you'll do is load up the UI and then look to place the information flags on the table.

So you can see the UI there, selecting a flag, and you can place a flag on the model and see how it's tracking Toby as he walks around. We can populate these flags with bits of project information, preset them so that you can understand what you're looking at in that architectural table space mode.

The next thing you're likely to do is probably want to actually see things a little bit closer. So you can go from this room scale and jump into human scale. There we go. So you're now inside the model.

Here, you can see the teleportation, just jumping around. This allows you to move forward, so you're still within the room scale of the VIVE environment, but you can then walk around and then jump to the next place.

We found that method of locomotion gave the best experience for the user, and was less likely to cause any sort of nausea. But you can do some pretty big jumps. So yeah, you can see you can get quite far there.

OK, so yeah. Let's have a look at the room data. One of the things about working with Live is that we're bringing data through from Revit. And you can see in the UI that we've got access to some of the parameters contained within the objects.

We do some work to identify what we prioritize in terms of the display. We couldn't show all the parameters, because clearly, that would overwhelm you. Cool. It's nice. It's working well. We got through there.

So this is something that Lionel's been working on with us most recently. And this is changing materials. It's really just textures, so we're not really changing anything in the Revit model. But it allows you to walk through live with a client and look at what different finishes might look like just by painting your model in virtual reality.

TOBY FOOT: You can make it out of glass.

TOM HUGHES: Oh, wow. So I'd not seen that one before. That's really cool. One of the things with this, at the moment, this isn't a round-trip workflow, so we're not pushing any of this data back. But clearly, as we start to mature our functions, that is something that we've already started to have discussions around. So you can see what this might look like if you were to see it again this time next year.

What about dimensioning? We're not talking about detailed design reviews, but you might want to check a few key dimensions-- maybe something doesn't quite look right. Well, here's point-to-point dimensioning live within a VR environment. You see how quick that is? You can do that anywhere in the model.

TOBY FOOT: And this is orthographic. So it'll just send a beam straight out and do 90-degree--

TOM HUGHES: Oh, yeah, of course. Yeah, yeah, yeah.

TOBY FOOT: And then if you want to delete them, you can just click on it.
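
The two measuring modes shown, free point-to-point and the orthographic snap to 90-degree directions, can be sketched like this (illustrative code, not the actual Flow implementation):

```python
import math

# Sketch of the two measuring modes: free point-to-point distance, and
# an "orthographic" mode that keeps only the dominant axis, as if the
# beam were sent straight out at 90 degrees.
def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ortho_distance(a, b):
    # Keep only the axis with the largest separation
    return max(abs(x - y) for x, y in zip(a, b))

a, b = (0.0, 0.0, 0.0), (3.0, 0.2, 4.0)
print(round(distance(a, b), 3), ortho_distance(a, b))
# → 5.004 4.0
```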

TOM HUGHES: So one of the use cases we've been thinking about for something like this is inspections-- virtual inspections where you measure out dimensions. What are we going to go to now?

TOBY FOOT: Issue.

TOM HUGHES: Oh, of course. So this is the ability to just spawn objects. We've seen quite a few people spawning chairs, tables, that kind of thing. But we were thinking about design reviews, where we want to identify what might be a potential risk or hazard.

We've been talking about opportunities to enhance that-- so not just a marker, but perhaps capturing text with some voice-to-text and tagging some detail there. We're thinking about some of those for the future too.

TOBY FOOT: And you'll see that they scale, so.

TOM HUGHES: Oh, yeah, of course.

TOBY FOOT: It's quite big there. But as I go up, it scales down, so it's not intrusive.

TOM HUGHES: So those are some of the things we've had to think about in terms of working with VR and making the user experience responsive. It takes a little bit of thought, but all of the coding for that is within those Flow nodes. So it's not us having to code-- it's us thinking about how we use those nodes to get the right kind of experience.

TOBY FOOT: And then last but not least, if we push the Home button.

TOM HUGHES: Oh, yeah, of course. How do I get back out? Oh, yeah, I push Home and I'm back in my branded room. So I think that's our live demo of what we've done. I hope it's-- yeah.

[APPLAUSE]

Very kind, thank you. So I'll just let Toby switch back to the last few slides in our presentation.

TOBY FOOT: How we doing?

TOM HUGHES: Yeah, I'll just skip through the last few. So the project template-- basically, once we've got this template, we can store it and use it on any project we want. These are the bits that build the template, but it's relatively straightforward.

So the very last bit-- what were our successes? Really, this was a story about the approach for the AGM, and success for us was that we hit the AGM deadline. We had six of these models available, covering all of our global sectors-- transport models, building models, water resource models. That's really down to the hard work of Toby.

So that's a thank you from me for making that happen. We wowed our stakeholders. And we've engaged with countless project leaders in that space-- they came to us with their own needs for VR experiences, their own ideas about functions we could build in the future. So having what we can do visible has allowed us to really think about what we can do next.

I guess the other success for us has been-- we've now launched within Mott MacDonald a VR portal where anyone in the company can come and find information about VR. And as Toby mentioned earlier, we've gone from zero to, within the space of four months, 13 HTC VIVE headsets as offices see the opportunity and purchase their own-- 14, have we got another one? Some of our colleagues here in the front row.

And that's not just the UK-- we've got places like Hong Kong and Singapore. Toby also traveled out to one of our offices in India to look at how VR could be used for collaboration on design between design centers and designers in the UK or the US. So that's one of the things we think is a real big opportunity for us.

And I guess the last success is the opportunity of events like this-- but also things like the National Infrastructure Forum in the UK, where we had a similar setup presenting in the exhibition hall. Because we can demonstrate that we can do this, we've got the opportunity to show it to a lot of people now, which is really, really good.

So we wanted to just recap the tips and tricks, because we know that you've seen this, and you really want to try this now. So we're keen that you come away from here enthusiastic, but also with the right knowledge to actually do this yourself.

TOBY FOOT: We're pretty much out of time, so it's probably best if you read these in more detail afterwards. But the key one to reiterate: frame rate is absolutely the number one thing you should consider when you're creating your experience.

TOM HUGHES: And I think we have discussed most of the things we're really thinking about for how we move this forward for next time. And that's something I'd just like to remind you of. I know everyone here will get a reminder to rate the class. If you want to see us here next year, talking about what we're doing in the next 12 months, just make sure you rate the class, because that's how we get back here next year. We'd really appreciate that.

All right. That's it, yeah. So thank you very much.

[APPLAUSE]

Qualified
Qualified is the Autodesk Live Chat agent platform. This platform provides services to allow our customers to communicate in real-time with Autodesk support. We may collect unique ID for specific browser sessions during a chat. Qualified Privacy Policy

icon-svg-hide-thick

icon-svg-show-thick

Improve your experience – allows us to show you what is relevant to you

Google Optimize
We use Google Optimize to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Google Optimize Privacy Policy
ClickTale
We use ClickTale to better understand where you may encounter difficulties with our sites. We use session recording to help us see how you interact with our sites, including any elements on our pages. Your Personally Identifiable Information is masked and is not collected. ClickTale Privacy Policy
OneSignal
We use OneSignal to deploy digital advertising on sites supported by OneSignal. Ads are based on both OneSignal data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that OneSignal has collected from you. We use the data that we provide to OneSignal to better customize your digital advertising experience and present you with more relevant ads. OneSignal Privacy Policy
Optimizely
We use Optimizely to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Optimizely Privacy Policy
Amplitude
We use Amplitude to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Amplitude Privacy Policy
Snowplow
We use Snowplow to collect data about your behavior on our sites. This may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, and your Autodesk ID. We use this data to measure our site performance and evaluate the ease of your online experience, so we can enhance our features. We also use advanced analytics methods to optimize your experience with email, customer support, and sales. Snowplow Privacy Policy
UserVoice
We use UserVoice to collect data about your behaviour on our sites. This may include pages you’ve visited. We use this data to measure our site performance and evaluate the ease of your online experience, so we can enhance our platform to provide the most relevant content. This allows us to enhance your overall user experience. UserVoice Privacy Policy
Clearbit
Clearbit allows real-time data enrichment to provide a personalized and relevant experience to our customers. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID.Clearbit Privacy Policy
YouTube
YouTube is a video sharing platform which allows users to view and share embedded videos on our websites. YouTube provides viewership metrics on video performance. YouTube Privacy Policy

icon-svg-hide-thick

icon-svg-show-thick

Customize your advertising – permits us to offer targeted advertising to you

Adobe Analytics
We use Adobe Analytics to collect data about your behavior on our sites. This may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, and your Autodesk ID. We use this data to measure our site performance and evaluate the ease of your online experience, so we can enhance our features. We also use advanced analytics methods to optimize your experience with email, customer support, and sales. Adobe Analytics Privacy Policy
Google Analytics (Web Analytics)
We use Google Analytics (Web Analytics) to collect data about your behavior on our sites. This may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. We use this data to measure our site performance and evaluate the ease of your online experience, so we can enhance our features. We also use advanced analytics methods to optimize your experience with email, customer support, and sales. Google Analytics (Web Analytics) Privacy Policy
AdWords
We use AdWords to deploy digital advertising on sites supported by AdWords. Ads are based on both AdWords data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that AdWords has collected from you. We use the data that we provide to AdWords to better customize your digital advertising experience and present you with more relevant ads. AdWords Privacy Policy
Marketo
We use Marketo to send you more timely and relevant email content. To do this, we collect data about your online behavior and your interaction with the emails we send. Data collected may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, email open rates, links clicked, and others. We may combine this data with data collected from other sources to offer you improved sales or customer service experiences, as well as more relevant content based on advanced analytics processing. Marketo Privacy Policy
Doubleclick
We use Doubleclick to deploy digital advertising on sites supported by Doubleclick. Ads are based on both Doubleclick data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Doubleclick has collected from you. We use the data that we provide to Doubleclick to better customize your digital advertising experience and present you with more relevant ads. Doubleclick Privacy Policy
HubSpot
We use HubSpot to send you more timely and relevant email content. To do this, we collect data about your online behavior and your interaction with the emails we send. Data collected may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, email open rates, links clicked, and others. HubSpot Privacy Policy
Twitter
We use Twitter to deploy digital advertising on sites supported by Twitter. Ads are based on both Twitter data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Twitter has collected from you. We use the data that we provide to Twitter to better customize your digital advertising experience and present you with more relevant ads. Twitter Privacy Policy
Facebook
We use Facebook to deploy digital advertising on sites supported by Facebook. Ads are based on both Facebook data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Facebook has collected from you. We use the data that we provide to Facebook to better customize your digital advertising experience and present you with more relevant ads. Facebook Privacy Policy
LinkedIn
We use LinkedIn to deploy digital advertising on sites supported by LinkedIn. Ads are based on both LinkedIn data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that LinkedIn has collected from you. We use the data that we provide to LinkedIn to better customize your digital advertising experience and present you with more relevant ads. LinkedIn Privacy Policy
Yahoo! Japan
We use Yahoo! Japan to deploy digital advertising on sites supported by Yahoo! Japan. Ads are based on both Yahoo! Japan data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Yahoo! Japan has collected from you. We use the data that we provide to Yahoo! Japan to better customize your digital advertising experience and present you with more relevant ads. Yahoo! Japan Privacy Policy
Naver
We use Naver to deploy digital advertising on sites supported by Naver. Ads are based on both Naver data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Naver has collected from you. We use the data that we provide to Naver to better customize your digital advertising experience and present you with more relevant ads. Naver Privacy Policy
Quantcast
We use Quantcast to deploy digital advertising on sites supported by Quantcast. Ads are based on both Quantcast data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Quantcast has collected from you. We use the data that we provide to Quantcast to better customize your digital advertising experience and present you with more relevant ads. Quantcast Privacy Policy
Call Tracking
We use Call Tracking to provide customized phone numbers for our campaigns. This gives you faster access to our agents and helps us more accurately evaluate our performance. We may collect data about your behavior on our sites based on the phone number provided. Call Tracking Privacy Policy
Wunderkind
We use Wunderkind to deploy digital advertising on sites supported by Wunderkind. Ads are based on both Wunderkind data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Wunderkind has collected from you. We use the data that we provide to Wunderkind to better customize your digital advertising experience and present you with more relevant ads. Wunderkind Privacy Policy
ADC Media
We use ADC Media to deploy digital advertising on sites supported by ADC Media. Ads are based on both ADC Media data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that ADC Media has collected from you. We use the data that we provide to ADC Media to better customize your digital advertising experience and present you with more relevant ads. ADC Media Privacy Policy
AgrantSEM
We use AgrantSEM to deploy digital advertising on sites supported by AgrantSEM. Ads are based on both AgrantSEM data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that AgrantSEM has collected from you. We use the data that we provide to AgrantSEM to better customize your digital advertising experience and present you with more relevant ads. AgrantSEM Privacy Policy
Bidtellect
We use Bidtellect to deploy digital advertising on sites supported by Bidtellect. Ads are based on both Bidtellect data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bidtellect has collected from you. We use the data that we provide to Bidtellect to better customize your digital advertising experience and present you with more relevant ads. Bidtellect Privacy Policy
Bing
We use Bing to deploy digital advertising on sites supported by Bing. Ads are based on both Bing data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bing has collected from you. We use the data that we provide to Bing to better customize your digital advertising experience and present you with more relevant ads. Bing Privacy Policy
G2Crowd
We use G2Crowd to deploy digital advertising on sites supported by G2Crowd. Ads are based on both G2Crowd data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that G2Crowd has collected from you. We use the data that we provide to G2Crowd to better customize your digital advertising experience and present you with more relevant ads. G2Crowd Privacy Policy
NMPI Display
We use NMPI Display to deploy digital advertising on sites supported by NMPI Display. Ads are based on both NMPI Display data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that NMPI Display has collected from you. We use the data that we provide to NMPI Display to better customize your digital advertising experience and present you with more relevant ads. NMPI Display Privacy Policy
VK
We use VK to deploy digital advertising on sites supported by VK. Ads are based on both VK data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that VK has collected from you. We use the data that we provide to VK to better customize your digital advertising experience and present you with more relevant ads. VK Privacy Policy
Adobe Target
We use Adobe Target to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Adobe Target Privacy Policy
Google Analytics (Advertising)
We use Google Analytics (Advertising) to deploy digital advertising on sites supported by Google Analytics (Advertising). Ads are based on both Google Analytics (Advertising) data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Google Analytics (Advertising) has collected from you. We use the data that we provide to Google Analytics (Advertising) to better customize your digital advertising experience and present you with more relevant ads. Google Analytics (Advertising) Privacy Policy
Trendkite
We use Trendkite to deploy digital advertising on sites supported by Trendkite. Ads are based on both Trendkite data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Trendkite has collected from you. We use the data that we provide to Trendkite to better customize your digital advertising experience and present you with more relevant ads. Trendkite Privacy Policy
Hotjar
We use Hotjar to deploy digital advertising on sites supported by Hotjar. Ads are based on both Hotjar data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Hotjar has collected from you. We use the data that we provide to Hotjar to better customize your digital advertising experience and present you with more relevant ads. Hotjar Privacy Policy
6 Sense
We use 6 Sense to deploy digital advertising on sites supported by 6 Sense. Ads are based on both 6 Sense data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that 6 Sense has collected from you. We use the data that we provide to 6 Sense to better customize your digital advertising experience and present you with more relevant ads. 6 Sense Privacy Policy
Terminus
We use Terminus to deploy digital advertising on sites supported by Terminus. Ads are based on both Terminus data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Terminus has collected from you. We use the data that we provide to Terminus to better customize your digital advertising experience and present you with more relevant ads. Terminus Privacy Policy
StackAdapt
We use StackAdapt to deploy digital advertising on sites supported by StackAdapt. Ads are based on both StackAdapt data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that StackAdapt has collected from you. We use the data that we provide to StackAdapt to better customize your digital advertising experience and present you with more relevant ads. StackAdapt Privacy Policy
The Trade Desk
We use The Trade Desk to deploy digital advertising on sites supported by The Trade Desk. Ads are based on both The Trade Desk data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that The Trade Desk has collected from you. We use the data that we provide to The Trade Desk to better customize your digital advertising experience and present you with more relevant ads. The Trade Desk Privacy Policy
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

Are you sure you want a less customized experience?

We can access your data only if you select "yes" for the categories on the previous screen. This lets us tailor our marketing so that it's more relevant for you. You can change your settings at any time by visiting our privacy statement

Your experience. Your choice.

We care about your privacy. The data we collect helps us understand how you use our products, what information you might be interested in, and what we can improve to make your engagement with Autodesk more rewarding.

May we collect and use your data to tailor your experience?

Explore the benefits of a customized experience by managing your privacy settings for this site or visit our Privacy Statement to learn more about your options.