AU Class

How to Wow – Extending BIM into Virtual Reality for every project


Description

Do you want to extend your workflows and use BIM models to create VR experiences that will wow your clients and project stakeholders?

This class will cover how Mott MacDonald, working with Autodesk consultancy, have been developing our own VR capability using Autodesk technologies.

This class is aimed at architects and engineers with a BIM background but little or no VR experience. We believe VR can add value to every project type and we will give you an in-depth look at the processes and workflows we have used to create VR experiences for Buildings, Transportation, Energy and Water projects.



We will start with the basics: VR hardware and how to prepare BIM models for VR using Stingray, Revit Live, and 3ds Max. Fitting for Las Vegas, we will be 'upping the ante' with live demonstrations so we can share with you the variety of ways we have brought our designs to life using video game technology, and how this is all possible without writing a single line of code.

Key Learnings

  • Explain how VR can wow stakeholders, bring design to life, tell compelling narratives, and add value to all project types
  • Use Autodesk Stingray, Revit Live, 3ds Max, and InfraWorks 360 to create your own VR experiences
  • Follow some basic tips and tricks to optimize your BIM models for use downstream in VR
  • Understand key VR concepts, including hardware selection, user interface, locomotion, and performance

Speakers

  • Tom Hughes
    I am a Civil Engineer that followed a passion of using technology to do my own job more efficiently into a career of helping others to do the same. After working on a variety of infrastructure projects during the early days of Mott MacDonald's BIM strategy and UK BIM Level 2, I joined a small team whose primary focus was the application of leading technology on engineering projects. As the small team has grown into a global network, I have the privilege to work with project teams and digital leaders from around Mott MacDonald. As part of my role I regularly get the opportunity to work closely with both our Autodesk account team and the Autodesk product teams. I am member of the Autodesk Civil Infrastructure Futures Community, the BIM 360 Customer Council, and I coordinate the Mott MacDonald Customer Success Plan. Alongside my role at Mott MacDonald, I am the delivery lead for the Centre for Digital Built Britain Digital Twin Hub (https://digitaltwinhub.co.uk). The DT Hub is an online community for people who are working to deliver digital twins and share a vision of an ecosystem of interconnected digital twins at national scale. Away from work I like to surf, bike, Xbox, and binge watch comedy.
  • Lionel Graf
    Lionel Graf is an implementation consultant with the Automotive Consulting Team at Autodesk, Inc. He has been working for over 14 years in the rail industry as a creative design production manager. He specializes in creative design visualization and communication, with a deep knowledge of real-time technologies and processes to achieve high-end visual quality for aesthetic design communication and real-time design reviews using VRED software, ALIAS software, MAYA software, 3ds Max software, and creative field standard software for image creation.
  • Toby Foot
    Toby Foot is a digital project specialist for Mott MacDonald as part of their global digital team. With over 10 years' experience as a BIM technician specialising in Revit and Civil 3D, he has been at the forefront of delivering intelligent 3D models on high-profile engineering projects worldwide. In recent years Toby has led the development of immersive technology experiences from BIM models, particularly VR, through Autodesk's new game engine, Stingray.
Transcript

TOM HUGHES: The key word for today's presentation is storytelling. I love this graphic. The complexity of a good story is really depicted here.

When I was training to be a civil engineer, words like storytelling and design narrative were not part of the agenda. Emerging technology, virtual reality, and Max Interactive allow us to tell more compelling stories.

So I'm Tom Hughes. I work for Mott MacDonald. I'm a senior consultant with the VR digital leadership team. I've 10 years experience. And really, it's all about helping projects to be successful.

I'm an infrastructure BIM guy, so Civil 3D and InfraWorks. I focus on the collaborative management of information. For the last year or so, I've been working with Toby and Lionel on extending some of those workflows and using BIM to create VR experiences for our clients.

A little fact about myself, I once cooked a crawfish boil for 800 people, which was slightly less scary than this.

[LAUGHTER]

And I've got to say, I was so impressed by the presentation Skanska gave on the main stage during the technology keynote yesterday. I'm maybe a little bit jealous, but relieved I wasn't on that stage. But also, great minds think alike. We're not going to be the only company doing VR. And we've got the opportunity to show you some of our experiences live in this forum, so you'll see actually what we've been doing, not only how we've been delivering them.

I don't know if any of the University of Amsterdam students are in here today, but I met them at lunch yesterday and I was really impressed with their enthusiasm for this kind of technology. And at such a young age, they were already fully aware of what VR could be doing for construction. So I think a real sign of the future.

TOBY FOOT: Hi, I'm Toby, project specialist at Mott MacDonald. Like Tom, I've also got 10 years' industry experience. So before becoming a project specialist early this year, I had over 10 years' experience as a BIM technician specializing in Revit and Civil 3D.

And now part of my role as a project specialist is leading the development of our immersive technology. So that's the gamification of BIM models and VR like we're going to show you today. And a little fact about myself is, I'm a king gamer, hence all this stuff. But since becoming a dad last year and the pressures of the new job, I don't have any chance to play the massive amount of games I've accrued over my years. Lionel.

LIONEL GRAF: Hi, I'm Lionel Graf, so I'm part of visual consulting. So I help Toby and Tom on this particular project. So at Autodesk Consulting, I'm leading a small team of visualization specialists in Europe. And yeah.

So basically, game engines-- 3ds Max Interactive, Stingray, it's the same thing-- 3ds Max, Maya, VRED, so a lot of solutions. So our mission is basically to help our customers get their heads around all the technology we have at Autodesk, how we can leverage that to get the most out of it, and yeah. Find the right value for the right purpose, and support the challenges that we-- or our customers like Tom and Toby-- may face.

TOM HUGHES: So Lionel is going to be the one that answers all of your questions later. So this class is titled, How to Wow. We've already discussed using virtual reality to extend traditional BIM workflows. Linking back to the key word for this presentation, it's really a story about how we've done all the work with Autodesk to develop a repeatable process to allow us to use virtual reality on every project.

One of the things that we were very aware of with virtual reality was, we didn't want something that was just off the peg. And we didn't want something that was just a one-use solution. So it's one of the main reasons why we work with Autodesk to think about how we could do something that was a continually-developing template. We're going to show a bit of that as we go through today.

So a little bit about Mott MacDonald, only one slide. We're an employee-owned engineering, management, and development consultancy. We focus our expertise through global business sectors to make a difference on big issues. You can see the scale of our business there-- about 16,000 employees globally.

Let's cover our class learning objectives, because that's why you're here. Objective one is to explain how VR can wow stakeholders, bring design to life, and tell compelling narratives and add value to all project types. Objective two is to use Autodesk Max Interactive, Revit Live, 3ds Max, and InfraWorks 360 to create your own virtual reality experience. Objective three, if you're going to do objective two, is to follow some basic tips and tricks to optimize your models for use downstream in VR. Finally, objective four, to understand the key VR concepts including hardware selection, user interface, locomotion, and model performance.

So let's look at the agenda. Number one, we're going to look at our first steps using Stingray. Number two, we're going to look at the challenges that we faced. Number three, we're going to look at the success that we've had. And finally, four, we're going to give you some tips, tricks, and future plans of what we might be working on with Lionel next year.

So if I was an Autodesk employee, I'd be talking about a safe harbor statement here. There's absolutely nothing in this presentation that you can't share with your colleagues, and we really encourage you to do so. But COA is covering our ass.

So we are going to be doing live VR models. They may not work. They definitely will, because Lionel's been working tirelessly this week to get them up to scratch.

Also, when we first started this journey, Max Interactive was called Stingray. And quite often, we use the terms interchangeably. Don't feel confused when we say Stingray. What we really mean is Max Interactive.

Finally, the only other caveat about this presentation is the 90 frames per second that Toby will receive in his head-mounted display is not the same as we can mirror onto the screen. So don't be put off by any lag in the experience here, because what Toby's experiencing is much smoother, much more intuitive.

So let's start with part one, our first steps using Autodesk Stingray. Our journey with Lionel actually started two years ago at Autodesk University 2015. I attended a small class about something called Project Expo, which later became Revit Live.

So transportation-- how people get from A to B-- is of critical importance and a key issue within society. One of the projects that Mott MacDonald are currently working on is the Crossrail line in London. As part of the Crossrail strategy, all of the stations along the route are being upgraded so they're access-for-all stations providing step-free access from curb to platform.

Oh. Went too far. Sorry.

So one of the reasons that we wanted to look at a project such as one of these stations was that it had a lift in it, which is an interesting asset from a game design perspective. We've got something that we've designed as a fixed solid in Revit, but we want to make it move, and we want to have some interaction with it-- some buttons.

I'm going to let Toby give you a demo about what we did with Stingray now. This isn't using the VR. It's just with a simple Xbox controller. But you'll start to see some of the key concepts that we'll build into later in the presentation.

TOBY FOOT: Can I?

TOM HUGHES: Yeah.

TOBY FOOT: So for this experience, we used an Xbox controller. I think it's not open. Oh, no, there it is. Sorry.

TOM HUGHES: First live demo, going well.

TOBY FOOT: Yeah. So I just light it up here?

TOM HUGHES: Yeah.

TOBY FOOT: Nearly there. Is that coming across all right on the screen? Oh, the PowerPoint's still running.

TOM HUGHES: Yeah.

TOBY FOOT: So.

TOM HUGHES: OK.

TOBY FOOT: Huh. Are we there now?

TOM HUGHES: No, we've still got the PowerPoint slide on the screen.

TOBY FOOT: That's weird.

TOM HUGHES: Is it open twice?

TOBY FOOT: Oh, yeah. And we're back to the desktop now. Good. Sorry about that. Cool.

TOM HUGHES: OK, success.

TOBY FOOT: So here we go. I'll maximize that so you can see it better. So this is our station model. And you can see we've added some context that was just a basic InfraWorks model.

And I can change between different modes, which you can see down at the bottom left. So I'm now in the fly mode. So you can fly down, fly around wherever I want to go.

And here's the lift, the access-for-all lift. So I'll pop into first person mode. And you can see our guy walking here. Approach the lift.

And now we've got this toggle here where you can change the height of the player, and at low height, it changes into a wheelchair user. And so we can-- I'll zoom in a bit there into the lift. And you can see the graphical fidelity in Stingray is really good. Nice reflections and bump maps on the materials.

Wait for the lift. And you can see that the lights even work on the lift. There we go. We've made it to the platform.

TOM HUGHES: And we've made it through our first live demo.

TOBY FOOT: Yeah.

TOM HUGHES: Thanks, Toby.

TOBY FOOT: OK.

TOM HUGHES: I will just change the clicker as well.

TOBY FOOT: Oh, yeah.

TOM HUGHES: Back where we started. Great.

TOBY FOOT: Good.

TOM HUGHES: Do you want to talk about how we built the lift?

TOBY FOOT: Oh, yeah.

TOM HUGHES: Yeah.

TOBY FOOT: So as you saw in that demo there, we had an animated lift. So this scenario with the train station, the access-for-all scheme gave us a good opportunity to try out some animated assets. So we created the lift in 3ds Max, and created the animations there as well.

And actually, it was all one long animation. And then we can chop it up when we bring it into Stingray, which is what you see down the bottom there. And then we create the logic for how the lift works, which is what you can see here. So that's like, lift goes up, open doors, pause. And how it works with calling the lift when it's at the bottom or top, all that logic.

And there's also-- you can't see it, but there's a trigger box in front of the door. So when you approach, it'll detect that you're there, like a real lift has sensors. And you can see that we even had working lights included in the Max model there.

So I'm not a coder at all. I don't understand any of that side of things. But that doesn't mean I can't code in Stingray-- 3ds Max Interactive-- because it uses a visual language called Flow, which is based on Lua behind the scenes.

But this is how it looks-- you've got nodes, and you string them all together to create the logic you want, to get the functionality. So these are the Flow nodes that control the lift. And you can see, we've got boxes around groups of nodes.

And that doesn't actually affect how it works, how it functions. It's just really good practice when you're doing these Flow nodes to remind yourself which bit does what. So otherwise, when it gets complex, there will be hundreds of these things. It's a big-- lot of spaghetti.
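The lift graph itself is visual Flow, not code, but the state logic it encodes can be sketched in ordinary code. Here's a minimal sketch in Python (the class and method names are hypothetical, purely to illustrate the "call lift, travel, open doors" sequence; the actual project built this from Flow nodes with no code at all):

```python
class Lift:
    """Sketch of the lift logic the Flow graph encodes:
    call the lift, travel, open doors, pause, close doors."""

    def __init__(self, floors=("curb", "platform")):
        self.floors = floors
        self.position = floors[0]
        self.doors_open = False

    def call(self, floor):
        """Triggered by a button press, or by the trigger box in
        front of the doors (the 'sensor' a real lift would have)."""
        if floor not in self.floors:
            raise ValueError(f"unknown floor: {floor}")
        if self.position != floor:
            self.doors_open = False   # close doors before moving
            self.position = floor     # play the travel animation segment
        self.doors_open = True        # open doors, then pause

lift = Lift()
lift.call("platform")   # player calls the lift from the platform level
```

The grouping boxes in the Flow graph play the same role as the comments here: they don't change behavior, they just label which chunk of nodes does what.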

TOM HUGHES: So I want to have a really quick look at our basic user interface within our early Stingray projects. So to begin with, we obviously needed to include some of our Mott MacDonald branding. And you can see that from 2015 to 2017, we've actually changed our brand logo.

And we've obviously got some product information. We've got, effectively, our player selection. We've got details about control. So not only can you use the Stingray model with the Xbox controller, you can also use basic WASD interface on a keyboard.

Then we've got the locomotion modes, so the fly-through and the walking mode. And we also introduced a bit of functionality to allow you to time how long it takes to get from curb to platform, to give a better idea of how long it would take wheelchair users to get around the station. So thinking about some of the design challenges, and introducing those into our game's UI.

This is really early work, so you kinda look back now two years later and go, ugh. Looks ugly. But at the time, we were working quite hard, and I think we did a reasonably good job.

So what about the outcomes of that very first engagement working with Lionel and Autodesk Consulting? Well, eagle-eyed people among you will notice that the rendered image that we had in the original slide that we repeated here is a slightly different configuration than the game that Toby was playing. The model that we put in our live workflow is depicted by the schematic on top, and we had a 90-degree entrance and exit with the doors.

One of the insights of working in Live and experiencing the design as a wheelchair user was how difficult it was to negotiate that 90-degree turn inside the confines of the elevator. Working with the designers, that allowed the lift configuration to be changed to a 180-degree through-and-through arrangement, which meant extending the canopy but giving a much better experience for the wheelchair users we're trying to help.

So in terms of those first steps, what were the lessons that we learned? Well, that Max Interactive extends the potential of BIM models. I think, undoubtedly, we can see that's true.

It's great for communicating design intent to people who might not have ever used BIM before. We got great feedback from our client. We learned a new set of skills. We had a reusable template, so when we switch out the Revit model behind that, we can use the same functionality in any of our models. We also started to get to grips with things like user interface, and how we can put company identity into new types of products.

So then what came next? Well, we were challenged with producing six experiences that would showcase VR technology at our 2017 company AGM. As I mentioned when I explained about Mott MacDonald, we're an employee-owned company, so our AGM's actually all of our most senior leaders within the business.

For someone like Toby, the opportunities to do six models in a very short period of time was really exciting. For someone like me, it was really worrying. I knew that if we didn't get it right, we were going to look like idiots, which is why we engaged with Lionel again, and have worked with Autodesk consultancy to produce the experiences.

We were also battling the fact that during 2017, the hype around VR was perhaps as big as it had ever been. Everyone wanted to see everything happening all at once. And we really had to focus on what was most effective.

So thinking about that, what sort of challenges were we going to try and deal with? We also needed simplicity. So while there were many, many challenges on the project, what we needed for the user experience for our board of directors was a simple, easy-to-use solution.

But we didn't want to have something that was a one-off. We wanted something that we could build on and use in the future. So there were some key issues there.

So one of the biggest challenges early on-- although we'd been asked to produce these models for the AGM-- was actually getting buy-in on the importance of VR at senior level. VR can be perceived as being expensive, time-consuming, and needing key skilled expertise. By working with Autodesk, we were able to do, we think, a very good job of producing high-quality VR experiences without having to hire anyone else, without taking too much time, and not at high cost when you consider the outputs that we've had.

One of the things we worked on with Lionel was focusing on the key areas, and what differentiated our approach to virtual reality. So we can see here, across this use-case mapping with external and internal use cases, we've got some key areas of focus.

So the first area that we looked at with Lionel was that we wanted something that was showcasing the technology to important stakeholders within the business. The medium-term use case benefited from opportunities to use VR within marketing at an industry event. And then longer-term, we were really thinking about collaboration, and how we could use VR to connect teams across the globe.

So our approach to buy-in was to prioritize the areas of highest value, to define success, and to identify suitable project data that we could use quickly and efficiently. By doing that, we were able to pilot quite quickly, and apply lessons learned continuously. So Toby and Lionel worked very closely to iterate their solutions. After we'd done the buy-in part, we had to think about some of the things that Toby really loves-- the cool VR and the hardware.

TOBY FOOT: Mm-hm. So we had a choice when we started out on this journey between two headsets-- yeah, you can go. So let's talk about the hardware first.

So the hardware that we recommended. Most of that doesn't really matter too much. Any modern PC has everything you need except for the graphics card. So that's the most important part to consider.

So a GTX 1080 we found to be ideal. It'll handle everything we throw at it with ease. Obviously, anything above that would be even better, but the prices rise quite quickly. And 1060s and 1070s are OK. It's a 1070 running on here, and you'll see it's nice and smooth. But 1050 and below is right out for VR. OK.

So yeah, we had a choice between two headsets when we started out. And we purchased both of them, the Oculus, and the HTC VIVE. But at the time, the Oculus didn't have room scale, like you can see here, and didn't have the motion controllers. If any of you are not familiar with room scale, it means that I can walk around this space and it'll track the headset fully in that space, rather than just having to sit there in front of one sensor.

And another reason for picking the VIVE is the precision of the tracking. The controllers here track to submillimeter accuracy, as they use a laser grid from the base stations to do the tracking. They're incredibly accurate, which is ideal for engineering projects. And it was so successful, from our one VIVE, we now have 13 across the company. And it's been recognized as the standard platform that we'll use going forward for VR, at least for this first iteration.

So as you can see here with the Alienware laptop, we also wanted to be able to take the setup around. We've already used it at trade shows, at conferences like today. And so we found the Alienware 15 that has the GTX 1070 is absolutely perfect for performance. It's a little bit big to lug around, but it's got the oomph.

So we also have, as well as the mobile installations and the PCs people might develop on at their desks, like I showed you earlier, we've started to set up permanent installations in offices. So the intent of these is that visitors coming and sitting in the waiting area can just pick up the headset, try VR, see some of our models, get a good impression of the company.

And the intent there is that this isn't a guided experience. People coming in can just pick it up, put it on. And the interface is intuitive, so they immediately know how to navigate, and they don't need anyone to tell them, oh, you need to push this button, do that. OK.

So I'm going to talk to you a bit about our workflows. So a key bit of information is that we-- and I'm sure most of you-- already have a lot of excellent VR-ready content, particularly focusing on InfraWorks, and Revit with the Revit Live plug-in.

So there's not a lot you normally have to do, especially with Revit Live, to make your models ready for VR. But if you've got a massive model in Revit, simply hiding elements and chopping out data you don't need will really improve the performance of your model. You got any other tips for improving performance?

LIONEL GRAF: Yeah. So what you need to know, basically, is that from a technical perspective, everything needs to live inside the graphics card. So you will find, probably, three main things you need to look into for improving your VR performance.

The number of objects and their complexity will affect the performance. The size of the textures can be a bottleneck as well. And everything that needs to be calculated by the graphics card at runtime, for each frame, will be a challenge as well.

So things like real-time shadows, particle effects-- it's cool to have that kind of thing in a design, but it costs a lot graphically. So basically, for the scene complexity, as Toby said, you can choose the right data to display. Most of the time, you don't need to have everything.

For a particular VR experience, you will probably need to take some decisions. And those decisions will be helped by showing some objects, not all the objects. So choosing the right level of complexity is a key thing.

And I would say, be aware of the weight of the objects you put in VR. You can have a very complex chair, for example. If you have 1,000 of those in your scene, you can imagine that it will affect the performance.

So be aware and conscious of what you put in VR. The same goes for textures. For example, you may have one texture with a very big image, very high resolution. If you don't see it, if it doesn't have value for the experience, prefer to reduce the size, for example.
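As a rough back-of-the-envelope check on the texture point, you can estimate how much video memory textures alone will consume. A minimal sketch in Python (the 4 bytes per pixel and the one-third mipmap overhead are approximations for uncompressed RGBA, not figures from Stingray itself):

```python
def texture_vram_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM cost of one texture, in megabytes.
    A full mipmap chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

# A single 4K texture is surprisingly heavy compared to a 1K one:
big = texture_vram_mb(4096, 4096)    # roughly 85 MB
small = texture_vram_mb(1024, 1024)  # roughly 5 MB
```

Multiply that by hundreds of materials in a big BIM model and it's easy to see why downsizing textures you never look at closely is one of the cheapest performance wins.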

TOM HUGHES: Thanks. So this section is about workflow, so let's look at some of the workflows we've used to take this kind of content and produce the kind of experiences we'll demo later.

TOBY FOOT: So first of all, I'll talk you through the workflow for InfraWorks models. And the best way to get your data out of InfraWorks is to use the built-in FBX exporter. And if you're happy with your model, you can just push it straight through into 3ds Max Interactive.

But if you want to add some polish and make it a better experience, you can push it into 3ds Max where you can maybe upgrade the materials, or add in some more assets-- maybe like cars, people-- and maybe upgrade those assets too to better-looking and better-suited models for VR.

And if you've got that data in 3ds Max, then it's a really simple process to push it straight into 3ds Max Interactive. There's a built-in tool, and it'll even live-link your view, so whatever object you're looking at in 3ds Max, you'll see the same one in 3ds Max Interactive, and see how it looks in the game engine. It's a really neat feature. OK.

So I'll talk to you next about the Revit workflow. And this is an even easier task than the InfraWorks workflow thanks to the Revit Live plug-in. I'll show you a live demonstration-- hopefully it'll work-- of how that functions.

But basically, in the add-ins in Revit, there's a button that says Go Live. You click that. It sends your model up to the cloud. And with a short amount of processing time-- and I've been speaking with some of the guys who work on Revit Live over the past few days, and I think they've brought the time down by about 10 times-- it's a really quick process now.

Once it's back from the cloud, download the file down to your PC. And then, as a nice intermediary step, you can open up that model in the live viewer and have a little play with it, see it in VR, get a preview before you actually take it to 3ds Max Interactive to create your more curated experience.

So Live does all the heavy lifting. It's just working in the cloud in the background. It doesn't tie up the resources of your PC. And a really neat feature is, it's not actually taking the geometry from Revit like you'd have with InfraWorks or any other source. It's replacing those assets with assets that Autodesk have made that are better suited to VR.

So an easy one to spot is the trees. In Revit, they're quite plain-looking, with quite low-res textures. It swaps these out for animated trees where the leaves blow in the wind and stuff. And also, those assets have levels of detail.

So if you don't know what that is, there are several different versions of each model. And when a tree, for instance, is far away, it'll have far fewer polygons. And as you approach it, the polygon count will go up. You don't see this-- it's a seamless transition-- but it really reduces the load, as Lionel was saying, on the graphics card.
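The level-of-detail mechanism described above can be sketched as a simple distance-based lookup. The thresholds, mesh names, and polygon counts below are purely illustrative, not Revit Live's actual values:

```python
# Each entry: (max distance in metres, mesh variant to draw).
# Closer variants have more polygons; the engine swaps them seamlessly.
LODS = [
    (10.0,  "tree_high"),    # e.g. full geometry, animated leaves
    (50.0,  "tree_medium"),  # e.g. simplified mesh
    (200.0, "tree_low"),     # e.g. flat billboard
]

def pick_lod(distance, lods=LODS):
    """Return the mesh variant to draw for a given camera distance."""
    for max_dist, mesh in lods:
        if distance <= max_dist:
            return mesh
    return lods[-1][1]  # beyond the last threshold, keep the cheapest

assert pick_lod(5) == "tree_high"   # up close: full-detail tree
assert pick_lod(500) == "tree_low"  # far away: billboard only
```

The win is that a scene full of trees only pays the high-polygon cost for the handful the camera is actually near.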

And then the final step is, you copy the model that's been pushed out to Live and open it straight up in 3ds Max Interactive. And then you'll have that same data set that you had in the Live editor straight up in Stingray, ready for developing your experience. And so you can combine these two workflows, like you saw in the demo with the rail project earlier, to mix your InfraWorks models, your Revit models, and any other sources that you might want to grab an FBX from.

And so we've created a branded VR template that's based on the Max Interactive HTC VIVE template. So when you first load up Max Interactive, you've got a number of templates. Quite a lot of them are gaming-related, so maybe a first-person shooter or something. But there's an HTC VIVE one, and that has all the functionality you need to get straight into VR.

And so we've added some extra features on top of what you get in the vanilla Revit Live experience, the main one being the Mott MacDonald-branded user interface. And we have this, which you might have seen earlier, the meeting room. For this demonstration, we've branded it with Autodesk branding for the occasion, but it would normally be a Mott MacDonald-branded environment.

And that's really nice, because when the user first jumps in, that's the first thing they see. And they're in a familiar surrounding. They're not just dropped straight into somewhere like the top of a skyscraper or something. So you get an impression of the model before you actually jump into it.

And so you've got a table with an architect-style miniature model there. And we have little map pins like you might see on Google Maps that allow you to jump into the model. So you can have a look around, go, oh, I want to see that bit. I'll jump in there.

And then we've also added information flags, which you can toggle on and off. And you can use these to highlight key parts of your project and draw people's attention to those. And we've also added a markup tool, and this is in early development at the moment. But the idea is that, when we've got collaboration particularly, you can meet with other people, discuss the model, and tag elements that you might want to discuss later.

TOM HUGHES: So this is the combined workflow, but we thought we'd like to show you a live demo within Revit of how easy it is to push model data up to Revit Live. So Toby's just going to start switching over so we can bring Revit up on screen and show you the user interface, and then show you something opening up.

TOBY FOOT: OK. Yeah.

AUDIENCE: Isn't it faster to limit how big of a model you can push?

TOM HUGHES: That's a very good question. We're coming to that later, but yeah.

TOBY FOOT: So here's our Revit model. And here's the Go Live button. I shouldn't have just tried to move that. So you can see here, as I said earlier, that the trees aren't very complex at all. They're just two images like this.

So you hit Go Live-- whoa, whoa, whoa. This is not liking this table. And the nice feature is, it tells you here whether your model's ready, so it'll look at a number of things. Key things are, if you've got any section planes or a section box switched on, or you've got the wrong visual style switched on, things like that-- it'll just give you a nice flag telling you to change those things about your model. It'll also tell you if there are missing textures, things like that.

And then I recommend that you expand Advanced Options and untick this Extend terrain to horizon. What this will do is take the edge of your terrain here and extend it on out to the horizon with mountains and stuff. It is quite a nice look for the model, but for what you're going to be doing with it, taking it into 3ds Max Interactive later, it's data you don't need.

So you'd click go, and I won't actually wait. Here's one I made earlier. But you can see it's starting to upload there. Not going to rely on the Wi-Fi here.

And then what you get out of that is this. And you can see, as I said, the detail of the trees there is really improved. And compared to Revit, the graphical fidelity is excellent.

And you can see down in the bottom, there's your VR button. And this is where you can just jump in and have that quick preview. There's lots of nice little features in here, like you can change time of day. It's quite nice.

And it takes in your viewpoints that you would have in Revit, your camera points. And the BIM data's in there as well. Yeah. That's probably enough of that.

TOM HUGHES: Cool.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Yes. Yes. So if you do limit the size of your model, that is what will get processed.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Exactly, yeah.

AUDIENCE: [INAUDIBLE]

TOBY FOOT: Yeah, that's a key technique that we used with really large projects to make the performance better.

TOM HUGHES: [INAUDIBLE]

TOBY FOOT: Oh, sorry.

TOM HUGHES: That's OK. Unfortunately, the Alienware laptops, while they're really powerful and have a great graphics card, do not have many USB sockets. So we're having to switch between the mouse and the clicker.

So a question earlier about the size of models. We were using this in the early days for Expo. And then as we moved into Revit Live, and particularly for the AGM projects, one of the things that we found producing models from other people's projects is, we can't always control the models that we get.

As an AEC business, we design some really big things. Stadia, airports, hospitals tend to be fairly big. But these big things have a tendency to cause Revit Live some problems, or at least they did when we first started.

Actually, we're known to Nicolas Fonta, who's the product manager for Live, because we broke his server by uploading 8 gigabytes of data three times in one day. So that's a fairly intimidating way to get noticed. So, sorry. Sorry, the camera caught me off-guard there. Where are we?

TOBY FOOT: You're still-- back one.

TOM HUGHES: Back one more. Oh, yeah. Cool, OK. So you've just seen the solution. I'll just go back to the model.

So yeah. Basically, we have one of our projects. We're not going to have time to show you that project as a live demo, but we did a stadium. And that stadium was 8 gigabytes of data that we were trying to get into Live.

At that time, we just couldn't get it to work. We broke Nick's server. He didn't get annoyed at us, but was really interested in what we were doing, because he needs to know what kind of models are going in. And ours aren't the typical architect's models that are the size of a house. Ours are the size of a stadium, so there we go.

We have got a simple workaround solution for when we do get users with large models and we need to find a way through. It's not rocket science. We can either subdivide the model and upload it as two or three pieces to Revit Live, or we can go direct into 3ds Max and then into Max Interactive that way.

The drawback, perhaps, of that route is you don't get the processing of Revit Live. So the kind of functions Toby talked about, the improvements that Revit Live does automatically, you lose out on that a little bit. But really, if you're dealing with a large stadium, you can bulk upload the big piece, then think about the experience for a user, and do some small submodels for those parts of the level design.

TOBY FOOT: So we often get IFC files from architects. And a problem with IFCs in Revit is they just come through with no materials, just a big gray mass. And this isn't what we want to be presenting to our clients in the Max experience. So--

TOM HUGHES: There they are. How do we solve the IFC problem?

LIONEL GRAF: What we can say is that one of the workarounds we can put in place to overcome those kinds of challenges is, first, to bind your IFC link file into your Revit project. The thing is that Live will not be able to leverage all the information that's contained in the IFC if it's only a link file-- it's just an element referenced by your Revit project. But if you bind it into your Revit project, Live will get access to all the metadata that's contained in the IFC file.

The trickiest thing is the materials. Right now, we don't really have a solution to deal with the materials. Again, if this IFC file is bound to the Revit model, you will have access to the graphic definition or properties of the objects. But Live needs to leverage what is in the Appearance tab in Revit-- basically, what you would get if you rendered the Revit model using the cloud, for example.

And this is not something that comes through with the IFC. So for materials, for the moment, you will need to leverage something like 3ds Max, or 3ds Max Interactive directly, to tweak those materials afterwards when you get that data into your VR environment.

TOM HUGHES: So I believe the model that we are live demoing today is one where a large portion of the content came to us via IFC. So you get to see how we've worked with IFC, and how we're delivering VR from an IFC base.

TOBY FOOT: Yeah. The concrete is Revit, and then I think everything-- oh, and steelwork. Everything else is IFC.

Right. So a really important factor to consider when you're making a VR experience is locomotion, i.e., how you're going to get around. And also performance. A good locomotion experience is absolutely critical in VR.

And bad locomotion can have really bad side effects-- it can make people nauseous and put them off VR. It's something I get a lot when I'm trying to demo VR to people. They'll say, oh, no, I don't want to try it. I had a bad experience before.

So we tried several techniques for locomotion. And one of the main tips we have is-- even though you'll actually see it in games-- it's not a nice experience to have rotation that's not controlled by the headset. If I'm looking over here, you don't want to be able to push a button and make my view change without me moving my head, because it really disrupts your sense of direction.

So there's a few methods that we recommend for being best practice. Teleportation is the main one. And this is something that's used, again, in a lot of games, where-- you'll see it later-- you push a button on the controller, you'll get an arc coming out of the end of your controller like you're throwing a ball. And where that arc ends is where you'll jump to. It's really intuitive. As soon as someone presses that button, it's pretty obvious what's going to happen.
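As a rough sketch of what's happening under the hood with that arc (in 3ds Max Interactive this is handled by Flow nodes, with no coding required), a teleport target can be found by sampling a ballistic trajectory from the controller until it meets the floor. The names and numbers below are purely illustrative, not the actual implementation:

```python
# Illustrative teleport-arc targeting: step a projectile along from the
# controller and land where the arc first drops to floor height.

def teleport_target(origin, direction, speed=8.0, gravity=9.81,
                    floor_y=0.0, dt=0.02, max_steps=500):
    """origin and direction are (x, y, z) tuples; y is up."""
    x, y, z = origin
    vx, vy, vz = (d * speed for d in direction)
    for _ in range(max_steps):
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vy -= gravity * dt          # gravity bends the arc downward
        if y <= floor_y:            # arc touched the floor: teleport here
            return (x, floor_y, z)
    return None                     # arc never came down within the step budget

# Pointing slightly upward and forward lands a few metres ahead:
print(teleport_target((0.0, 1.2, 0.0), (0.0, 0.5, 0.866)))
```

This is why the interaction feels like throwing a ball: the landing spot follows directly from where you aim the controller.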

The next one is what I talked about earlier with the map pins. So points that you want to highlight, points you want to say, this is a good place, this is a nice view, this is where you'll see the part of the model we want you to see. So place those points at nice points of interest around your model.

And then you can leverage that in two ways. You can either have it so you're jumping around when you're actually in the model, or, like I was talking about earlier when the model's on the table, use it to jump from the table scale-- where you're looking over the model-- to the human scale when you're inside the model.

And then the last type of locomotion-- which, to be honest, is probably used less-- is using some sort of vehicle. So you might be driving in a car. You might be on a Segway or something. Something that makes the user feel that the locomotion's OK. They're not just being dragged along. There's something to ground their sense of movement.

Things that we really don't recommend are free movement-- like you'd have in a normal first-person shooter, pushing a direction to move without the player actually moving-- and flying. Taking off and flying around the model in first person can be a really nauseating experience.

And of course, I forgot to mention one form of locomotion that's probably the most obvious. Walking around is the absolute best way to move around in your model. You're obviously limited. Here, we've only got a 2-by-2 meter space for the room scale. What's that, about 6 feet or something.

But it can go up to 5-by-5 meters, and that's actually a pretty decent space. And you can really play with that a lot. We made that Mott MacDonald-branded room the same size, 5 by 5 meters. So when you've got that maximum play space, you never need to teleport. You can just walk around absolutely freely. OK.

TOM HUGHES: And we're doing a really quick demo, aren't we, on some of the locomotion just so you can see. It's often hard to conceptualize some of those things just looking at a screen, talking about them. It's easy to start to see--

TOBY FOOT: So you're going to narrate with me.

TOM HUGHES: Yeah. We good to go?

TOBY FOOT: Should be. Can you hold the mic?

TOM HUGHES: Yeah, of course. It's the bit you've all been waiting for, the live demo.

TOBY FOOT: Well, I hope I've set the boundaries up right. Otherwise, I'm going to be walking into stuff.

TOM HUGHES: Cool.

TOBY FOOT: The other way around. OK. So here, you can see the branded room, as we talked about, and the model that's on the table. So I can show you a bit about our UI here.

TOM HUGHES: Actually, I think we cover that later.

TOBY FOOT: We do. But I need to use it to access the viewpoints. So this is the POI, the Point of Interest viewpoints I was talking about. So you have a laser on your controller. You select the one you want to go to. There we go. And yeah, you'll jump in.

And the next form of locomotion I'll show you is that teleportation. So here you go. Can you see the arc? And you might be able to see the white square as well.

And if I walk around, you'll see that I'm walking around in that square. So that shows you your actual room that you're standing in. So if I stick that there, I'll be walking out over the edge. So there we go.

And you've also got the map pins, like you did at the table scale, to jump around. And now I'm outside. OK.

TOM HUGHES: Really cool. Yeah, thanks. We'll come back to the model in a little bit later. We'll show you all the functions that we've been building with Lionel once we've gone through some of the challenges that we faced early on with our approach.

TOBY FOOT: OK. So as I mentioned earlier, another factor that's really important to a good experience is performance. It can really affect users in the same way that bad locomotion can-- make you nauseous, make for a bad impression.

So the recommended minimum frame rate that we want to hit is 90 frames per second. And that's the refresh rate of the VIVE, so matching that will give you a nice, smooth experience. As Lionel talked about earlier, there's a number of ways that you can get a good frame rate.
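That 90-frames-per-second target is easiest to reason about as a per-frame time budget. As a quick illustrative calculation (not tooling from the talk, just arithmetic):

```python
# The 90 Hz refresh target translates into a hard per-frame time budget.
# A hypothetical helper for judging profiler readings against that budget.

def frame_budget_ms(target_fps=90):
    """Milliseconds available to render one frame at the target rate."""
    return 1000.0 / target_fps

def meets_target(frame_time_ms, target_fps=90):
    return frame_time_ms <= frame_budget_ms(target_fps)

print(round(frame_budget_ms(), 1))   # 11.1 -> about 11 ms per frame at 90 Hz
print(meets_target(9.5))             # True: comfortably within budget
print(meets_target(16.7))            # False: a 60 fps frame time misses VR spec
```

Everything that follows, baked lights and simplified geometry alike, is about keeping each frame inside that roughly 11-millisecond window.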

And 3ds Max Interactive has a lot of these functions built in. One would be baking lights. And if you don't know about game development, basically, baking lights is-- you'll see here where the lights shine on the wall.

If that was a real-time light, then you'd get shadows and it would actually be calculating the light in real time. But if nothing's actually going to cast a shadow there, we can just create the texture with the lighting already painted on. And that's what baking the lights is.

And that really reduces the performance hit. Especially in a large model like that stadium model, we had thousands and thousands of lights in that model. And there was just absolutely no way even out of VR that you could get the kind of performance you need.

And like we talked about earlier, geometry is also a big factor. So like Lionel said, in that stadium model, we had thousands and thousands of seats. And if that's a high-polygon model, you're just creating massive amounts of complexity.

And often, there's no real need. You can get away with much less. And things that are hidden, like ductwork-- it's easy just to switch those off. OK.

TOM HUGHES: Cool. So one of the things we wanted to talk about was brand identity. Clearly, when we looked at the user interface of our first Stingray project, we'd got a fixed UI, so that Mott MacDonald brand was evident all over the place.

I suppose you have to go back and ask why brand identity is important. And for me, it's how your customer-- and your customer's customer-- knows who you are. There's a few ways to think about brand identity within VR.

We trialed some early approaches, like a floating UI with Mott MacDonald visible within the model, or putting Mott MacDonald-branded assets within the client's model space. We felt it was a little bit of an intrusion into the client's experience.

So we worked with Mott to come up with some concepts of how we could do this differently. And you can see the Mott MacDonald space there, and like Toby said, for today's UI, we've branded that space with Autodesk. Clearly, when we're working in a joint venture, or for a specific client, because this is just a VR space, we can swap out the identity. The concept is a room-scale space with a tabletop model inside it.

You start your experience in that room, so you know who you're working with, what the project context is. You also can use that room as an opportunity to present key project facts on a wall. So there's lots of things you can do in that room space before you then enter the model, actually experience things at human scale.

So we've used this, and we're finding that it's a really, really good way of setting the scene of your experience before being in your experience, which is a really important point when it comes to user experience within virtual reality.

AUDIENCE: When did you set that room experience in each model? When was this--

TOM HUGHES: So that's its own model within the level design. So we start the experience there, and we allow the users to then jump into a new level to start their experience at human scale.

AUDIENCE: Thank you.

TOM HUGHES: So Lionel's going to give a little bit of an introduction in terms of how we approach the actual in-model user interface, and how we dealt with some of the challenges around that.

LIONEL GRAF: So as Tom said, one of the key tasks in this AGM project was to showcase a brand. And this VR room with the branded [INAUDIBLE] on the wall is a good introduction. But as soon as you jump into the human scale level, all of that is hidden.

So the VR menu-- we developed a kind of UI that is attached to the controller. You saw that briefly with Toby before. And basically, you are carrying all the functions with you. As soon as you press a button, it will bring up a menu that gives you access to some functions. It can be a basic function, like toggling the viewpoints, or things more complicated like taking measurements or--

TOM HUGHES: So when we do the live demo, we hope we get to show all of the functions that we built in our [INAUDIBLE].

LIONEL GRAF: But it was something quite interesting to have, to carry the Mott MacDonald brand as well.

TOM HUGHES: So we have one more slide before we're going to do our full VR demo. So Toby's just going to talk for a second about the human scale experience. So that's when you are within the model at the full scale of the asset.

TOBY FOOT: So as Tom was saying, the human scale level is about actually experiencing the project in first person, and trying to give the user as authentic an experience of what it's like to be in that place as possible. So we don't want any branding in that space. We want it to be pure-- just seeing what the actual Revit model, or whatever model, is. Because it's about the client's project in this space. It's not about us.

And a nice feature that we have in here-- you saw those pins. When you place them on the table, they automatically link to the same locations at human scale. That's just a little bit of that Flow code to link those together. So you've got a nice, seamless experience. You can jump around to the same points, and you don't have to manually line them up for consistency between the two.
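Conceptually, that linking works because the tabletop model is just the full-scale model under a uniform scale and offset, so a single transform maps a pin between the two spaces. A hedged sketch of the idea (our version lives in Flow nodes; the function names and the 1:100 scale here are made up for illustration):

```python
# Map a pin between tabletop-model coordinates and full-scale coordinates.
# The table model is the world model uniformly scaled and offset, so the
# same transform (and its inverse) keeps both sets of pins in sync.

def table_to_world(pin_pos, table_origin, scale):
    """Pin placed on the tabletop model -> full-scale coordinates."""
    return tuple((p - o) / scale for p, o in zip(pin_pos, table_origin))

def world_to_table(world_pos, table_origin, scale):
    """Full-scale location -> where its pin sits on the tabletop model."""
    return tuple(w * scale + o for w, o in zip(world_pos, table_origin))

# A 1:100 model sitting at (2, 1, 3) in the branded room:
origin, s = (2.0, 1.0, 3.0), 0.01
world = table_to_world((2.5, 1.1, 3.2), origin, s)
print(tuple(round(v) for v in world))        # (50, 10, 20)
back = world_to_table(world, origin, s)
print(tuple(round(v, 6) for v in back))      # (2.5, 1.1, 3.2)
```

Because the mapping is exact in both directions, a pin dropped at table scale and the point you teleport to at human scale can never drift apart.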

TOM HUGHES: OK. So we're going to try to show you as many of these experiences as we can. Obviously, it's taking a bit of time in terms of changing the model and things, so we're running short on time. But Toby's going to quickly jump into VR, and we're going to start to see what we've built with Lionel over the last, I suppose, eight or so months. Not continuous working, but as we've identified the need for different features within different models.

One of the things that's really key to understand about these features is, although we're building them into a specific experience because of a specific requirement, we're actually building a library of functions. So when we start a new project with a new client, we know that we've got these functions. We can bring those in because they're totally reusable bits of code within the Flow diagrams. We just bring that in.

So it's not that every time we do it, we're building these things from scratch. It's like modules-- here's a functionality that sits within the template. That was weird.

[CHUCKLING]

So here we are in our model. Toby's currently in the room scale, and just able to look around. One of the first things you'll do is load up the UI and then look to place the information flags on the table.

So you can see the UI there, selecting a flag, and you can place a flag on the model and see how it's tracking Toby as he walks around. We can populate these flags with bits of project information, preset them so that you can understand what you're looking at in that architectural table space mode.

The next thing you're likely to do is probably want to actually see things a little bit closer. So you can go from this room scale and jump into human scale. There we go. So you're now inside the model.

Here, you can see the teleportation, just jumping around. This allows you to move forward, so you're still within the room scale of the VIVE environment, but you can then walk around and then jump to the next place.

We found that method of locomotion was the best experience for the user. We found it was less likely to cause any sort of nausea. But you can do some pretty big jumps. So yeah, you can see you can get quite far there.

OK, so yeah. Let's have a look at the room data. So one of the things about working with Live is we're bringing data through from Revit. And you can see in the UI that we've got access to some of the parameters that are contained within the objects.

We do some work to identify what we prioritize in terms of the display. We couldn't show all the parameters, because clearly, that would overwhelm you. Cool. It's nice. It's working well. We got through there.

So this is something that Lionel's been working on with us most recently. And this is changing materials. It's really just textures, so we're not really changing anything in the Revit model. But it allows you to walk through live with a client and look at what different finishes might look like just by painting your model in virtual reality.

TOBY FOOT: You can make it out of glass.

TOM HUGHES: Oh, wow. So I'd not seen that one before. That's really cool. One of the things with this, at the moment, this isn't a round-trip workflow, so we're not pushing any of this data back. But clearly, as we start to mature our functions, that is something that we've already started to have discussions around. So you can see what this might look like if you were to see it again this time next year.

What about dimensioning? So we're not talking about detailed design reviews, but you might want to check a few key dimensions. Maybe something doesn't quite look right. Well, here's point-to-point dimensions live within a VR environment. You see how quick that is? You can do that anywhere in the model.

TOBY FOOT: And this is orthographic. So it'll just send a beam straight out and do 90-degree--

TOM HUGHES: Oh, yeah, of course. Yeah, yeah, yeah.

TOBY FOOT: And then if you want to delete them, you can just click on it.
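The orthographic mode Toby describes amounts to snapping the controller's pointing direction to the nearest world axis before casting the measurement beam. A rough illustrative sketch of that snapping (not the actual Flow implementation):

```python
# Snap a pointing direction to the nearest of the six axis-aligned unit
# vectors, so a dimension beam always goes straight out at 90 degrees.

def snap_to_axis(direction):
    axes = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
            (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    # The axis with the largest dot product is the closest in angle.
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return max(axes, key=lambda axis: dot(direction, axis))

print(snap_to_axis((0.1, -0.2, 0.97)))   # (0, 0, 1): mostly forward
print(snap_to_axis((-0.8, 0.1, 0.3)))    # (-1, 0, 0): mostly to the left
```

Snapping like this is what makes hand-held measurements repeatable: a slightly shaky controller still produces a clean 90-degree dimension.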

TOM HUGHES: So one of the things that we've been thinking about use cases for something like this could be perhaps doing inspections, but virtual inspections to measure out dimensions. What are we going to go to now?

TOBY FOOT: Issue.

TOM HUGHES: Oh, of course. So this is, yeah, an ability to just spawn objects. So we were thinking about spawning. We've seen quite a few people spawning chairs, tables, that kind of thing. But we were thinking, we like to do design reviews where we want to identify what might be a potential risk, potential hazard.

We've been talking again about opportunities that we could perhaps enhance that. So not only just a marker but perhaps take some text with some voice-to-text and start tagging some detail there. So I'm thinking about some of those in the future too.

TOBY FOOT: And you'll see that they scale, so.

TOM HUGHES: Oh, yeah, of course.

TOBY FOOT: It's quite big there. But as I go up, it scales down, so it's not intrusive.
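One common pattern for distance-responsive markers like these, shown purely as an illustration (it may not match the exact behavior in the Flow setup), is constant angular size: scale the marker with viewer distance so it reads the same on screen wherever you stand.

```python
# Scale a marker in proportion to viewer distance so its angular (on-screen)
# size stays roughly constant. Parameter names and units are assumptions.

def marker_scale(viewer_pos, marker_pos, size_at_1m=1.0, min_scale=0.2):
    dist = sum((v - m) ** 2 for v, m in zip(viewer_pos, marker_pos)) ** 0.5
    return max(min_scale, size_at_1m * dist)

print(marker_scale((0, 0, 0), (0, 0, 5)))    # 5.0
print(marker_scale((0, 0, 0), (0, 0, 10)))   # 10.0 -> same size on screen
print(marker_scale((0, 0, 0), (0, 0, 0.1)))  # 0.2 -> clamped near the viewer
```

The clamp keeps a marker from shrinking to nothing when you walk right up to it, which is the "not intrusive" balance Toby mentions.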

TOM HUGHES: So those are some of the things we've had to think about in terms of working with VR and making the user experience responsive. It takes a little bit of thought, but actually, all of the coding for that is within those Flow nodes. So it's not us having to code. It's us just thinking of how we use those nodes to get the right kind of experience.

TOBY FOOT: And then last but not least, if we push the Home button.

TOM HUGHES: Oh, yeah, of course. How do I get back out? Oh, yeah, I push Home and I'm back in my branded room. So I think that's our live demo of what we've done. I hope it's-- yeah.

[APPLAUSE]

Very kind, thank you. So I'll just let Toby switch back to the last few slides in our presentation.

TOBY FOOT: How we doing?

TOM HUGHES: Yeah, I'll just skip through the last-- so the project template is basically, once we've got this template, we can store this. We can use it on any project we want. These are the bits that build the template, but it's relatively straightforward.

So the very last bit-- what were our successes? Really, this was a story about the approach for the AGM. Success for us: we hit the deadline for the AGM. We had six of these models available, covering all of our global sectors-- transport models, building models, water resource models. That's really down to the hard work of Toby.

And that's a thank-you from me for making that happen. We wowed our stakeholders. And we've engaged with countless project leaders in that space-- they came to us with their own needs for VR experiences, their own ideas about functions we could build in the future. So having what we can do visible has allowed us to really think about what we can do next.

I guess the other success for us has been-- we've launched within Mott MacDonald a VR portal, where anyone within Mott MacDonald can come and find information about VR. And as Toby mentioned earlier, we've gone from zero to, within the space of four months, 13 HTC VIVE headsets, with offices now seeing the opportunity and purchasing-- 14. Have we got another one? Some of our colleagues here in the front row.

But that's not just-- we've got places like Hong Kong and Singapore. Toby also traveled out to one of our offices in India to look at how VR could be used for collaboration on design between design centers and designers either in the UK or in the US. So that's one of the things we're thinking is a real big opportunity for us.

And yeah, I guess the last success is the opportunity of events like this, but also things like the National Infrastructure Forum in the UK. We had a similar sort of setup and were presenting in the exhibition hall there. So because we can demonstrate that we can do this, we've got the opportunity to show it to a lot of people now, which is really, really good.

So we wanted to just recap the tips and tricks, because we know that you've seen this, and you really want to try this now. So we're keen that you come away from here enthusiastic, but also with the right knowledge to actually do this yourself.

TOBY FOOT: We're pretty much out of time, so I think what would probably be best is you can read these in more detail afterwards. But just to reiterate the key one: frame rate is absolutely the number one thing you should consider when you're creating your experience.

TOM HUGHES: And I think we have discussed most of the things that we are really thinking about for how we move this forward for next time. And that's something I'd just like to remind you of. I know everyone that's here will get a reminder to rate the class. And if you want to see us here next year, talking about what we're doing in the next 12 months, just make sure you rate the class, because that's how we get back here next year. So we'd really appreciate that.

All right. That's it, yeah. So thank you very much.

[APPLAUSE]

HubSpot
오토데스크는 고객에게 더욱 시의적절하며 관련 있는 이메일 컨텐츠를 제공하기 위해 HubSpot을 이용합니다. 이를 위해, 고객의 온라인 행동 및 오토데스크에서 전송하는 이메일과의 상호 작용에 관한 데이터를 수집합니다. 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 이메일 확인율, 클릭한 링크 등이 포함될 수 있습니다. HubSpot 개인정보취급방침
Twitter
오토데스크는 Twitter가 지원하는 사이트에 디지털 광고를 배포하기 위해 Twitter를 이용합니다. 광고는 Twitter 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Twitter에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Twitter에 제공하는 데이터를 사용합니다. Twitter 개인정보취급방침
Facebook
오토데스크는 Facebook가 지원하는 사이트에 디지털 광고를 배포하기 위해 Facebook를 이용합니다. 광고는 Facebook 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Facebook에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Facebook에 제공하는 데이터를 사용합니다. Facebook 개인정보취급방침
LinkedIn
오토데스크는 LinkedIn가 지원하는 사이트에 디지털 광고를 배포하기 위해 LinkedIn를 이용합니다. 광고는 LinkedIn 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 LinkedIn에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 LinkedIn에 제공하는 데이터를 사용합니다. LinkedIn 개인정보취급방침
Yahoo! Japan
오토데스크는 Yahoo! Japan가 지원하는 사이트에 디지털 광고를 배포하기 위해 Yahoo! Japan를 이용합니다. 광고는 Yahoo! Japan 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Yahoo! Japan에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Yahoo! Japan에 제공하는 데이터를 사용합니다. Yahoo! Japan 개인정보취급방침
Naver
오토데스크는 Naver가 지원하는 사이트에 디지털 광고를 배포하기 위해 Naver를 이용합니다. 광고는 Naver 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Naver에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Naver에 제공하는 데이터를 사용합니다. Naver 개인정보취급방침
Quantcast
오토데스크는 Quantcast가 지원하는 사이트에 디지털 광고를 배포하기 위해 Quantcast를 이용합니다. 광고는 Quantcast 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Quantcast에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Quantcast에 제공하는 데이터를 사용합니다. Quantcast 개인정보취급방침
Call Tracking
오토데스크는 캠페인을 위해 사용자화된 전화번호를 제공하기 위하여 Call Tracking을 이용합니다. 그렇게 하면 고객이 오토데스크 담당자에게 더욱 빠르게 액세스할 수 있으며, 오토데스크의 성과를 더욱 정확하게 평가하는 데 도움이 됩니다. 제공된 전화번호를 기준으로 사이트에서 고객 행동에 관한 데이터를 수집할 수도 있습니다. Call Tracking 개인정보취급방침
Wunderkind
오토데스크는 Wunderkind가 지원하는 사이트에 디지털 광고를 배포하기 위해 Wunderkind를 이용합니다. 광고는 Wunderkind 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Wunderkind에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Wunderkind에 제공하는 데이터를 사용합니다. Wunderkind 개인정보취급방침
ADC Media
오토데스크는 ADC Media가 지원하는 사이트에 디지털 광고를 배포하기 위해 ADC Media를 이용합니다. 광고는 ADC Media 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 ADC Media에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 ADC Media에 제공하는 데이터를 사용합니다. ADC Media 개인정보취급방침
AgrantSEM
오토데스크는 AgrantSEM가 지원하는 사이트에 디지털 광고를 배포하기 위해 AgrantSEM를 이용합니다. 광고는 AgrantSEM 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 AgrantSEM에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 AgrantSEM에 제공하는 데이터를 사용합니다. AgrantSEM 개인정보취급방침
Bidtellect
오토데스크는 Bidtellect가 지원하는 사이트에 디지털 광고를 배포하기 위해 Bidtellect를 이용합니다. 광고는 Bidtellect 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Bidtellect에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Bidtellect에 제공하는 데이터를 사용합니다. Bidtellect 개인정보취급방침
Bing
오토데스크는 Bing가 지원하는 사이트에 디지털 광고를 배포하기 위해 Bing를 이용합니다. 광고는 Bing 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Bing에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Bing에 제공하는 데이터를 사용합니다. Bing 개인정보취급방침
G2Crowd
오토데스크는 G2Crowd가 지원하는 사이트에 디지털 광고를 배포하기 위해 G2Crowd를 이용합니다. 광고는 G2Crowd 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 G2Crowd에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 G2Crowd에 제공하는 데이터를 사용합니다. G2Crowd 개인정보취급방침
NMPI Display
오토데스크는 NMPI Display가 지원하는 사이트에 디지털 광고를 배포하기 위해 NMPI Display를 이용합니다. 광고는 NMPI Display 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 NMPI Display에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 NMPI Display에 제공하는 데이터를 사용합니다. NMPI Display 개인정보취급방침
VK
오토데스크는 VK가 지원하는 사이트에 디지털 광고를 배포하기 위해 VK를 이용합니다. 광고는 VK 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 VK에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 VK에 제공하는 데이터를 사용합니다. VK 개인정보취급방침
Adobe Target
오토데스크는 사이트의 새 기능을 테스트하고 이러한 기능의 고객 경험을 사용자화하기 위해 Adobe Target을 이용합니다. 이를 위해, 고객이 사이트를 방문해 있는 동안 행동 데이터를 수집합니다. 이 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 오토데스크 ID 등이 포함될 수 있습니다. 고객은 기능 테스트를 바탕으로 여러 버전의 오토데스크 사이트를 경험하거나 방문자 특성을 바탕으로 개인화된 컨텐츠를 보게 될 수 있습니다. Adobe Target 개인정보취급방침
Google Analytics (Advertising)
오토데스크는 Google Analytics (Advertising)가 지원하는 사이트에 디지털 광고를 배포하기 위해 Google Analytics (Advertising)를 이용합니다. 광고는 Google Analytics (Advertising) 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Google Analytics (Advertising)에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Google Analytics (Advertising)에 제공하는 데이터를 사용합니다. Google Analytics (Advertising) 개인정보취급방침
Trendkite
오토데스크는 Trendkite가 지원하는 사이트에 디지털 광고를 배포하기 위해 Trendkite를 이용합니다. 광고는 Trendkite 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Trendkite에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Trendkite에 제공하는 데이터를 사용합니다. Trendkite 개인정보취급방침
Hotjar
오토데스크는 Hotjar가 지원하는 사이트에 디지털 광고를 배포하기 위해 Hotjar를 이용합니다. 광고는 Hotjar 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Hotjar에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Hotjar에 제공하는 데이터를 사용합니다. Hotjar 개인정보취급방침
6 Sense
오토데스크는 6 Sense가 지원하는 사이트에 디지털 광고를 배포하기 위해 6 Sense를 이용합니다. 광고는 6 Sense 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 6 Sense에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 6 Sense에 제공하는 데이터를 사용합니다. 6 Sense 개인정보취급방침
Terminus
오토데스크는 Terminus가 지원하는 사이트에 디지털 광고를 배포하기 위해 Terminus를 이용합니다. 광고는 Terminus 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Terminus에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Terminus에 제공하는 데이터를 사용합니다. Terminus 개인정보취급방침
StackAdapt
오토데스크는 StackAdapt가 지원하는 사이트에 디지털 광고를 배포하기 위해 StackAdapt를 이용합니다. 광고는 StackAdapt 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 StackAdapt에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 StackAdapt에 제공하는 데이터를 사용합니다. StackAdapt 개인정보취급방침
The Trade Desk
오토데스크는 The Trade Desk가 지원하는 사이트에 디지털 광고를 배포하기 위해 The Trade Desk를 이용합니다. 광고는 The Trade Desk 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 The Trade Desk에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 The Trade Desk에 제공하는 데이터를 사용합니다. The Trade Desk 개인정보취급방침
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

정말 더 적은 온라인 경험을 원하십니까?

오토데스크는 고객 여러분에게 좋은 경험을 드리고 싶습니다. 이전 화면의 범주에 대해 "예"를 선택하셨다면 오토데스크는 고객을 위해 고객 경험을 사용자화하고 향상된 응용프로그램을 제작하기 위해 귀하의 데이터를 수집하고 사용합니다. 언제든지 개인정보 처리방침을 방문해 설정을 변경할 수 있습니다.

고객의 경험. 고객의 선택.

오토데스크는 고객의 개인 정보 보호를 중요시합니다. 오토데스크에서 수집하는 정보는 오토데스크 제품 사용 방법, 고객이 관심을 가질 만한 정보, 오토데스크에서 더욱 뜻깊은 경험을 제공하기 위한 개선 사항을 이해하는 데 도움이 됩니다.

오토데스크에서 고객님께 적합한 경험을 제공해 드리기 위해 고객님의 데이터를 수집하고 사용하도록 허용하시겠습니까?

선택할 수 있는 옵션을 자세히 알아보려면 이 사이트의 개인 정보 설정을 관리해 사용자화된 경험으로 어떤 이점을 얻을 수 있는지 살펴보거나 오토데스크 개인정보 처리방침 정책을 확인해 보십시오.