AU Class

Generative Design Product Briefing


Description

This session is all about what's new and what's coming for Autodesk Generative Design technology. Join us to learn about the most recent developments, how you can try the technology, and what will be coming in the near future for generative design and automated manufacturing.

Key Learnings

  • Discover what generative design is and what it is not
  • Discover the latest in generative design technology
  • Understand the future directions of generative design technology
  • Discover how to automate manufacturing using Autodesk's Generative Design techniques

Speakers

  • Doug Kenik
    Doug Kenik is a Product Manager for generative design strategies within Autodesk, Inc. He holds both an MS and a BS in mechanical engineering from the University of Wyoming, where he spent his graduate career developing high-fidelity micromechanics models for composite material simulation. Prior to working at Autodesk, Doug was a developer and application engineer at Firehole Composites, where he helped implement new technologies for composite simulation and define next-generation enhancements for use within existing products.
  • Michael Smell
    Mike is a Sr. Product Manager on the Fusion 360 team at Autodesk. He has been working on Fusion 360 for nearly 7 years and is currently responsible for the Generative Design portfolio. He has previous experience as a Technical Account Manager in Autodesk's Manufacturing Named Accounts program, where he worked with customers to help them identify and solve business challenges with Autodesk solutions. Mike has spent nearly 17 years in the CAD and CAE industry, starting his career at Algor, Inc. in 2006, which was acquired by Autodesk in 2009. Mike holds a bachelor's in mechanical engineering from the Pennsylvania State University, a master's in mechanical engineering from the University of Pittsburgh, and has completed a certification for Machine Learning in Business from the MIT Sloan School of Management. Mike has been a regular presenter at Autodesk University since 2009.
    Transcript

    VIK VEDANTHAM: Well, thank you for attending what feels like almost day three right now. But it's really only day one of our fantastic AU here. Thanks for making it. Generative design, obviously, is a topic that's near and dear to a lot of us here. I'll quickly introduce myself and our team that's out here. We're all going to take turns in taking the stage in what's going to feel like a five-minute discussion.

    So my name is Vik Vedantham. I'm part of the Fusion 360 business strategy team. So specifically, I focus on generative design, simulation, and some of our manufacturing technologies. I'm going to quickly allow everybody to introduce themselves. And then, we'll get things started. So Morgan?

    MIKE SMELL: My name is Mike Smell. I'm Product Manager on the Fusion 360 team focused on generative design and simulation.

    MORGAN FABIAN: My name's Morgan Fabian. My team does AI projects for the Fusion team. I've been working a lot with these all-stars on the generative design product.

    DOUG KENIK: Hey, everyone. I'm Doug. I am the Product Manager for generative design within Autodesk. So thanks for coming out. We also have Brian over here. Brian Frank is part of this presentation as well. He's the go-to guy for all things generative. So if you don't like anything about this presentation, he's the guy to talk to.

    BRIAN FRANK: That's why I'll be in the back.

    VIK VEDANTHAM: I'll share the blame with you. And then later on, during this discussion, we'll also have a special customer feature. We'll talk more about that as well. OK.

    So, I'm going to kick off with a quick safe harbor. Towards the end of this presentation, we will touch on a few futures, just to give you a sense of where this technology is headed and some of the topics we're thinking through. The standard safe harbor clause applies, so please be mindful that we're showing some things that may or may not eventually make it into the product. But it's really good for you guys to get your eyes on it and perhaps even take the opportunity to give us some feedback on some of the content. All right?

    So I'll begin with the whole premise of why Autodesk is in the business of generative design. Now, you've heard a lot of the discussions right from the main stage presentation this morning, but you can really sum it up in terms of two big things that we're noticing in the community. Right? One is the topic of scale, and the other one is the topic of scope. All right.

    In terms of scale, these are very common numbers that pretty much every industry talks about. The need for products is growing on a day-to-day basis. Simple numbers that you can think about are twice the growth, three times the growth, and five times the growth. And these are not just business expectations, but actually growing needs in the community per se. The other topic is one of scope. Scope is interesting, because one number that we feature out here is just in a small segment of the automotive industry. We see that the need for new products is growing tremendously. And the problem is not just the rate of introduction of new products. It's actually the bespoke nature of these new products, right? It's almost like every customer wants their own design per se.

    So the challenge that the manufacturing community faces is, how do you deal with the concept of growing scale, while at the same time trying to deal with the changing needs of the customer? I will also throw in one other market trend that we are noticing, which is the evolving nature of work. Now, you had a couple of presenters on the stage talk today about how the nature of work is going to change the power of automation and things like that. So we are very cognizant of the industry trends in terms of robotics, in terms of automation, in terms of cloud as a computing technology, as well as the power of things like artificial intelligence and machine learning. Right? So we're trying to bring all of these things together for us to help deliver technologies out in the market.

    Now the reality, though, is that there is a fourth dimension to this problem. Now, as the end customer's needs are growing, the reality is that the vast majority of how we design products has not changed that much. Right? So it's a very linear process, if you think about it. You've got a design, or a design engineer, or a design team that's thinking about answers to a design problem. And in most cases, they are operating within their domain knowledge, their level of expertise. And they're coming up with a few concepts within a defined time period. And of course, that evolves through the process of design evolution. It goes through manufacturability constraints, perhaps some validation, and then eventually it makes it out to production.

    The key thing here is this all costs money, right? I mean, the entire process-- this linear process. And I'm representing it in terms of two vectors, if you notice. One axis is really the explorative capabilities of the team. And the bottom axis is really how long it takes for that entire process to happen. Now, this is a typical representation. And some of your teams may be more progressive in thinking about how to disrupt these paths.

    But the problem is, if you set this against those growing needs that we spoke about earlier, the reality is this actually has a lot of burden on the manufacturers themselves, right? It impacts their business bottom line, your business bottom line in terms of revenue, as well as internal costs. If you're not innovating at a reasonable pace to stay ahead of market expectation and to deliver customer needs, that's going to impact the ability to actually differentiate yourselves in the market. And, of course, it means internal costs. Because the moment you start going down that waterfall, the cost of trying to bring in change starts to grow exponentially.

    Now, the good news is that we have the ability to influence this product development process. Right? We can take advantage of technologies to actually change the game. And those technologies are already in play. And we're going to talk about generative design in that context.

    The reason we talk about this is because there is no one single solution to a design problem. Every design challenge that a manufacturing team is going to face has multiple answers. Most cases, you're kind of zeroing in on what you think is the best answer within the given time window. And you come out with what is perhaps an incremental change to the design. What generative design attempts to do is to actually expand that horizon so that it presents all possible design solutions to a given problem. We call this a design exploration.

    Now, our objective is that we change this game in terms of that same two-vector representation. So if we can take advantage of cloud computing, and we can present all possible design choices early during the design phase-- so this is when you're ideating or when you're conceptualizing-- we've taken all of those constraints that you would impose and come back with multiple answers to the design problem. And we do it in a process and a performance aware state, meaning we are aware of what manufacturing methods you have at your disposal. We are able to then, very quickly, fast track and get you to that production phase. Because you have now taken into consideration the manufacturability and the product performance and longevity upfront, during the design phase itself. This really results in two gains. One of them is you've expanded your exploration arm. So now, you're rapidly innovating very early in the design phase. And of course, because you're now fast tracking the process, you end up with a ton of productivity gains.

    Tons of strategic advantages for you as a company. We talk about new product innovation. We talk about a ton of activities. We'll actually present a couple of examples today of what's working. We help fast track the world of the convergence of design and manufacturing, as we start taking into account design and manufacturing processes upfront. And finally, it also gives you the ability to explore what other manufacturing methods could be used. Or what other materials could I potentially use to solve that design problem?

    So I'm going to stop here with the high level vision. And I'm going to turn it over to Mike. Mike's going to take you through the more tangible way to take this abstract information. He's going to show you the product, and then, we'll use that as a way to kind of guide towards the rest of the conversation. Mike?

    MIKE SMELL: Thanks, Vik. So as Vik said, we're going to take these high level concepts around generative design and look at how we've brought this to market so that folks can start to take advantage of that. So one thing I will point out is, you heard this morning there's a couple of different tracks where folks are exploring generative design, both in the manufacturing space and in the architecture, engineering, and construction space. What we're going to talk about today is primarily focused on the manufacturing space.

    So we'll start by talking about Fusion 360 and generative design. So what we've gone off and built is really about design exploration. This technology is, as Vik said, really suited for allowing you to explore all of the potential solutions that may exist for an engineering problem. What we're generating is pretty unique as well. What you're getting out of the system is multiple CAD-ready solutions. And CAD-ready is really important. Because what we've managed to do in the past year or so that we've been working on this and brought to market is something that's pretty special to Autodesk in the amount of editability that exists in a CAD model.

    If you're familiar with Fusion 360 and our T-splines functionality, what you'll see-- and I'll demonstrate this a bit-- is the thing that this system is generating, this organic, free-form shape. You're going to have the ability to push and pull and adjust the sizing of that. Make it more aesthetically pleasing to you, but also deal with some manufacturing things that may exist in that geometry, if you need to make those other tweaks.

    The last point that's really unique about this delivery mechanism inside of Fusion 360 is it's connected into the rest of what Fusion 360 is as a platform. You're right there with modeling tools. You're right there with simulation tools to validate downstream of how these designs perform. You're connected to the manufacturing workspace, rendering product documentation, all those sorts of things. So it's in one, cohesive experience.

    Now let's take a look at what that really means. So I've got a video here that I'll navigate through. So here we are. You see there's a new workspace inside of Fusion 360 called generative design. And this is where we will start the entire process of designing or setting up, with the intent to generatively design part of this handlebar assembly. And what you'll see in some other materials is a ride-on golf cart.

    So here, inside of the generative design workspace, we've got a modeling environment kind of nested in there, which is a safe space for you to go edit and change the design. We're developing our obstacles and preserves. That says where we need to keep material, where we need to not have material. We've got all of our tools for defining loads and constraints very easily in the context of the design. So if you've done a basic, linear static stress simulation in the past, this should all be very, very familiar.

    We talked about manufacturing aware. So we can look at unrestricted, additive methods, as well as subtractive methods for three and five axis milling. So the user can set up multiple configurations for how they may develop strategies for machining that part in each machining scenario.

    We also have the ability to investigate multiple materials. So again, this isn't about just one solution, but many potential solutions that you as the user can consider. All of this goes off to our cloud to compute in parallel. We're looking at all of these obstacles. And in addition to that, we can start to do more what-if scenarios. So I talked about this kind of built in modeling workspace in the context of generative design. Here, we can say, well what if we give the solver some kind of basis to go on for the way it generates the shape? Once that's all off computing, we've got our explore environment that helps you look at all of these outcomes. Start to do some decision-making around what are the manufacturing methods, what are the weight behaviors, what are the structural behaviors. All of those sorts of things-- we can look at that in context of one another, and then choose which is the outcome that's most likely to solve our problem.
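The decision-making Mike describes in the explore environment-- weighing manufacturing method, weight, and structural behavior against one another-- amounts to filtering and ranking candidate outcomes. As a rough illustration only (the outcome fields, names, and thresholds below are hypothetical, not Fusion 360 data or API), that pass might look like:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    method: str           # e.g. "additive", "5-axis milling", "unrestricted"
    mass_kg: float
    safety_factor: float

def pick_candidates(outcomes, max_mass, min_safety, allowed_methods):
    """Keep outcomes that satisfy every constraint, lightest first."""
    keep = [o for o in outcomes
            if o.mass_kg <= max_mass
            and o.safety_factor >= min_safety
            and o.method in allowed_methods]
    return sorted(keep, key=lambda o: o.mass_kg)

# Hypothetical outcomes from a generative study
outcomes = [
    Outcome("A", "additive", 1.2, 2.5),
    Outcome("B", "5-axis milling", 0.9, 1.1),  # fails the safety floor
    Outcome("C", "additive", 0.8, 2.1),
    Outcome("D", "unrestricted", 0.7, 3.0),    # method we can't produce
]

best = pick_candidates(outcomes, max_mass=1.5, min_safety=2.0,
                       allowed_methods={"additive", "5-axis milling"})
# best is now [C, A]: the feasible outcomes, lightest first
```

The point is only that each outcome arrives with its method, mass, and structural behavior attached, so choosing among them is a ranking problem rather than a modeling problem.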

    Again, this is where I think that things get very, very interesting in what we're delivering with generative design. We're downloading that outcome. Here we are, back inside of the Fusion modeling workspace. And what you'll notice on the bottom here is the Fusion timeline. This is a fully parametric model where we've got true prismatic B-reps for the obstacles and preserves that were originally defining the problem. And then, the thing that the system generated is the T-spline body. And here's the area where we can go in and start to manipulate the form. We can fill holes. We can change it, make subtle tweaks to that based on how it needs to behave in our system. So a bunch of purpose-built tools to help you then kind of tweak and manipulate that design just a little bit more than what the solver may have done for you. So again, this is pretty breakthrough stuff here, as far as how editable this actually is. OK? All right.

    There we go. So I'm going to start off with one customer story. And then, as Vik mentioned, we have a special guest with us to talk about their application. I think there is nothing better than seeing how real customers are taking advantage of this tool and putting it into practice. So Penumbra Engineering is a customer of ours who's been using the generative design solutions inside of Fusion. And their application is quite unique, in that they're using a plastic type of part. The general envelope of the design is the area that they're preserving. And they're letting the generative design system go off and build the support structure internally to get the desired stiffness ratios versus the overall mass.

    So you can see here all of the webs inside of the parts have been generatively designed, rather than your traditional rib structures that you may model for a traditional plastic part. Now Vik, did we get the samples? Do we have the samples? OK. So if anybody is interested in these parts, we have 3D printed artifacts in the quad at the exhibit hall. That's all on that story. And I'll go ahead and introduce Paul Magee from Crown, the Director of Industrial Design. And they, too, have been using generative design in Fusion to do some unique things.

    PAUL MAGEE: Good afternoon. So this is Caleb Meyer. He's a member [INAUDIBLE] actually going to be able to speak more specifically around what you're going to see [INAUDIBLE] actual property. We haven't figured that out completely [INAUDIBLE] Is that better? Sorry. So, we [INAUDIBLE] for about six months now. [INAUDIBLE]

    How's that? All right. So we've been working with these guys for about six months now on generative design. And actually had some really positive outcomes right within the first half day of using the application. The problem with that was that everything we did, we can't show you, because it takes us a really long time to push things through our company. So we-- I mean, Caleb mad scrambled. We talked loosely about an idea that we said our company would never make. So that's something we can show you guys. And so we created something. Caleb and I came up with an idea that said, Crown Equipment is largely a lift truck manufacturer. And we make a lot of very large, very heavy items.

    E-commerce is having a very significant impact on the type of work that we do. So if you think about the final mile or the final delivery-- if anybody here lives in, like, New York City or a major walking commuter city-- have you ever seen how the UPS truck sometimes has to park a quarter mile or a half mile away, and then figure out how to get packages to your house? That's not exactly easy. And then, how many of you are engineers? How many of you are designers? And by designer, I mean like went to art school or some equivalent. That seems like about the correct ratio.

    So we actually don't care quite as much about how it's going to get made. We actually just care about having as many ideas as possible, very much like Vik's slide showed. So for us, it's all just about we have a really cool idea, and we want to be able to think differently than we typically could. And the very first time we used the application, it took about, literally, two hours to figure out that we could get outcomes that we never in our imagination could have possibly come up with. And you could argue, well this isn't real. And we're not going to actually print that. That's not what it was about for us. It was about being inspired to think differently about what we were going to create.

    So Caleb and I talked about developing an idea that would utilize something very much like a Segway drive motor. But what if I were to put that in some kind of a cart for a UPS driver equivalent, where it could haul 500 pounds. But it would actually remove the burden of carrying the weight around and physically adjust to them as they need it. So with that.

    CALEB MEYER: Yeah. So I'm an industrial designer, so I'm not really concerned with all the technical aspects of designing it or making it real. I'm more interested in a provocative concept, and how do I use this tool to create a lot of realistic solutions on the front end? So I didn't want to start out traditionally by designing it. I wanted to give it the least amount of information that I could to meet my problem constraints. And pretty much let the software do the heavy lifting. And as the software was generating results, what we started to do was add in different constraints and different load cases to start to manipulate the form of what we were getting.

    So what you're seeing in this slide is basically a basic setup of our project. So it's a basic two-wheel hand dolly you'd see on the back of a truck or a grocery store. And we have a starting shape there in the yellow. The blue is pretty much the 500-pound load constraint. We put those in different places across the file. The red is nothing goes here, and the green is simply let's kind of keep it within this form. Let's use this as inspiration to develop some of your solutions.

    Yeah. So this is a snapshot of some of the converged results. So you can see there's quite a diversity of results here, even within just the first file and some of the first studies that we started to do. So for this point, I think we spent about two hours creating the first model. And then after that, I think it took us about 45 minutes to get some of our first results. And once we started to get results in, then we instantly cloned our studies, added new constraints, and then started to get even more results. So it's just exponential and just really continues to grow.

    And so I should mention as well that our goal for this is to get the weight as low as possible. We're not really concerned on how we can manufacture at this point. We just want to be provocative and see how light we can actually make this thing.

    VIK VEDANTHAM: So one of the interesting things, Caleb, is you had to rethink about how you would approach solving the problem, right? Because you're not necessarily solving the problem. You're actually defining the scope of the problem. That's what you were going through. So how was the experience in trying to get up to speed on that?

    CALEB MEYER: It's kind of a mind shift, actually. It's really about not being concerned about what you're creating and more concerned about the problem you're solving. So you spend more time thinking about what an acceptable solution would be and what the requirements are to meet that. So once you have that pretty well at hand, then you can start to let the computer do all the calculations and things like that.

    VIK VEDANTHAM: Thank you.

    CALEB MEYER: Yeah. So these are kind of two of our final outcomes that you can see. Paul, how much was the blue one? I can't remember. That one was like two pounds. I think it was ABS plastic or polycarbonate. Nylon or ABS. And I think the other one was about 12? Yeah, it was 12. And these are going to hold 500 pounds of packages and be incredibly strong and rigid at the same time. These are really pretty preliminary. And we're going to keep going with this and actually take it through the whole design phase and see. At this point, we'll do the traditional design where we'll start sketching on it, and normalizing it, and figure out other ways to manufacture it.

    PAUL MAGEE: So for us, the part that's interesting about this technology is the idea that in under a business day, we have 45 valid ideas that literally came from you and me spitballing for about 30 to 45 minutes, actually going through old co-op work and thinking there were some interesting kernels in here. And frankly, the work that we'd gotten from the co-ops was not tenable. They were not valid ideas. But we just spitballed off of it. A Segway drive unit would fit in about that form factor, and we know this is a big upcoming problem. And what was interesting is, we did it just to be provocative, largely to help give Autodesk something that we could talk about that wasn't in violation of our intellectual property. Only to then show this to some of our vice presidents and have them say, well, we might actually want to do this after all.

    So it's interesting that we were trying to convince a lot of our internal engineering VPs about the value of this tool. And we're a very heavy steel and heavy manufacturing plate weld fabrication type of a company. And they didn't see the value of it. And we said, that took five hours. 45 ideas in five hours. And the simple reality is, it doesn't say you have to additive manufacture it. This is an inspiration to do it however you'd like. And then, you start to see the light bulb go off. So, that's a small step in our journey.

    VIK VEDANTHAM: Thank you very much, Paul. Thanks, Caleb. I think that was a fantastic story.

    [APPLAUSE]

    So that's a great example. And I think Caleb and Paul made some astute observations, in terms of how you could use this as an inspiration to actually move downstream. Because you know, all said and done as you take this product to the next phase after you've conceptualized, you're obviously going to work through the design evolution process. But the nice thing is, this is already tried and tested as a first pass, in terms of manufacturability as well as performance. The nice thing is, you've also done a materials exploration exercise at the same time.

    So the next part of this discussion, I'm actually going to turn it over to Doug--

    DOUG KENIK: Yes, sir.

    VIK VEDANTHAM: --and Brian. And they're going to actually move into more of the future. So now that you've got a good basis for what generative is, we're going to take a look at what the near-term future is. And then, Morgan will follow up with some of the more far-term thinking that we're kind of going through. All right? Take it away, Doug.

    DOUG KENIK: Thanks, Vik. Appreciate that. So I want to reiterate what Vik said. This is that part of the presentation where you can't hold us to anything that we say. It may or may not be true. I will show you some examples of some tech that we are working on. We have our engineering group here. They're doing a great job of continually pushing new tech out to solve real customer problems. So let's go ahead and dive into that. I'll do the near-term stuff, and then Morgan's going to go into the longer term aspect of this.

    So when we talk about the journey of generative design, we can break it up into four distinct workflows. So that would be problem definition, generative tech-- that is what is going on behind the scenes-- geometry editing-- so how can I actually use the thing? And then finally, manufacturing. I need to actually make this, right? So all of this is feasible in the Fusion 360 platform. And what I'm showing on the top of the screen is what we can do today. Right? So we have this in-canvas experience for problem definition in Fusion 360.

    Right now, though, all we can do is linear static solutions. Right? So that basically means small deformation. So we can't do nonlinear materials, high deformations, things like that. We absolutely know that these are areas that we need to look at. Because when you're talking about setting up a problem definition, there are multiple goals and objectives-- small deformation with a static stress and a factor of safety for stress failure is not always what we're designing for. That's evident. We know that. So we're working on things in that area.

    What are we working on in that area? So that would be, how do we get users through the problem definition phase as quickly as possible? So has anyone in here used generative design besides Caleb? Like three or four hands. All right. So what we were alluding to when Caleb was talking is, when you're setting up these problems, your mind has to shift. It's not about you solving the problem. It's about you defining the problem. That is weird.

    So there's this thing called obstacle geometry. Right? That is, don't put material here, because I have something here. Or I need access to something. That's a big deal. Has anyone ever tried to create, like, 10 million cylinders in a design? It's freaking hard, right? So why don't we automate a lot of these processes? We can automate it by identifying geometry. And we can also automate it through machine learning and artificial intelligence.

    So what we're doing is automating the obstacle creation. And we'll show that in the next slide. We're also setting up pre-checks within the software. So as you're going along in the software, we're letting you know, hey, you've reached a milestone here. Please continue. Has anyone put a load or constraint on a body before? It's weird, right? If you're not used to FEA, and you're not an engineer, it can be hard to describe that problem. We're trying to lower that barrier to entry.

    So the other thing that we're doing is what we're calling outcome previews. So right now, when you say, hey, show me all these results. We send everything up to the cloud. It comes back in a couple hours. It may or may not solve the problem that you wanted, because maybe you forgot a couple of pieces of obstacle geometry. Well what we can do now is, we can actually show you where material is going to go as you're setting up the problem. So you can say, hey, that looks good. Fire off that generative solve.

    And then last but not least on that front, we're also adding a whole bunch of additional physics. Right? So we have a bunch of projects in the works to consider. Buckling-- which is, if I have a long spaghetti noodle, and I push on it really hard, it's going to snap. Right? So we're adding buckling. We're adding displacement constraints. We're adding fluids. We're looking at things like heat exchangers, maybe pressure drops. Right? So all of those kinds of goals, multi-objective solves, can be built into this platform.
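Doug's spaghetti-noodle example is classic Euler buckling: a slender column under axial compression fails sideways once the load passes a critical value, well before the material itself yields. A minimal sketch with illustrative numbers (the dimensions and material below are my own choices, not from the session):

```python
import math

def euler_buckling_load(E, I, L, K=1.0):
    """Euler critical axial load for a slender column: pi^2 * E * I / (K*L)^2.

    E: Young's modulus (Pa), I: second moment of area (m^4),
    L: length (m), K: effective-length factor (1.0 = pinned-pinned ends).
    """
    return math.pi**2 * E * I / (K * L)**2

# Illustrative: a 1 m pinned-pinned steel rod with a 5 mm radius
E = 200e9                        # Pa, Young's modulus for steel
I = math.pi * 0.005**4 / 4       # m^4, circular cross-section
P_cr = euler_buckling_load(E, I, L=1.0)
# roughly 1 kN: push harder than this and the rod buckles sideways
```

A generative solver that adds buckling as a goal has to keep candidate members above this kind of critical-load threshold, not just below a stress limit.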

    On the generative tech side right now, what you can really do is you can explore space via materials and manufacturing methods. All right. So let's say I have multiple materials, and then I also have the ability to look at unrestricted-- which is hey, I don't care how you manufacture this. Just show me cool stuff. We can also consider additive manufacturing. And then, we can finally consider three and five axis milling. All right? So you can tell the system, I'm going to mill this using a five axis machine. And we'll give you a result that can be milled using a five axis machine.

    But those obviously are not all the manufacturing processes that are available. So what are we looking at? Two and two-and-a-half axis milling, right? We should be able to produce results for that. We should be able to produce results for casting. We should be able to produce results for injection molding. All of these are feasible. Our teams are very capable of doing this, and this is what we're looking at. We're also looking at beams and trusses, so weldments on the manufacturing side.

    So there's a lot more in here that I want to cover, too. So the big one I think that a lot of us are really interested in, and it's a hard problem to solve, is costing. So that is comparing designs across multiple manufacturing methods with multiple materials. And having cost is one of those filters, or defining a cost barrier when you set up a problem. This is a big, big deal when you're trying to select an outcome to take forward in the process. All right.
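Costing as a filter across materials and manufacturing methods, the way Doug frames it, could be sketched as below. Every number and category here is made up for illustration; this is not a real Autodesk cost model:

```python
# Hypothetical per-unit cost: material cost plus a method-dependent process cost
MATERIAL_COST_PER_KG = {"aluminum": 4.0, "nylon": 8.0}
METHOD_COST = {"additive": 30.0, "3-axis milling": 12.0, "5-axis milling": 20.0}

def unit_cost(mass_kg, material, method):
    return mass_kg * MATERIAL_COST_PER_KG[material] + METHOD_COST[method]

def under_budget(candidates, budget):
    """Keep (mass, material, method) candidates whose unit cost fits the budget."""
    costed = [(unit_cost(m, mat, meth), m, mat, meth)
              for (m, mat, meth) in candidates]
    return sorted(c for c in costed if c[0] <= budget)

candidates = [
    (1.0, "aluminum", "3-axis milling"),  # 4.0 + 12.0 = 16.0
    (0.6, "nylon", "additive"),           # 4.8 + 30.0 = 34.8 (over budget)
    (0.9, "aluminum", "5-axis milling"),  # 3.6 + 20.0 = 23.6
]
affordable = under_budget(candidates, budget=25.0)
# cheapest viable combination first: the aluminum, 3-axis milled option
```

Defining a cost ceiling upfront, as Doug suggests, is exactly this kind of filter applied before (or alongside) the structural ranking of outcomes.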

    So I'll leave it there. Morgan will cover artificial intelligence and machine learning. And then on the geometry editing side and manufacturing side, obviously we're going to continue to build out those workspaces. So as we add new physics, as we add new capabilities for manufacturing constraints, those will be pushed further downstream in the process. But you can actually create those parts and manufacture those parts.

    So I'll show you a couple examples of what we're working on. I've hit on a lot of these. But it'll be fun, nonetheless. So automatic obstacle creation-- basically, this green stuff is what you say, I need this in the design. That is connecting to something. You should not have to create all of these red pieces of material, because those aren't in your design. Right? It doesn't make any sense. So what we can do is, we can automatically create all that red material interacting with you to tell us how that goes. On the previewer-- I think I got it.

    So again, as you're setting up the problem, we're showing you where material's going to go. It's interacting with you, saying, I'm going to go in this direction. Is that OK? Once you say yes, you just hit this generate button. And off we go. So we're adding the setup guide, so the guided workflows. This is like your personal assistant telling you, hey, you've gotten to this place. Keep moving forward. Milling constraints-- so we have two-and-a-half axis. Two axis is what we're working on as well. And then, here's an ideation of some welding frame structures as well.

    So these are all proofs of concept that we've actually mocked up, integrated into the technology, and are testing internally to see if it's feasible and provides value to users. We'll take questions on that afterwards. I'll turn it over to Morgan.

    PRESENTER: Now while she's getting set up, I would just say, remember, these are very early days in generative design. So there are a lot of things that we'll continue to do, especially as we gather feedback from [INAUDIBLE] for the bits of the workflow that you're finding frustrating. As you continue to give us that feedback, we can continue to build a better product. [INAUDIBLE] More static, sorry.

    [INAUDIBLE]

    MORGAN FABIAN: The thing that we're most focused on [INAUDIBLE] in the machine. And the reason we think this is really where generative design is going and where the future of Fusion is going is because of two things. One is the amount of data that is now being produced by the Fusion platform and generative design. We've never had more information about the types of problems you're looking to solve and the number of different outcomes that there are. And the second thing is all the advances that are happening in machine learning. And so when we think about when the software can learn about you, and your organization, and understand your needs, it can be a much better and more effective partner, and assistant, and what have you.

    When we're looking at moving from a design tool to a design partner, there are two streams that we're thinking this machine learning revolution is going to affect, both in terms of what you're working on as well as how you're working. So in the area of what you're working on, the actual products that you're producing, there are a few different categories where machine learning has a really big impact. An area like recommendations-- as you give us more information about the things that you're looking at and interested in, and the things you're not looking at and aren't really interested in, the system can provide much more productive recommendations and help you parse through that data. Netflix actually does something like this really productively. They look at what you're watching and provide you recommendations based on those things, but will also show you things that might not ordinarily cross your path, that might surprise you-- different things that your peers are looking at, et cetera.

    The second category is design guidance. So as you're maybe taking a design from a generative outcome and making some edits, the system can provide you some feedback in terms of the performance of those edits. It can maybe provide you some recommendations for manufacturing and so on. And the last one-- and I'll go really deep on this one-- is style collaboration: the system actually having more of a conversation with you, getting closer to the actual style and brand that you are trying to produce in your final product.

    In terms of how you're working, we think that this partnership between the human and machine will make the whole process of designing and making much more efficient, easier, and more creative. So the first one is around design discovery. With all these different options that it's now presenting you, you need new tools to do trade-off analysis and go through this process of creating insights into the decisions that you're making and the impact they're having on your business. The second is, can we teach the system things that we don't want to do, and offload repetitive or non-value-add tasks to the machine and have it take those on? Things like automating obstacle creation, as Doug was talking about earlier. And the last one is around upskilling. So as the system sees the types of features and workflows that you're using, can it recommend things that it's seeing-- things other people in your organization are working on that you're not, or help you grow as a particular user?

    So to dive into two of those examples-- the first one is around what you're actually working on, what you're producing in the final product, and how the system is going to help you get there. So Google actually released this really interesting paper about two years ago or so called "Deep Dream." And what the network does is, it's learned how to take an image-- such as this man in the upper left-hand corner-- and make incremental changes to that image in a particular style, while recognizing important features of the image so that the integrity of the image is still intact. So you can still tell that it's a man. It looks exactly like the man in the original photo, but it has a particular style added on to it. Same thing in the hummingbird example.

    So when this paper was published, our Autodesk research team said, can we up the ante a little bit? Can we look at more complex problems and do this for 3D geometry? So we've got a project going on in our Autodesk research organization where we are taking generative design outcomes and applying a particular style to them. So this might be something that you would want to do to address a particular customer need, to apply a particular brand style, or even to target a particular manufacturing method, and kind of continue down that path that maybe has specific tools that you're really looking at.

    The second example I'm going to talk about is design discovery. So one of the biggest challenges right now with generative design is that it's just a new workflow. You've gone from maybe evaluating two, three, four design options to potentially dozens, if not hundreds, of design options. You're having to parse through all this information, and you need new tools to do this. Because when you come across one particular design that you're interested in, it's really difficult to find which ones are similar, maybe with a different material, in the pack of these things-- or even just to get a comprehension of the entire design space.

    So if we look at this in a particular workflow, say you're designing a shelf bracket like we have here-- where you have your bolts defined and all the shelves and the wall defined. And it produces over 200 different options for you. It's really overwhelming to kind of go through this process just visually. So what we've done is actually trained a network to identify important features of all of those particular designs and help categorize them for you. It really reduces the cognitive load. You can quickly filter out things you're not interested in and find things that are important to you. Of course, this lends itself really well to some of the other projects we want to do around providing recommendations-- seeing things that you want to just completely drop from your trade-off analysis, and things that we can bubble up to the top as important to you.
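    The "find which ones are similar" step can be illustrated with a toy sketch, assuming a network has already reduced each outcome to a feature vector; the design names and vectors below are invented for the example:

```python
import math

# Toy feature vectors per outcome (e.g. from a learned embedding);
# names and values are made up for illustration.
designs = {
    "A-steel":   [0.90, 0.10, 0.30],
    "A-alu":     [0.88, 0.12, 0.28],   # same shape family, different material
    "B-lattice": [0.10, 0.95, 0.60],
    "C-truss":   [0.20, 0.30, 0.90],
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def most_similar(name, designs, k=1):
    """Rank the other designs by similarity to `name`, best first."""
    ref = designs[name]
    others = [(other, cosine(ref, v)) for other, v in designs.items() if other != name]
    return sorted(others, key=lambda t: -t[1])[:k]

print(most_similar("A-steel", designs))
```

    Given a design you like, ranking the rest by similarity is what surfaces "the same shape in a different material" out of a pack of 200 outcomes.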

    And that's it for AI. Vik?

    VIK VEDANTHAM: All right. Well, thank you so much, Morgan. Appreciate it. So I'm going to close this session out. And then, we'll have time for plenty of questions. I'm hoping there's some ideas and thoughts already in the crowd.

    So a couple of things, right? First off, our goal at this session was really to educate you more about what our broader vision and thinking is on what generative can do-- what this paradigm shift is about in terms of how these disruptive trends can be met by changing the product development process itself. You got a glimpse into one of our customers. So thanks, Paul. Thanks, Caleb, for making a special appearance and sharing your thoughts and comments about the technology. We also got a chance to deep dive a little bit into the futures-- the more near-term things that we're looking at, as well as, like Morgan summarized, some of the more far-field items that we're also researching at the moment.

    So in closing, I want to give you a couple of topics to think about. And kind of just to reimagine, right? So the concept that we're trying to introduce is really one of design exploration-- the ability to go out there and search for multiple results, multiple outcomes, to empower engineering and design teams with the ability to crawl that entire design space for solutions, and to do that very rapidly, because of the power of parallel computing. Right? And this is a classic merger or convergence of technology meeting the ideation or creativity that we're trying to inspire or draw inspiration from. So think about what this could mean for you, in terms of being able to explore. Because exploration translates into innovation. And innovation means competitive advantage. So there's a lot of gains when you think about it from that aspect.

    The other aspect of this conversation is one of productivity, right? This is also about how you can shorten that time to market. So that you can react faster, you can get more products out to market, and you can actually help more customers get to more solutions. So I'd encourage you to think about these two topics. And please do reach out to any of us out here that have been thinking and dreaming about the possibilities for quite a few years. We're more than happy to get into conversations on these topics.

    The other piece that I think everyone touched on-- Doug touched on this. Mike spoke about it, as well as Caleb did. It's the topic of how to design differently when you think about generative. I just want to bring that back home one more time. I'll give you an analogy to think about. So say you had to write an essay about, let's say, a couple of people falling in love. Right? So right now, you're the source of the idea. You're writing the entire narrative. You're writing the story. And you're using Microsoft Word as a documentation tool that's capturing that story. So that's exactly what we're doing in terms of product development, if you think about it, right? What you're doing is, you're the source of the idea. You're actually solving the design problem in your minds. And then, you're using a CAD tool, or a CAM, or a PLM solution as a documentation tool so that you can then realize that in production.

    We are at the cusp of changing that fundamental concept, right? We are at a paradigm shift. And we don't usually talk about paradigm shifts very lightly. But this is indeed true. What we're saying is, rather than actually writing out that essay, what you are describing to the computer at this point is really that you want two people to fall in love. And you want that to happen maybe in Japan. You're kind of describing those constraints. And you're telling the computer to come up with all of these varieties of stories. And you're saying, oh, by the way-- keep it to 500 words as well. Right? So you're imposing some constraints on it. And it's coming up with all of these stories. What's that, Brian? See? There you go. It's now public.

    But that's the idea, right? And that's what we're trying to do-- describe the design space and describe the constraints that it's dealing with. And let the computer go out there and explore and present all of those outcomes, so that the engineering teams are empowered with data. I mean, just like Scott Borduin put it at the meeting this morning. He said, data is a positive-sum game. You know, as you get more data, you get more insights. Because then, you can improve your products or make them better. And that's exactly what we're talking about. We're giving you more data so that you can then do the trade-off analysis on multiple vectors, and decide which outcomes, or outcome, make the most sense to then progress toward the future. [INAUDIBLE]

    PRESENTER: Let's hit this real quick too, though. That last slide.

    VIK VEDANTHAM: Two more slides. And then, we'll stop right there. So a couple of things. We're all out here all through this week. We'll have plenty of opportunities to engage and connect. You know, we're more than happy to hang out here and discuss with you. But there's a couple of things. One of them is, we do have multiple ways by which you can sit down with any number of us and actually engage with us. Please do. Come to our feedback workshops, as well as the Idea Exchange-- there's personnel out there focused on generative design that you can ideate with. You can sit down with them and maybe express your design problem or your design challenges. And that gives us the ability to take that, and then digest it as we get into our internal strategy meetings. So that's an excellent way for you to influence where we are going with this direction.

    The other piece I would also call out is there are plenty of workshops, as well, by the way. So there are hands-on workshops happening through the course of the next couple of days that you can participate in. There are, perhaps, too many generative design sessions to count, actually. I was just telling Paul that I think 14% of the classes this year are all about generative. But that's the spirit of it. This is an emerging trend, because the problem is real. Customers are facing challenges. We have had so many conversations that have helped guide our direction, per se. So please engage with us. Let's make use of this opportunity to kind of collaborate and make the technology better. All right?

    With that, we'll close off on this session. We're open for questions now. You've got the entire product management team, the strategy team, customer [INAUDIBLE] you may have. All right. We're starting off with a question on this slide.

    AUDIENCE: Thank you very much. First of all, I would like to ask you a question. I don't know if it's possible to use generative design with some traditional construction materials, like concrete. And if not, when do you think it's going to be possible?

    VIK VEDANTHAM: So let me make sure I got the question. So you're asking about the exploration of materials in construction, like concrete you said?

    AUDIENCE: That's right. Yes. In construction, the most common material that we use is concrete.

    VIK VEDANTHAM: Right.

    AUDIENCE: So for example, if you tried to use generative design in construction, would the possibility of using generative design with this material be available? I don't know if it's available or not. And if not, when is it going to be available?

    VIK VEDANTHAM: Got it. OK. Doug, do you want to--

    DOUG KENIK: So there actually is a class later in the week that talks about using concrete in generative design of a structure. So it's feasible. We used to have concrete in there, but we took it out for some reason. So it's a market that we'll probably survey. I would say that probably falls more on the AEC side of things. And that group will likely investigate that. But for 3D printing of concrete structures, it's completely feasible. Right?

    VIK VEDANTHAM: It's an interesting point. Actually, one of our customers that we found recently is trying to use this generative design technique-- while we were aiming at the product design market with this, they were actually using it for infrastructure development, like bridge development. Right? So it's amazing to see the creative applications of technology like this, to go out there and explore various design or construction applications. Yes?

    AUDIENCE: What's the selection it's trying to do for any different designs? Like what kind of product-- is it completely random? Or is it like [INAUDIBLE] What's the selection product you [INAUDIBLE]

    VIK VEDANTHAM: All right. So the question is, what is the underlying algorithm or selection strategy for the various designs? Is it based on any sort of genetics or biologically inspired solution? So I'm going to point back to you guys for this.

    DOUG KENIK: That's confidential. But it's true.

    [INAUDIBLE]

    Some of the other aspects of [INAUDIBLE]. So all those things kind of play into how this shape is synthesized. It's all based on different optimization techniques and strategies.

    VIK VEDANTHAM: And one of the things we have done a pretty decent job of, over the years, is we have actually spoken about what was called Project Dreamcatcher. And Project Dreamcatcher was the inspiration that helped us move in this direction. So there's actually a good amount of info on that topic as well. As Doug said, some of it we can't really openly talk about. But we're happy to get into a one-on-one and collaborate further as well.

    DOUG KENIK: Those things will continue to evolve. So it's really hard to discuss it in the context of like, what are you doing now? Because at some point, it's going to be doing whatever it wants to do. So me telling you what it's doing wouldn't even be feasible.

    VIK VEDANTHAM: Got it. Thank you. Yes, Joe? We've got a question.

    AUDIENCE: My question is, you showed us your design in Fusion 360. Can you comment about generative design in other Autodesk products?

    VIK VEDANTHAM: Other Autodesk products-- most of them come with Fusion 360. So there's your answer. That's the quick answer, right? The long answer is, we've been thinking quite a bit about that.

    BRIAN FRANK: I think it's all, right now, predicated on the ecosystem we want to build, right? So we generate a lot of data. We want to do that in the cloud. There's a lot of downstream benefits that customers will get doing it that way. So we've chosen Fusion 360 as kind of our vehicle to expose generative design to the market. There's a lot of advantage in doing that. One, kind of ease of use and setup. All the different capabilities that you have on the front end of the process, and then everything that you get on the back end of the process, once you start selecting outcomes. Right?

    So when you find one, or two, or four of those designs and you want to do something with them-- you bring them back into Fusion. Now, you have the ability to edit the geometry. Some of our other tools don't have that capability. You have the ability to go from the beginning into a manufacturing workspace, a CAM workspace, an additive workspace in Fusion. So it's about the whole ecosystem, with data traveling through the pipeline. And so that's one of the reasons we focused on Fusion as a vehicle for those generative designs.

    VIK VEDANTHAM: The other big advantage, of course, is the cloud as the platform. Fusion lends itself as the most homogeneous or easy way to describe all of this. It's just massive cloud computing going on behind the scenes to be able to spin up as many solutions as it does. And so, it just naturally lends itself as a great way for us to expose that to customers. And so we'll continue to build on that strategy, per se. Yeah? All right. We've got plenty of time for a few more questions, actually. We've got about 10 minutes. So let's go for a few more. Yes? Question here.

    AUDIENCE: The example you mentioned regarding using Word to document a story is a very interesting analogy. The issue is with aesthetics-- so, many of the designs that have actually been refined with the generative system have an aesthetic dimension. And even the industrial designers take this into consideration. How would such a system make a decision regarding the aesthetics of the results?

    VIK VEDANTHAM: Right. Just to be super clear, are you talking about currently? Or how do you expect that it will take into account aesthetics in the future? Just to be clear.

    AUDIENCE: I'm not sure if there is a system in place right now that can evaluate such a thing, or if it's just an open question for the future.

    VIK VEDANTHAM: So I'll begin that conversation. It's a very interesting question. It's the notion of, can it take into account aesthetics during the design process itself? Because as an industrial designer, perhaps you are thinking about not just form, fit, and function. But it's also the ergonomics, the human factors aspect of our design, per se. Right? And I think Caleb also touched on it a little bit.

    But my initial thoughts are that right now, the way you would go about using it is very much a manual process. Right? So when it gives you all your outcomes, all your designs, one of the things that we didn't mention is you can actually see the time history, or the evolution, of each outcome. So essentially, you can hit the rewind button. And you can see how it actually thought about arriving at that final converged result. What we're noticing is that there are customers that are actually rewinding it to a previous instance. Because they're, perhaps, looking at it and saying, the converged outcome is maybe not perfect for what I need. But I actually notice that a few iterations earlier is actually more aesthetically or organically appealing to me. And the beauty of the system is, you can actually decide to take that design forward. You're not locked to the final outcome, which is another key differentiator which I didn't get a chance to talk about. But you do have the ability to get in there and use that aesthetics aspect. It is a manual process right now. But I think we are thinking about some ideas, right? So Morgan, I don't know if you've got some further thoughts to add on that.

    MORGAN FABIAN: Yeah. I mean, on the project that I talked about in the research, I think that we're looking at both ways to do aesthetic interpretation after the fact. So once you have an outcome, giving the system an aesthetic that you prefer, and then having it do some sort of interpolation between the two. But also, I think there are ideas around doing it as a constraint in the first place. So, giving the system-- before it even generates the shapes-- some sense of the aesthetics that you prefer. But it's in research.

    VIK VEDANTHAM: Thanks, Morgan. Appreciate it. All right. Next question? Did that help? Did that answer your question?

    AUDIENCE: I think it did. [INAUDIBLE]

    VIK VEDANTHAM: Yes. Yeah. But I'm glad you liked my story, though. All right. Yes? Question?

    AUDIENCE: Do you [INAUDIBLE] mind or tools that you guys [INAUDIBLE] and [INAUDIBLE]

    VIK VEDANTHAM: Great question. So I'll repeat the question. The fact that we're creating this paradigm shift is great. But the question is, how do you [INAUDIBLE]-- describing those constraints, like the loading conditions and the primitive geometry that is required to describe the design space, can be a bit of a challenge. So the question is, are we thinking about anything on that front? Is that a fair way to put it? All right. Doug or Mike, why don't you guys take this question?

    MIKE SMELL: Yeah. So I think we're trying to tackle this problem from a couple of different vectors. So the first-- Doug talked about our focus on automatic geometry, obstacle, and preserve creation. So that's one vector. The other is, how do we walk users through the process of understanding that they even have to define loads and constraints? Because again, it is a bit of a challenge. So we're going to approach that on the vector of that guided workflow. So we're going to try to keep people on the rails as much as possible to get them through the recipe to be successful.

    The other thing that we've looked at quite a bit, and I think we've got a path towards, is that most of what you've seen in the demonstration that I showed with the video-- a lot of our marketing materials-- is all really at the component level. And sometimes, it's difficult when you think about, hey, we make excavators, and we know that there's so much load in the bucket, but we're looking to design some bracket in the middle of that system. It's pretty difficult to figure out all the reaction forces of that system and translate them down to that component. So we are looking at trying to be a little bit more open about how we support assemblies in the context of the problem definition. So rather than having to say, well, I've got to calculate all these complex reactions at this sub-component in the system, can we come back to a larger system that we actually know? And then, pick that component out as the thing we're going to design against. So there are three vectors, really, in how we're trying to help reduce the burden on users to define the problem space.
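    The reaction-force bookkeeping Mike describes shows up even in the simplest statics case: for a simply supported beam of span L with a point load P at distance a from the left support, moment and force balance give R_right = Pa/L and R_left = P - R_right. A minimal sketch with made-up numbers:

```python
def simply_supported_reactions(P, a, L):
    """Support reactions for a point load P at distance a along a span L.

    Sum of moments about the left support: R_right * L = P * a.
    Sum of vertical forces: R_left + R_right = P.
    """
    R_right = P * a / L
    R_left = P - R_right
    return R_left, R_right

# 10 kN load one quarter of the way along a 2 m span (assumed numbers).
R_left, R_right = simply_supported_reactions(P=10e3, a=0.5, L=2.0)
print(R_left, R_right)   # 7500.0 2500.0
```

    Doing this by hand for a full excavator arm rather than one beam is exactly the burden that solving in assembly context would take off the user.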

    DOUG KENIK: And I think we can always infer some constraints or some of the setup conditions that a user intends. But on the other hand, a lot of the folks that we've talked to in the feedback that we've gotten-- they actually like the fact that they have to sit down and really think about the problem that they're trying to describe. It actually forces them to kind of focus on what is the problem that they're trying to solve, be able to input that into the system in an appropriate manner, so we can get a good response out.

    To what Vik was saying earlier, that's really the shift. I mean, every user that-- I think-- first tries generative design has to go through that change in perception. Because everything that Autodesk and all the competitors have done to date really have been around documenting a solution that you kind of have in your mind's eye. Right? You kind of think you know something's going to work. You kind of start to document it, model it. You can validate it, things like that. But really trying to describe the problem you're trying to solve is a completely different animal. And I think there's an intuitive leap that people-- once they make that-- then they are able to take advantage of the tool quite effectively to do that. Does that make sense?

    [INAUDIBLE]

    Right.

    [INAUDIBLE]

    Yeah. So I mean, you have the ability to validate these outcomes inside of Fusion as well. You have the ability to use Fusion to help set up the problem and understand what your constraints or your design space need to be. I mean, there's a lot of advantage there. There is a certain base level of knowledge that we expect a user to come to the system with. I understand the problem I'm trying to solve. I understand the loads that it's going to have to endure. I understand [INAUDIBLE] go through or not go through. I mean, those are things that just from an engineer [INTERPOSING VOICES] stakes. Now I think to your point, we can continue to make it easier to find out what those things need to be. But then, as these guys were saying before, [INAUDIBLE] just like [INAUDIBLE]

    BRIAN FRANK: --expertise of an engineer, an analyst, a manufacturing engineer-- eventually, a costing professional. All those things into one system that gives you that feedback [INAUDIBLE]

    VIK VEDANTHAM: Great. Well thanks for that, Brian. So we're going to wrap up this discussion. As I said, this doesn't have to stop here. We're all out here to answer questions and engage with you all. Come and talk to us, because that's how we get to know more about what you're trying to do and how we can help further, right? With that, we'll close the session. Thanks for joining us, and we'll be in touch.

    [APPLAUSE]

    ClickTale
    오토데스크는 고객이 사이트에서 겪을 수 있는 어려움을 더 잘 파악하기 위해 ClickTale을 이용합니다. 페이지의 모든 요소를 포함해 고객이 오토데스크 사이트와 상호 작용하는 방식을 이해하기 위해 세션 녹화를 사용합니다. 개인적으로 식별 가능한 정보는 가려지며 수집되지 않습니다. ClickTale 개인정보취급방침
    OneSignal
    오토데스크는 OneSignal가 지원하는 사이트에 디지털 광고를 배포하기 위해 OneSignal를 이용합니다. 광고는 OneSignal 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 OneSignal에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 OneSignal에 제공하는 데이터를 사용합니다. OneSignal 개인정보취급방침
    Optimizely
    오토데스크는 사이트의 새 기능을 테스트하고 이러한 기능의 고객 경험을 사용자화하기 위해 Optimizely을 이용합니다. 이를 위해, 고객이 사이트를 방문해 있는 동안 행동 데이터를 수집합니다. 이 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 오토데스크 ID 등이 포함될 수 있습니다. 고객은 기능 테스트를 바탕으로 여러 버전의 오토데스크 사이트를 경험하거나 방문자 특성을 바탕으로 개인화된 컨텐츠를 보게 될 수 있습니다. Optimizely 개인정보취급방침
    Amplitude
    오토데스크는 사이트의 새 기능을 테스트하고 이러한 기능의 고객 경험을 사용자화하기 위해 Amplitude을 이용합니다. 이를 위해, 고객이 사이트를 방문해 있는 동안 행동 데이터를 수집합니다. 이 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 오토데스크 ID 등이 포함될 수 있습니다. 고객은 기능 테스트를 바탕으로 여러 버전의 오토데스크 사이트를 경험하거나 방문자 특성을 바탕으로 개인화된 컨텐츠를 보게 될 수 있습니다. Amplitude 개인정보취급방침
    Snowplow
    오토데스크 사이트에서 고객의 행동에 관한 데이터를 수집하기 위해 Snowplow를 이용합니다. 여기에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID 및 오토데스크 ID가 포함될 수 있습니다. 오토데스크는 사이트 성과를 측정하고 고객의 온라인 경험의 편리함을 평가하여 기능을 개선하기 위해 이러한 데이터를 이용합니다. 또한, 이메일, 고객 지원 및 판매와 관련된 고객 경험을 최적화하기 위해 고급 분석 방법도 사용하고 있습니다. Snowplow 개인정보취급방침
    UserVoice
    오토데스크 사이트에서 고객의 행동에 관한 데이터를 수집하기 위해 UserVoice를 이용합니다. 여기에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID 및 오토데스크 ID가 포함될 수 있습니다. 오토데스크는 사이트 성과를 측정하고 고객의 온라인 경험의 편리함을 평가하여 기능을 개선하기 위해 이러한 데이터를 이용합니다. 또한, 이메일, 고객 지원 및 판매와 관련된 고객 경험을 최적화하기 위해 고급 분석 방법도 사용하고 있습니다. UserVoice 개인정보취급방침
    Clearbit
    Clearbit를 사용하면 실시간 데이터 보강 기능을 통해 고객에게 개인화되고 관련 있는 환경을 제공할 수 있습니다. Autodesk가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. Clearbit 개인정보취급방침
    YouTube
    YouTube는 사용자가 웹 사이트에 포함된 비디오를 보고 공유할 수 있도록 해주는 비디오 공유 플랫폼입니다. YouTube는 비디오 성능에 대한 시청 지표를 제공합니다. YouTube 개인정보보호 정책

    icon-svg-hide-thick

    icon-svg-show-thick

    광고 수신 설정 – 사용자에게 타겟팅된 광고를 제공할 수 있게 해 줌

    Adobe Analytics
    오토데스크 사이트에서 고객의 행동에 관한 데이터를 수집하기 위해 Adobe Analytics를 이용합니다. 여기에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID 및 오토데스크 ID가 포함될 수 있습니다. 오토데스크는 사이트 성과를 측정하고 고객의 온라인 경험의 편리함을 평가하여 기능을 개선하기 위해 이러한 데이터를 이용합니다. 또한, 이메일, 고객 지원 및 판매와 관련된 고객 경험을 최적화하기 위해 고급 분석 방법도 사용하고 있습니다. Adobe Analytics 개인정보취급방침
    Google Analytics (Web Analytics)
    오토데스크 사이트에서 고객의 행동에 관한 데이터를 수집하기 위해 Google Analytics (Web Analytics)를 이용합니다. 여기에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 오토데스크는 사이트 성과를 측정하고 고객의 온라인 경험의 편리함을 평가하여 기능을 개선하기 위해 이러한 데이터를 이용합니다. 또한, 이메일, 고객 지원 및 판매와 관련된 고객 경험을 최적화하기 위해 고급 분석 방법도 사용하고 있습니다. AdWords
    Marketo
    오토데스크는 고객에게 더욱 시의적절하며 관련 있는 이메일 컨텐츠를 제공하기 위해 Marketo를 이용합니다. 이를 위해, 고객의 온라인 행동 및 오토데스크에서 전송하는 이메일과의 상호 작용에 관한 데이터를 수집합니다. 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 이메일 확인율, 클릭한 링크 등이 포함될 수 있습니다. 오토데스크는 이 데이터를 다른 소스에서 수집된 데이터와 결합하여 고객의 판매 또는 고객 서비스 경험을 개선하며, 고급 분석 처리에 기초하여 보다 관련 있는 컨텐츠를 제공합니다. Marketo 개인정보취급방침
    Doubleclick
    오토데스크는 Doubleclick가 지원하는 사이트에 디지털 광고를 배포하기 위해 Doubleclick를 이용합니다. 광고는 Doubleclick 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Doubleclick에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Doubleclick에 제공하는 데이터를 사용합니다. Doubleclick 개인정보취급방침
    HubSpot
    오토데스크는 고객에게 더욱 시의적절하며 관련 있는 이메일 컨텐츠를 제공하기 위해 HubSpot을 이용합니다. 이를 위해, 고객의 온라인 행동 및 오토데스크에서 전송하는 이메일과의 상호 작용에 관한 데이터를 수집합니다. 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 이메일 확인율, 클릭한 링크 등이 포함될 수 있습니다. HubSpot 개인정보취급방침
    Twitter
    오토데스크는 Twitter가 지원하는 사이트에 디지털 광고를 배포하기 위해 Twitter를 이용합니다. 광고는 Twitter 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Twitter에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Twitter에 제공하는 데이터를 사용합니다. Twitter 개인정보취급방침
    Facebook
    오토데스크는 Facebook가 지원하는 사이트에 디지털 광고를 배포하기 위해 Facebook를 이용합니다. 광고는 Facebook 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Facebook에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Facebook에 제공하는 데이터를 사용합니다. Facebook 개인정보취급방침
    LinkedIn
    오토데스크는 LinkedIn가 지원하는 사이트에 디지털 광고를 배포하기 위해 LinkedIn를 이용합니다. 광고는 LinkedIn 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 LinkedIn에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 LinkedIn에 제공하는 데이터를 사용합니다. LinkedIn 개인정보취급방침
    Yahoo! Japan
    오토데스크는 Yahoo! Japan가 지원하는 사이트에 디지털 광고를 배포하기 위해 Yahoo! Japan를 이용합니다. 광고는 Yahoo! Japan 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Yahoo! Japan에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Yahoo! Japan에 제공하는 데이터를 사용합니다. Yahoo! Japan 개인정보취급방침
    Naver
    오토데스크는 Naver가 지원하는 사이트에 디지털 광고를 배포하기 위해 Naver를 이용합니다. 광고는 Naver 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Naver에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Naver에 제공하는 데이터를 사용합니다. Naver 개인정보취급방침
    Quantcast
    오토데스크는 Quantcast가 지원하는 사이트에 디지털 광고를 배포하기 위해 Quantcast를 이용합니다. 광고는 Quantcast 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Quantcast에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Quantcast에 제공하는 데이터를 사용합니다. Quantcast 개인정보취급방침
    Call Tracking
    오토데스크는 캠페인을 위해 사용자화된 전화번호를 제공하기 위하여 Call Tracking을 이용합니다. 그렇게 하면 고객이 오토데스크 담당자에게 더욱 빠르게 액세스할 수 있으며, 오토데스크의 성과를 더욱 정확하게 평가하는 데 도움이 됩니다. 제공된 전화번호를 기준으로 사이트에서 고객 행동에 관한 데이터를 수집할 수도 있습니다. Call Tracking 개인정보취급방침
    Wunderkind
    오토데스크는 Wunderkind가 지원하는 사이트에 디지털 광고를 배포하기 위해 Wunderkind를 이용합니다. 광고는 Wunderkind 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Wunderkind에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Wunderkind에 제공하는 데이터를 사용합니다. Wunderkind 개인정보취급방침
    ADC Media
    오토데스크는 ADC Media가 지원하는 사이트에 디지털 광고를 배포하기 위해 ADC Media를 이용합니다. 광고는 ADC Media 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 ADC Media에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 ADC Media에 제공하는 데이터를 사용합니다. ADC Media 개인정보취급방침
    AgrantSEM
    오토데스크는 AgrantSEM가 지원하는 사이트에 디지털 광고를 배포하기 위해 AgrantSEM를 이용합니다. 광고는 AgrantSEM 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 AgrantSEM에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 AgrantSEM에 제공하는 데이터를 사용합니다. AgrantSEM 개인정보취급방침
    Bidtellect
    오토데스크는 Bidtellect가 지원하는 사이트에 디지털 광고를 배포하기 위해 Bidtellect를 이용합니다. 광고는 Bidtellect 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Bidtellect에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Bidtellect에 제공하는 데이터를 사용합니다. Bidtellect 개인정보취급방침
    Bing
    오토데스크는 Bing가 지원하는 사이트에 디지털 광고를 배포하기 위해 Bing를 이용합니다. 광고는 Bing 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Bing에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Bing에 제공하는 데이터를 사용합니다. Bing 개인정보취급방침
    G2Crowd
    오토데스크는 G2Crowd가 지원하는 사이트에 디지털 광고를 배포하기 위해 G2Crowd를 이용합니다. 광고는 G2Crowd 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 G2Crowd에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 G2Crowd에 제공하는 데이터를 사용합니다. G2Crowd 개인정보취급방침
    NMPI Display
    오토데스크는 NMPI Display가 지원하는 사이트에 디지털 광고를 배포하기 위해 NMPI Display를 이용합니다. 광고는 NMPI Display 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 NMPI Display에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 NMPI Display에 제공하는 데이터를 사용합니다. NMPI Display 개인정보취급방침
    VK
    오토데스크는 VK가 지원하는 사이트에 디지털 광고를 배포하기 위해 VK를 이용합니다. 광고는 VK 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 VK에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 VK에 제공하는 데이터를 사용합니다. VK 개인정보취급방침
    Adobe Target
    오토데스크는 사이트의 새 기능을 테스트하고 이러한 기능의 고객 경험을 사용자화하기 위해 Adobe Target을 이용합니다. 이를 위해, 고객이 사이트를 방문해 있는 동안 행동 데이터를 수집합니다. 이 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역, IP 주소 또는 장치 ID, 오토데스크 ID 등이 포함될 수 있습니다. 고객은 기능 테스트를 바탕으로 여러 버전의 오토데스크 사이트를 경험하거나 방문자 특성을 바탕으로 개인화된 컨텐츠를 보게 될 수 있습니다. Adobe Target 개인정보취급방침
    Google Analytics (Advertising)
    오토데스크는 Google Analytics (Advertising)가 지원하는 사이트에 디지털 광고를 배포하기 위해 Google Analytics (Advertising)를 이용합니다. 광고는 Google Analytics (Advertising) 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Google Analytics (Advertising)에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Google Analytics (Advertising)에 제공하는 데이터를 사용합니다. Google Analytics (Advertising) 개인정보취급방침
    Trendkite
    오토데스크는 Trendkite가 지원하는 사이트에 디지털 광고를 배포하기 위해 Trendkite를 이용합니다. 광고는 Trendkite 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Trendkite에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Trendkite에 제공하는 데이터를 사용합니다. Trendkite 개인정보취급방침
    Hotjar
    오토데스크는 Hotjar가 지원하는 사이트에 디지털 광고를 배포하기 위해 Hotjar를 이용합니다. 광고는 Hotjar 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Hotjar에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Hotjar에 제공하는 데이터를 사용합니다. Hotjar 개인정보취급방침
    6 Sense
    오토데스크는 6 Sense가 지원하는 사이트에 디지털 광고를 배포하기 위해 6 Sense를 이용합니다. 광고는 6 Sense 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 6 Sense에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 6 Sense에 제공하는 데이터를 사용합니다. 6 Sense 개인정보취급방침
    Terminus
    오토데스크는 Terminus가 지원하는 사이트에 디지털 광고를 배포하기 위해 Terminus를 이용합니다. 광고는 Terminus 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 Terminus에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 Terminus에 제공하는 데이터를 사용합니다. Terminus 개인정보취급방침
    StackAdapt
    오토데스크는 StackAdapt가 지원하는 사이트에 디지털 광고를 배포하기 위해 StackAdapt를 이용합니다. 광고는 StackAdapt 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 StackAdapt에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 StackAdapt에 제공하는 데이터를 사용합니다. StackAdapt 개인정보취급방침
    The Trade Desk
    오토데스크는 The Trade Desk가 지원하는 사이트에 디지털 광고를 배포하기 위해 The Trade Desk를 이용합니다. 광고는 The Trade Desk 데이터와 고객이 사이트를 방문하는 동안 오토데스크가 수집하는 행동 데이터 모두에 기초하여 제공됩니다. 오토데스크가 수집하는 데이터에는 고객이 방문한 페이지, 시작한 체험판, 재생한 동영상, 구매 내역 및 IP 주소 또는 장치 ID가 포함될 수 있습니다. 이 정보는 The Trade Desk에서 고객으로부터 수집한 데이터와 결합될 수 있습니다. 오토데스크는 디지털 광고 경험에 대한 사용자화를 개선하고 고객에게 더욱 관련 있는 광고를 제시하기 위해 The Trade Desk에 제공하는 데이터를 사용합니다. The Trade Desk 개인정보취급방침
    RollWorks
    We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

    정말 더 적은 온라인 경험을 원하십니까?

    오토데스크는 고객 여러분에게 좋은 경험을 드리고 싶습니다. 이전 화면의 범주에 대해 "예"를 선택하셨다면 오토데스크는 고객을 위해 고객 경험을 사용자화하고 향상된 응용프로그램을 제작하기 위해 귀하의 데이터를 수집하고 사용합니다. 언제든지 개인정보 처리방침을 방문해 설정을 변경할 수 있습니다.

    고객의 경험. 고객의 선택.

    오토데스크는 고객의 개인 정보 보호를 중요시합니다. 오토데스크에서 수집하는 정보는 오토데스크 제품 사용 방법, 고객이 관심을 가질 만한 정보, 오토데스크에서 더욱 뜻깊은 경험을 제공하기 위한 개선 사항을 이해하는 데 도움이 됩니다.

    오토데스크에서 고객님께 적합한 경험을 제공해 드리기 위해 고객님의 데이터를 수집하고 사용하도록 허용하시겠습니까?

    선택할 수 있는 옵션을 자세히 알아보려면 이 사이트의 개인 정보 설정을 관리해 사용자화된 경험으로 어떤 이점을 얻을 수 있는지 살펴보거나 오토데스크 개인정보 처리방침 정책을 확인해 보십시오.