AU Class

Building Patterns-Based Forge Integrations Using MuleSoft


Description

This class will provide an introduction to building integrations with the Forge platform using standard integration patterns in MuleSoft. MuleSoft is a lightweight, event-driven Enterprise Service Bus that provides a robust encapsulation of the core integration patterns described in the popular book Enterprise Integration Patterns by Hohpe and Woolf. We will demonstrate a custom Forge Anypoint connector built on the MuleSoft integration platform that allows easy access to the capabilities provided by the Forge platform. Using this connector, Forge functions can be accessed directly from the business process flow editor in MuleSoft. We will go over how businesses can leverage this connector to build repeatable integration solutions using Forge and other enterprise applications. The class will provide a technical demo of a business process orchestration connecting a Force.com application and NetSuite with Forge. By using the Mule ESB for Forge integrations, enterprises can benefit from standard canonical models to interface with multiple systems as well as leverage common enterprise services for monitoring and security. (Joint AU/Forge DevCon class.)

Key Learnings

  • Learn how to use Forge API from an enterprise integration platform
  • Learn how to use integration best practices for connecting enterprise systems to the Forge platform
  • Learn how to use common services and canonical data models when interfacing with Forge
  • Learn how to develop a custom connector for Forge in MuleSoft

Speaker

  • Ravi Dharmalingam
    Seasoned software professional with experience in integration consulting and cloud-based operations. Over 20 years of experience in all stages of enterprise software development and deployment in a wide range of industries. He is an experienced integration consultant, having helped customers successfully integrate enterprise applications across various industries. He has implemented legacy Enterprise Service Bus-based integration solutions as well as modern cloud-based systems and is proficient with integration standards such as REST, SOAP, and OData. He is focused on architecting and implementing patterns-based solutions to integrate enterprise applications to help drive adoption and enhance overall value for customers.
      Transcript

      PRESENTER: So we'll have somebody create an item in Salesforce. And then, let's say, a designer checks in a model into a file folder, the MuleSoft flow essentially translates that model and puts the link into Salesforce. So from Salesforce, you're able to correlate the model and view the model using a large-model viewer directly in Salesforce.

      And then the second demo is with creating a project in Salesforce. It, basically, pushes the project into BIM 360, again, using a Mule flow. And we'll highlight some of the patterns that we're using when you're building these flows and how easy it is for you to change stuff with this kind of architecture.

      And then, we'll get into the actual code, how you can actually build the connector. And we can share the stuff we've done so far. And if you want to build your own connector, it is fairly easy. It's just a wrapper that you have to write on top of the API kit that's there in GitHub already. And then, we'll look at the runtime environment and then wrap it up.

      All right, so integration architectures. So some of the key, primary architectures that you see for integrations right now are like a file transfer or a shared database. So those are still quite widely used. But the problem with that approach is it doesn't scale.

      And it usually is very tightly coupled. So if you have to make any changes, it locks you in into one architecture. You have to do quite a bit of rework, if you're trying to make changes to the system.

      The next one is point-to-point, where you can build a custom map in any programming language and, basically, use that to integrate. And again, this creates a tightly coupled model, which again, can work in some situations. But it can often lead to maintenance issues in the long run.

      So with that said, the messaging architecture is the most common architecture that's used across enterprises when you're talking about integrating a large number of applications, managing a large number of integrations. So this kind of model essentially gives you the resiliency and scalability that you need when you work with large-scale integrations. So the Integration Bus, essentially, this is a core concept-- there are probably 50 or 100 middleware products in the market that support this architecture.

      Essentially, the core concept is, you have a group of applications that can work together in a decoupled manner. So changes in one app don't affect the others. So you can basically easily add additional components to the bus, and all the other components don't get affected.

      So how do we use MuleSoft? So MuleSoft is basically a lightweight ESB. I used to work, in earlier days, on more heavy-duty stuff like WebSphere and TIBCO. MuleSoft kind of peels back the layers, and it's a much simpler product. It's called a lightweight ESB, which still supports the messaging architecture but is a lot simpler than some of the heavy-duty ESBs that were around 10 years back.

      So what I've done here is, basically, built a Forge connector that can, basically, tie Forge into the message bus so that you can, basically, leverage Forge across your enterprise. So if you look at the connector ecosystem for MuleSoft, so they have connectors for pretty much any leading enterprise system that you can think of. And so once we get a Forge connector into a platform like MuleSoft, it's fairly easy for us to integrate with any of the applications that are available in its ecosystem.

      So let's briefly talk about patterns. So one of the core things about Mule that kind of makes it simple is that they kind of adopted, almost religiously, the patterns that were described in this book. So this book came out 10-plus years back, maybe even longer.

      But this is still considered one of the seminal works in integrations. And a lot of the patterns here, you see them all around. And MuleSoft took an approach where their component names essentially follow the naming conventions used in these patterns.

      So patterns are nothing more than reusable integration or design solutions that you can either use independently or with other patterns to solve integration problems. So when you start looking at an integration problem, you can kind of break it down into different patterns. OK, this is an Aggregator, or this is a Splitter. And then you proceed in that manner. And the way the MuleSoft components are structured essentially facilitates using a pattern-based approach for integrations.

      As I mentioned, many of the components in MuleSoft, essentially, use the same names that you find in the enterprise pattern. So once you study enterprise patterns, it's almost fairly easy to learn MuleSoft. So it's kind of like a common language that they adopted for integrations.

      All right, so next we'll talk about the Forge connector. So the MuleSoft environment essentially comprises a development environment and a runtime environment. The development environment is an Eclipse-based studio, with a wrapper built on top of Eclipse that allows you to build visual flows.

      And then from there, you can either deploy it to CloudHub, which is basically their cloud-hosted integration environment, or you can host it on premise. They have an on-premise solution as well if you want to run your integration on premise. And there is even a community edition that can be run on premise, which is open source and free, which is a nice thing. So if you don't want to go for the expensive solution, you can use the community edition, which is free.

      So once you deploy the Forge connector into the platform, it basically shows up in the palette of Mule as a connector. And then you can just use that in any of your flows. With Forge, we basically provide a configuration option. So once you add the connector, as I'll show you in the demo, we have to specify the client ID and the client secret of the Forge application that we are going to be using.

      And then, the connector basically defines as many operations as you need. You basically reference those operations in the flow. And then the operation basically defines what the inputs and outputs are. All of this can be done kind of in a visual manner.
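The client ID and secret configured above feed Forge's two-legged (client-credentials) OAuth token request that the connector makes behind the scenes. As a hedged sketch of roughly what gets sent, here is the form-encoded request body; the endpoint URL and scope strings below follow the Forge v1 authentication docs of the time and should be treated as assumptions to verify against the current API reference.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Sketch of Forge's two-legged OAuth (client-credentials) token call.
// TOKEN_URL and the scope format are assumptions based on the v1 docs.
public class ForgeAuth {
    static final String TOKEN_URL =
        "https://developer.api.autodesk.com/authentication/v1/authenticate";

    // Form body for POST TOKEN_URL with
    // Content-Type: application/x-www-form-urlencoded
    public static String tokenRequestBody(String clientId, String clientSecret, String scope) {
        return "client_id=" + enc(clientId)
             + "&client_secret=" + enc(clientSecret)
             + "&grant_type=client_credentials"
             + "&scope=" + enc(scope);
    }

    private static String enc(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);
    }
}
```

The response is a JSON document containing the bearer token the connector then attaches to each Forge API call.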

      All right, so with that, any questions so far? OK, so with that, I'm getting to my demo. The first demo is, essentially, somebody creates an item in Salesforce. And then we have a CAD model that's updated in a file folder. The Mule workflow essentially correlates these two, uses the Forge connector, performs the translation, and then sends the translated model to Salesforce. We have a LMV viewer embedded in Salesforce that you can use to view that model.

      And then, at the same time, we send a notification in Slack that the model is ready. And the link is sent in there. So this kind of highlights a simple scenario. But I want to show that. And we can take a look at different things that you can do with the flow from there.

      So this is basically the flow. So if you look at it, this is basically polling a file directory. And then it's basically ensuring that you don't process any file more than once. And this is, again, a pattern, called Idempotent Receiver, which essentially ensures that you don't process the same message more than once. And then, you're doing a translation, calling the Forge connector to perform translation for the LMV model, and then calling Salesforce to update that LMV link and finally updating Slack.

      So let me just run the demo. And then we will take a look at the flow. And I can show you how this is structured. So in Salesforce, so let me just create a new product.

      OK, so this is the LMV model. But at this point, I don't have a viewable yet. So, let's say, I am not going to use the CAD system, at this point. I'm just going to copy an existing model.

      So it's basically matching it on the name. So basically, in a runtime environment, your integration would be running all the time. As I mentioned earlier, it would be running on CloudHub or on your hosted system. In this case, I'm just going to run it here. And, basically, the development environment has its own container to run it.

      So if you run it here, it's got a web container built in. It'll run it locally in the Eclipse environment. And it'll start polling for the file.

      All right, so it's running. So this is nothing more than a Java application that's running in a web container. So you see that it's picked up the file. And it's sending it to the translator.

      And, obviously, right now it's still polling. But it's done. And it has updated the Salesforce link. So if I go back to Salesforce now and do a refresh, you'll see that the link is there and the model has made it to Salesforce.

      So just a simple scenario. And then I think we also had a step to send a link to Slack. So you see this message in Slack that shows up. This is, basically, a simple scenario. But let's say you want to change Slack to Twitter or something else. It's really just a matter of finding the connector and adding it in there.

      So you basically find the appropriate connector for that tool, and you can just drag and drop it into the environment. And now, you basically have the option to connect with another enterprise application like Twitter. So that's basically the power of a framework like this.

      I'm calling an operation, create LMV model, which encapsulates all the stuff that needs to happen to translate a model into a lightweight, I mean, large-model viewer link. And that operation performs everything. And all I need to worry about are the inputs and outputs to that.

      I have a transformer before that where I'm passing it the bucket key, the file name, and the file path. And I have an output where I'm getting the translated stuff back from that translation, which I'm then passing to Salesforce to establish the link. Any questions on this flow so far? Go ahead.

      AUDIENCE: So basically you have to put a folder [INAUDIBLE]?

      PRESENTER: Yeah. It's [INAUDIBLE].

      AUDIENCE: So you're just kind of watching it?

      PRESENTER: Yeah, yeah.

      AUDIENCE: If something happens with this, then it gets transferred there. And when you tied it together with the file, so you made a product with a certain name [INAUDIBLE].

      PRESENTER: Yeah, yeah. You're matching. Yeah, yeah. Yeah.

      AUDIENCE: When you send it out to Forge, typically what Forge passes back after you translate it [INAUDIBLE].

      PRESENTER: It's URN.

      AUDIENCE: OK. So you got the URN back. And that's what you used [INAUDIBLE].

      PRESENTER: I sent the URN to Salesforce. If you look at Salesforce, see, this is the URN. I mean, you would typically hide this in your implementation. But that's basically what I'm passing back to Salesforce. And then it's using that. You need to authenticate in Salesforce. I'm using two-legged OAuth in Salesforce to get the token.

      AUDIENCE: Is that the [INAUDIBLE] or is that [INAUDIBLE]?

      PRESENTER: Which URN? I'm sorry.

      AUDIENCE: [INAUDIBLE]

      PRESENTER: This is the Autodesk document URN that you need for the large-model viewer. So this basically tells the LMV where to get that file.

      AUDIENCE: So the large-model viewer is built into Salesforce then?

      PRESENTER: I added it.

      AUDIENCE: Oh, you added it.

      PRESENTER: Yeah, so it's basically an iframe. And I embedded that iframe into Salesforce and had some scripting in there to get the token. It needs to authenticate, as well. So it's getting a token to use the viewer every time I'm using that.

      So again, I think the real power here is, I mean, once you have a connector on the operation, you can use it for a number of things. It's not just for a particular thing. For example, if I want to change Salesforce to NetSuite now, all I have to do is change that connector.

      All the pieces of stuff I've done up to that point are still good. All I need to do is change Salesforce connector to NetSuite. And then, it still works. Any other questions on this flow?

      All right, so we talked about patterns. So we just looked at this. So what are the patterns that we saw here? So just to recap on some of the stuff-- what you do here, I mean, this is obvious, but there's actually a pattern for it. It's called Polling Consumer.

      So that's basically the pattern we're using here, where it's basically polling your file directory to see if there is a file. So the next pattern we saw was Idempotent Receiver. So essentially, let's say you have a file folder, and it's looking at that folder all the time. You don't want to process the same file multiple times.

      So this component, essentially, what it does is it allows you to define an ID, a correlation ID, or a message ID, which you can use to filter out messages that are already processed. So in this case, what I did was I used a combination of the filename and the timestamp. So as long as the filename and the timestamp don't change, I don't process it again.

      If I go and update that same file now, it'll process it again and send a new model to Salesforce.

      AUDIENCE: [INAUDIBLE]

      PRESENTER: Yeah, you can update it and put a new file in there.

      AUDIENCE: [INAUDIBLE] have the same name [INAUDIBLE].

      PRESENTER: You can have the same name. But it's looking at the combination of filename and the timestamp. So as long as that timestamp changes, it'll read it as a new message, yeah.
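The Idempotent Receiver logic described above can be sketched in a few lines of plain Java. A real Mule flow delegates this to an idempotent-message-filter component backed by an object store; the in-memory set here is just a stand-in to show the idea.

```java
import java.util.HashSet;
import java.util.Set;

// Idempotent Receiver sketch: the message ID is the combination of
// filename and last-modified timestamp, so a file is only reprocessed
// when its timestamp changes.
public class IdempotentFilter {
    private final Set<String> processedIds = new HashSet<>();

    // Returns true the first time a given filename+timestamp is seen.
    public boolean accept(String fileName, long lastModified) {
        String messageId = fileName + "|" + lastModified;
        return processedIds.add(messageId); // add() returns false on duplicates
    }
}
```

A second poll of the same unchanged file yields the same ID, so the filter drops it; saving the file again changes the timestamp and lets it through.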

      So the other pattern you saw here, which is common across the board, is a Message Translator, which is a common pattern that you see whenever you need to translate a message from one app to another. And then, this is basic. Again, a lot of these patterns are overlapping. And this whole flow is called a Composed Message Processor.

      So, essentially, you have a whole series of steps that are happening. And then, if any one of them fails, you have an exception strategy on what to do. In my case, I just have a Slack message. If something failed, it'll just post something in Slack, saying that, OK, this publishing failed, you have to go take a look at it, or something like that.
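A minimal sketch of that Composed Message Processor shape: a message runs through a chain of steps, and if any step throws, an exception strategy runs instead. Recording an alert string here stands in for the Slack notification in the flow above; the step names are illustrative, not the connector's real operations.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Composed Message Processor sketch: pipes-and-filters steps plus a
// simple exception strategy that records an alert on failure.
public class ComposedProcessor {
    private final List<UnaryOperator<String>> steps = new ArrayList<>();
    private final List<String> alerts = new ArrayList<>();

    public ComposedProcessor step(UnaryOperator<String> s) { steps.add(s); return this; }

    // Apply each step in order; on any failure, run the exception strategy.
    public String process(String message) {
        try {
            for (UnaryOperator<String> s : steps) message = s.apply(message);
            return message;
        } catch (RuntimeException e) {
            alerts.add("publishing failed: " + e.getMessage());
            return null;
        }
    }

    public List<String> alerts() { return alerts; }
}
```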

      All right, so any other questions on this before I go to the next demo? And again, the reason I'm showing these demos is just to highlight the point that with the connector your options are unlimited. And really, a flow like this, once you have a connector with the operations, you can set up something like this in 10 or 15 minutes, literally. I mean, it's basically drag and drop. I mean, I know some people probably could write code as fast as that. But, for most people, I think this is still a convenient way to do stuff.

      All right, so the next one is similar. Here, what I'm doing is, again, starting in Salesforce. I'm creating a project in Salesforce. I'm taking a long route here. You don't have to do it this way, but I wanted to use a queuing system to show it.

      So, essentially, I'm using this app to transform an outbound Salesforce message into an Amazon SQS message. In MuleSoft, we're basically listening for the SQS message. And then the Forge connector updates BIM 360 with the project.

      All right, so let me go back to the demo. All right, so again, it's the same thing that we looked at in the-- So you have the queuing system. Again, this message gets here from Salesforce through an outgoing message through Zapier. And then I'm doing some byte transformation stuff because it's Base64 encoded.

      So I take care of the decoding there. And then, I basically update BIM 360 with that information. Yeah, actually I called Salesforce because I don't get the entire data from the message. All I'm getting is the ID.

      So I make a call back into Salesforce to pull all the data. And then I call Forge to update BIM 360.

      So going back to Salesforce, so I just created, like, a simple object, again, in Salesforce. So let's, again, call it Forge. OK, so it basically sends it only if you select that flag. And some of those are things that you can easily tailor.

      So the good thing about queues is, I can basically post this message and it'll update the queue. My application doesn't need to be running. And if I start it later, it'll pick up from the queue. So if you're trying to do something real time, like HTTP, you need to have your system up and running. Otherwise, it'll fail.

      But since we're using a queuing system, I don't need to have my app up and running. I can just post this, and the message is already waiting in Amazon SQS. And when I come back here and I start this, it's picked up the message and processed it. So if I go into BIM 360 now, all right, we see our project updated there.
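That temporal decoupling is the core of the Message Channel pattern: a message posted while the consuming app is down simply waits on the queue until the consumer starts and takes it. A tiny sketch, using an in-memory queue as a stand-in for Amazon SQS:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Message Channel sketch: the queue decouples producer and consumer in
// time. Here the "consumer" runs after the "producer" has already
// posted, mirroring the app being started later in the demo.
public class QueueDecoupling {
    public static String postThenConsume(String message) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();

        // Producer side: post the message; no consumer is running yet.
        queue.put(message);

        // Consumer side, started later: the message is still waiting.
        return queue.take();
    }
}
```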

      So again, simplest case, it just highlights the point. But let's say I want to include Slack here. Again, all I have to do is find the Slack connector and put it in there. So any questions on this flow?

      So one other thing I want to highlight here is, essentially, when you build the connector, MuleSoft is, what they call, DataSense-enabled. So it can read your POJOs, your Plain Old Java Objects. And it can basically find out what fields it's looking for. And it can give you a drag-and-drop interface to do mapping stuff.

      So if you go and look at the transform, so I just mapped five fields. But, essentially, once you have a connector like this, this kind of mapping can be done by somebody who's not a developer. So this kind of expands the number of people who can use Forge to do integrations as well.

      So you can basically build a connector that supports a drag-and-drop transformer like this. And all the user needs to do is map the source connector to the target connector. And then the transformer automatically infers what the inputs and outputs are. You just need to drag and drop.

      Obviously, if you need to make some additional changes to the transformation, you have to do some additional stuff. I mean, there is some syntax that needs to be learned. But for simple mapping, it's really just dragging from the source to the target, nothing more than that. So any questions on this flow or anything on the MuleSoft side?

      AUDIENCE: So [INAUDIBLE] you basically set up all these integrations in some sort of, like, project. And that whole thing is running in MuleSoft [INAUDIBLE] and it's just sitting there, like, some integrations you set up on, like, timers and things every 10 minutes. They check something or they pull something. Now those are--

      PRESENTER: Event driven, yeah.

      AUDIENCE: [INAUDIBLE] And then, what is that [INAUDIBLE] in thinking in terms of, like, having regular [INAUDIBLE] web services, in terms of having this, like, whole [INAUDIBLE] integration. If I wanted to access MuleSoft in some other system, how would I do that? [INAUDIBLE] API [INAUDIBLE]?

      PRESENTER: No, in this case I did it through a queue, Amazon SQS. But you can also do it differently. MuleSoft also has an API. So they have a kind of API, which can trigger this as well. So the primary integration mechanism is event based, where you have events. And as you said, a WebHook could be one.

      So, in this case, we are pushing from the Salesforce to BIM 360. With WebHooks, now, we can push it back from BIM 360 back into Salesforce. So I can register a WebHook here. And then once somebody gets the project in BIM 360, I can update back into Salesforce.

      But, yeah, they do have a REST API that you can define. In fact, they have a pretty rich API platform where you can define your own APIs that you can use to trigger different things. I mean, you have Forge APIs already. And then you have APIs on top of it. But it kind of gives you the ability to define, like, composite APIs.

      Let's say you want to do one big transaction and you want a single API. You can basically do that. And so HTTP could be another trigger. SQS is one. Polling a file, which was the other one we did. Or it could be batch, like you said. I mean, it could be scheduled based on time, run as a batch.

      AUDIENCE: [INAUDIBLE]

      PRESENTER: Yeah, you're right. All right, so let's take a look at one of the patterns we saw here. So this is basically a Message Channel. So we are using a queuing system, Amazon SQS in this case. But it's basically a Message Channel, which is what they call it in the patterns.

      And this pattern is called Claim Check. Essentially, when Salesforce posted the information, it did not send all the data at the time. It just sent the ID of the document from Salesforce to MuleSoft. And MuleSoft is basically querying Salesforce to get all the data. So this pattern is called Claim Check, where you don't send all the data immediately. You just send a reference and then use that to pull it back.
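The Claim Check idea can be sketched in a few lines: the sender parks the full record in a store and puts only its ID on the channel, and the receiver uses that ID to claim the data back. Here a map stands in for Salesforce, which the Mule flow queries by ID in the demo above.

```java
import java.util.HashMap;
import java.util.Map;

// Claim Check sketch: lightweight reference on the channel, full
// payload fetched from the store on the receiving side.
public class ClaimCheck {
    private final Map<String, String> store = new HashMap<>();

    // Sender side: store the payload, send only the reference.
    public String checkIn(String id, String fullRecord) {
        store.put(id, fullRecord);
        return id;
    }

    // Receiver side: resolve the reference back into the full record.
    public String claim(String id) {
        return store.get(id);
    }
}
```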

      Channel Adapter-- again, a lot of these patterns are repeating. I mean, any kind of adapter that you build to integrate with the bus is called a Channel Adapter. And then, this, again, is an overarching pattern that you'll see everywhere: Pipes and Filters. Essentially, you have each layer making changes to a message as it flows through the integration. Any questions on this flow?

      All right, so with that, let's get into how you build the connector. So building the connector actually is fairly simple, which is actually a nice thing about MuleSoft. Really, all we have to do is create a wrapper with their annotations in Java, which links the client classes, the Forge API classes, into MuleSoft. And that's exactly what I have done in my stuff.

      And these wrappers are fairly simple. Each operation that I defined is basically a single method, which I need to wrap with the Processor annotation so it'll get recognized in a MuleSoft flow. And they have a toolkit called DevKit, which you can use to build these connectors.

      And then, as I said, these connectors are DataSense enabled, meaning if you define your static data models within the connector, the system will recognize that. And when you visually bring up the source and target, it will basically recognize what's the data I'm getting, what's the data I need to update. And then, you can use a drag-and-drop interface to do the mapping.

      All right, so let me show you in the studio, so this is the connector. So I'm getting the token. And you see this is the, for example, this is the method that we're using for creating the LMV model.

      And all it's doing is it's basically using the API kit. So if you guys have seen the Java API kit in GitHub, you can basically use that. It just needs this wrapper around it. So this wrapper, this annotation, ensures that MuleSoft recognizes this as an operation.

      And then, you can just use the standard Forge API kit to make your calls to perform the different operations. Have you guys used the API kit? Like, anybody use the Java stuff or no?

      AUDIENCE: [INAUDIBLE] use it, like, direct calls [INAUDIBLE].

      PRESENTER: OK. So actually, initially, I had done it through other stuff too. But now they have a standardized API kit because, if you look at it, I think they have different languages there. And it kind of uses the same pattern.

      So this code, essentially, uses the API kit. So I really didn't have to do much code at all. I mean, I kind of had to copy some of the classes here. But really I didn't even have to do that. I could have just referenced the JAR completely and just called the methods in the API kit externally.

      And all we really need here is just to get the wrapper around it, and that takes care of it. Any questions on the steps to build the connector?

      AUDIENCE: Where were your variables defined in here?

      PRESENTER: OK, so this is the connector class. And there is another class called Config. These are the inputs where you define where the client ID is. So if I go back to MuleSoft, I look at the connector. So this is where I configure this. And that is exposed in the Config.

      So they have classes when you create a project. So after you set up MuleSoft, if you say New, Anypoint Connector Project, this will basically create you a connector project, which has all the wrapper stuff that you need.

      All you have to do is add your methods there and add any other Config stuff you need. It kind of gives you a base template for building a connector. You can just go ahead and add your stuff on top of it.

      So these are the inputs. So, for example, see this is where I defined the payload. So I call this payload. And I'm basically using a POJO here. And, as I said, MuleSoft recognizes the Java object. And it knows what it needs.

      You can define whatever you want in that class. So that's your input. And then this is your output. So that's basically what you define. And when you define something as a processor, it knows that it's an operation that it needs to expose. So that's basically it. Any questions on the connector?
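A skeletal version of the connector class described above. The annotation stubs at the top stand in for Mule DevKit's real `@Connector`/`@Configurable`/`@Processor` annotations (normally imported from `org.mule.api.annotations`) so the sketch compiles on its own, and the method body is a placeholder rather than the presenter's actual implementation, which delegates to the Forge API kit.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Stand-ins for Mule DevKit's annotations so this sketch is self-contained;
// a real connector imports them from org.mule.api.annotations instead.
@Retention(RetentionPolicy.RUNTIME) @interface Connector { String name(); }
@Retention(RetentionPolicy.RUNTIME) @interface Configurable {}
@Retention(RetentionPolicy.RUNTIME) @interface Processor {}

// Skeleton: @Configurable fields surface in the connector's config
// dialog, and each @Processor method becomes an operation in the Mule
// palette with DataSense-visible inputs and outputs.
@Connector(name = "forge")
public class ForgeConnector {
    @Configurable String clientId;
    @Configurable String clientSecret;

    @Processor
    public String createLmvModel(String bucketKey, String fileName) {
        // Placeholder: the real operation calls the Forge API kit to
        // upload and translate, returning the document URN for the viewer.
        return "urn:" + bucketKey + "/" + fileName;
    }
}
```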

      So, essentially, one of the real strengths of MuleSoft is that it uses what they call a SEDA, or staged event-driven architecture, which inherently gives you more throughput than a standard serial approach or any kind of threading that you do, because this uses a queuing approach, where each stage can be processed independently. So when you're talking about cloud infrastructures and stuff like that, using this kind of model allows you to scale easily across multiple clustered environments and gives you a much higher throughput than a standard Java program or standard serial-based integrations would give you.
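The staged event-driven idea can be sketched as a two-stage pipeline where each stage owns a queue and a worker thread, so stages run concurrently instead of one call blocking the pipeline end to end. The stage names are illustrative, not Mule's internals.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// SEDA sketch: messages flow stage to stage through queues, and each
// stage's worker processes independently of the others.
public class SedaPipeline {
    public static String run(String msg) throws Exception {
        BlockingQueue<String> toTranslate = new LinkedBlockingQueue<>();
        BlockingQueue<String> toPublish = new LinkedBlockingQueue<>();
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Stage 1 worker: take from its queue, do its step, hand off.
        pool.submit(() -> { toPublish.put(toTranslate.take() + "|translated"); return null; });
        // Stage 2 worker: final step of the pipeline.
        Future<String> result = pool.submit(() -> toPublish.take() + "|published");

        toTranslate.put(msg);                       // inject a message
        String out = result.get(5, TimeUnit.SECONDS);
        pool.shutdown();
        return out;
    }
}
```

Because each stage pulls from its own queue, adding workers to a slow stage raises throughput without touching the other stages, which is the scaling property the presenter describes.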

      So the runtime can be either on premise or use the CloudHub. The community edition is free, I mean, if you want to try that. And you can pretty much do most of the stuff we talked about in the community edition.

      And I think we talked about the REST API. They actually have a fairly extensive API framework using the RAML language, which allows you to construct REST APIs, which you can use as triggers to invoke different stuff. So, again, MuleSoft is not the only one. I mean, in fact, some of these others are actually even closer to the enterprise integration patterns.

      And if you like open source stuff, these are completely free. The only thing is they don't have a nice visual editor like MuleSoft does. So you'll have to do a lot of the configuration directly in XML or directly in code. But these two are fairly excellent frameworks to use as well, which leverage the enterprise integration patterns.

      All right, so here's something interesting I saw. Here's one of the authors of the original Enterprise Integration Patterns book. So he was apparently visiting a Starbucks, and all he could think of was enterprise integration patterns. So he started seeing enterprise patterns everywhere.

      So you look at Starbucks, they have their own naming convention for stuff. And this is equivalent to a canonical data model, essentially. You have your own standard for naming data. When you get a drink, they basically put your name on there. And this is basically done for correlating your cup back to your order, in case there's a problem with your drink. And this is basically a Correlation Identifier, which is a pattern in the enterprise integrations.

      Stores like this are set up for asynchronous processing, so that they can get high throughput. So you've got multiple baristas working on your drink. Depending on when your order comes, they're all competing to pick up an order and process it. And this is, again, a pattern. It's called Competing Consumers.

      So here are some of the key takeaways. Essentially, I think if you take a pattern-based approach, you can greatly simplify enterprise integration. And by mapping Forge capabilities into a connector that can tie into an ESB, you can basically even engage non-developers in your company to develop integrations. And the event-based SEDA architecture, as we said, inherently supports higher-scale processing and gives you a lot higher throughput for your integrations. Any questions?

      AUDIENCE: So the community edition, those are [INAUDIBLE] play around with. [INAUDIBLE] community edition, free the IT department in sort of [INAUDIBLE] got to be on the server connected to [INAUDIBLE]? How does that work?

      PRESENTER: So, again, as I said, if you're, obviously, trying to open firewall ports and stuff like that, you've got to engage IT. But the community edition itself is just a Java app that can run in, like, a Jetty container or a Tomcat. It basically is a Java application that runs.

      And you can run it. And you can do flows like queuing-based workflows and stuff like that. And also, the studio has its own container.

      If you just want to get started, go ahead and download the studio. The studio has its own container. And you can start. Like, all the stuff I did right now was completely on the studio. I didn't even have a server.

      So you can do everything in the studio. Then, when you're ready to deploy, only then do you need to worry about whether you want the community edition or their CloudHub, which is their integration-as-a-service offering. Any other questions, guys?

      AUDIENCE: So [INAUDIBLE]?

      PRESENTER: So Java is the primary platform that MuleSoft is built on. And, as I said, connectors need to be written in Java. But MuleSoft itself has scripting capabilities — within MuleSoft you can script in other languages, like JavaScript or Groovy. The core platform, though, is built on Java and the Spring Framework.

      All right. Anything else? OK, thank you, all. Thanks for coming, guys.

      [APPLAUSE]
