AU Class

Vault Shining on Clouded Days


Description

Maintaining high levels of data security while ensuring optimal performance can be a challenging task for us. In early 2022, we decided to move one of our Vault Professional servers to Amazon Web Services (AWS) and it proved to be a successful transition. As a result, we plan to migrate the rest of our Vault Professional servers to AWS in 2023. During this session, we will explore how we use AWS to enable global access to our Vault Professional environment, streamline design workflows, and provide relevant data to the appropriate individuals in a timely manner.

Key Learnings

  • Evaluate Vault software's performance in AWS.
  • Discover data security concerns.
  • Explore AWS efficiencies versus on-premises solutions.
  • Learn about implementing a migration strategy.

Speakers

  • Joshua Wilson
    I am Josh Wilson, the Fusion 360 Manage Administrator at Bridgestone Americas. With a focus on data management and flow within the Autodesk Manufacturing industry, I have specialized in this field since 2011. My expertise lies in utilizing the Autodesk Vault vertical to ensure efficient data handling. Throughout my career, I have effectively implemented Vault at multiple companies, dealing with diverse levels of complexity. My primary objectives revolve around optimizing data flow, starting from the initial conception stage and extending all the way to the manufacturing and maintenance handoff. To achieve this, I rely on the powerful combination of Fusion 360 Manage and Autodesk Vault Professional to streamline the entire process.
  • Carlos Caminos
    Carlos Caminos is a seasoned BIM Professional and the Manager of the Asset Data Management Team at Bridgestone Americas. In his role, he plays a crucial part in coordinating and optimizing the flow of data from the design and engineering stages all the way through to manufacturing. Carlos is responsible for implementing software solutions, providing training, and establishing efficient workflows within the organization. With an impressive track record spanning over 25 years, Carlos has extensive experience in utilizing Autodesk software. His proficiency extends across a range of tools, including AEC Collections, Product Design & Manufacturing Collection, Vault Professional, and Autodesk Construction Cloud software. Carlos's expertise encompasses practical applications of Autodesk, Inc. products within the architecture, engineering, construction, and manufacturing industries.
Transcript

JOSH WILSON: Hi. Welcome to "Vault Shining on Clouded Days." My name is Josh Wilson. I am the Fusion 360 Manage administrator at Bridgestone Americas. My career has been spent doing data integration and data process flow within the manufacturing industry for the last 12-plus years.

CARLOS CAMINOS: And my name is Carlos Caminos. I'm manager of Vault and asset management at Bridgestone, specifically for ESS, which is Engineering Support Services. I have over 10 years experience with BIM technology in the plant design, mechanical engineering, and even construction area.

JOSH WILSON: Our learning objectives for today are to understand Vault's performance in AWS, understand the data security concerns that we had with AWS and on-prem servers, understand AWS efficiencies versus our on-prem solutions, and learn about our implementation and migration strategy.

So to give a little background, we'll go into how we had things set up before. So previous to 2022, we did have a total of six servers, six different Vault servers, each with their own instance of the ADMS, Autodesk Data Management Server piece, on a local VM hosted in our data center.

This totaled about 6 terabytes of storage between all six servers. And on top of that, we had a total of 10 on-prem VMs that hosted different AVFS, Autodesk Vault File Server solutions. This was to help speed up replication and data flow between some of our sites that were a little bit further away from our data center.

Now, with these AVFSs, we had nine of them that connected directly to our first main production server. This main server was the largest. It's the one that everybody across the United States, and North and South America, connects to. And then we had one that went to a different ADMS server for a different business unit.

Now, one thing to note is each one of these Vault servers is dedicated to a different business unit. So we can shift things around a little bit better and manage these different data solutions for each division a little bit easier.

So next, we're going to get into some of the IT security concerns that have come up over the last 5, 6 years.

CARLOS CAMINOS: With the ever-changing platforms and landscape of IT security threats nowadays, there were things IT wanted to address. They wanted to move away from on-prem servers. They wanted server access and security to be better defined, data encryption, and disaster recovery planning, of course. And these things were important to us as well, being the data management team.

So let me give you a couple scenarios. What if things went wrong in your company? And let's say your snapshots were not running correctly.

Let's say your tape backup recovery was not available. They weren't functioning right. They weren't checked.

ADMS backup, full or incremental, was not running correctly because that would wreck your restore. The Vault backups were lost or misplaced for some reason. SQL databases were corrupt.

Let's read between the lines here. These are all things that could happen at some point. And if you don't think about it ahead of time, if you don't have it planned out, these could all be real catastrophes for somebody. That's why it's important to have a DR plan. And you should test your DR plan.

Failing to do so-- I don't know how many people test their DR plan. We now have a process in place where we recover one of our data sets quarterly. It wasn't always the case. I know, in some cases, it's not even common practice. But you should have a test plan and be ready to execute it and test it regularly.
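A DR plan is only real once it is exercised. As a minimal sketch of the kind of freshness check a quarterly restore drill can start from (the backup folder layout and `Full_YYYY-MM-DD` naming here are hypothetical, not the speakers' actual setup):

```python
from datetime import datetime
from pathlib import Path

# Hypothetical layout: full backups land in dated folders under one root,
# e.g. E:\Backups\Full_2023-09-30. Names and paths are illustrative only.
def latest_full_backup_age_days(backup_root: str, now: datetime) -> float:
    """Age in days of the newest Full_* backup folder.

    Raises if none exist -- exactly the silent-failure condition
    (snapshots not running, backups misplaced) a DR check should alarm on.
    """
    candidates = list(Path(backup_root).glob("Full_*"))
    if not candidates:
        raise RuntimeError("No full backups found -- DR plan is not being exercised")
    newest = max(datetime.strptime(p.name[5:], "%Y-%m-%d") for p in candidates)
    return (now - newest).total_seconds() / 86400

def dr_check(backup_root: str, now: datetime, max_age_days: int = 8) -> bool:
    """With a weekly full-backup cadence, the newest full backup
    should never be older than about 8 days."""
    return latest_full_backup_age_days(backup_root, now) <= max_age_days
```

A check like this only proves a backup exists; the quarterly drill still has to restore it and validate the data, which is the part most teams skip.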

JOSH WILSON: So that leads us into, what did we do? After looking at some of these IT security concerns, we, as data management administrators within Bridgestone, we needed to look at solutions, look at what we could do to help protect the company and help protect the company's IP, Intellectual Property.

So we started to evaluate our options. We looked at what we had in place currently with Vault Professional on our local VMs. We looked at our current antivirus solutions and what we all had in place. That being said, we did have a little bit of a company directive, a company push, to transition some of our data to AWS to alleviate some old and retired hardware that we had in our data center.

With that, we did decide, let's do a migration with one of our Vaults just to test it out and see how it goes in a production environment. To do that, we chose our largest Vault, mainly because that's where we had the most access to our data. Entire plants across North America access this engineering and design documentation located in Vault.

Now, with that, we had to come up with a plan. We had to develop a data migration plan. What were we going to do? How were we going to do it? We did have to have some communications with our internal IT on our IT requirements for AWS.

We also had to have a communications with our AWS implementation team. Within that communication, we came up with a rough timeline. IT and us, we kind of came up with two months is what it was going to take us to initially get that data transitioned over there, which gave us plenty of time for testing.

Now, our AWS team did create a dev environment for us, a development environment, so we could get in there before we did our migration to production, so that we could test our solution and make sure it was going to be viable from our end before we transitioned an entire company's 2-plus terabytes of data in Vault over to AWS.

And then we got to the execution phase. With this, we had some great communications. We had some great help along the way. But our IT team and our AWS team built up our production environment for us. We were able to install ADMS and SQL databases.

We have two different servers for this environment, one hosting the ADMS and the other one hosting SQL. We were able to restore everything properly with communication. We had some great speeds on the server side. We were having great testing here locally in Nashville. And then we had to get some user acceptance put into place.

With that, we did reach out to some plants to make sure that users within plants were able to get in and test and still able to do their gets, their checkouts, and everything was working in an acceptable instance.

So why we chose AWS-- the first big reason is the more robust data security. Our IT team has been greatly built up with network security, which leads into the IT support. We have a lot more internal AWS personnel who are vastly more knowledgeable on this topic than I am. But they will let you know that our AWS security environment is much stricter than our on-prem solutions to this day.

But with that, it gave us a simplified maintenance. We were responsible for everything on our on-prem VMs. But transitioning to AWS, we have the support of our AWS teams to help with the maintenance of these servers.

It's also led us to some faster upgrades. With some greater speeds, greater flexibility on server resources, we were able to cut our upgrade times down significantly. Another thing was our ease of global access. This solution gave us the ability to add in additional connections, if needed, to bring in other geolocations into our AWS instances.

And the last thing was the AWS backup and recovery. AWS has a great solution for their data backup and recovery scenario, but that's not where our data backup and recovery stops. Yes, we do utilize that. But our internal IT also has their own solution that they utilize.

But on top of that, us and our DM team, we have our own data backup and recovery solution that we utilize. We're able to create this multilevel tier of validation so that in a worst-case scenario, we are always going to have a backup to go back to and recover. That way, we don't lose as much data.

With that, we just want to give a huge thank-you to IMAGINiT and Autodesk. They are our partners in all of this. Without them, we wouldn't be where we are today.

They have a vast knowledge base, and we lean on them heavily on recommendations for not just the server side, but our client-side stuff as well. Carlos, do you have anything to add?

CARLOS CAMINOS: Yes. It'd be selfish to say that we came up with the architecture ourselves. It'd be selfish to say that it was all straightforward. It required a lot of planning. It required us to have a relationship, both with our reseller-- in this case, IMAGINiT and Autodesk, in order to get their feedback, their experience, and make sure that technically everything was the way it should be.

And I'll give you a quick example of that. Originally, our cloud team recommended an architecture that we weren't familiar with. We don't know AWS in depth. So we thought we should take this back to our partners at Autodesk.

And we reviewed it with them, and it turned out that was not a solution that we would have been successful with. So we had to go back and change it and have several meetings again to make sure everybody understood why the architecture needed to be in a specific manner.

JOSH WILSON: So you might be asking yourself, how's it going? We transitioned one of our largest Vault instances up to AWS. Let's get into a little bit of it.

As for the general performance of this instance of Vault in AWS: we initially transferred 2.1 terabytes of vaulted data, and it's growing every day. I think today we're up to about 2.5, 2.6 terabytes of data.

In North America, we have an average ping rate of 37 milliseconds, which is phenomenal. We couldn't have asked for anything better. For assemblies of over 1,000 parts, our average checkin time is just about 5 minutes, and checkout is about a minute. The majority of that checkin time is us creating visualizations locally.

But you might be asking yourself, what about our AVFS servers? So since we transitioned to AWS, we've had no real need to re-implement our AVFS servers. We're seeing a significant decrease in our ping rates. Data transfer speeds are fine, where nobody's complaining about a lag or any sort of performance issues.

So we've just managed to not re-implement them, which has greatly helped us out on our ease of upgrades and ease of maintenance, because we don't have those 9 or 10 other AVFS servers that we have to worry about.

Now, the general server specs for this instance is it is running Windows Server 2019. We do have a 2.2-gigahertz AMD processor for each one of these servers. Now, this is for both our ADMS server and our SQL server. They both are running 64 gig of RAM. Our ADMS server has about 12 terabytes of total disk space.

And this is going to allow us to have multiple drive redundancies and partition things out the way that we need to for having OS on a dedicated drive, our applications on a dedicated drive, our data on a dedicated drive, and our backups on a dedicated drive. And then, like I mentioned earlier, we do have that dedicated SQL server with the similar specs other than disk space.

So with that being said, that brings us to our 2024 upgrade. Now, with the 2024 upgrade, that brought a big decision for us because we still had five servers hosted locally on VMs in our data center. So what we looked at was, do we stay on our local VMs, or do we transition these to AWS?

If we stayed on our local VMs, there were some things we had to look at, first being our current server OS wasn't going to be supported by 2024. Now we had a choice. Do we do an in-place OS upgrade? Do we spin up a new VM and transfer data? We had to go through and estimate our upgrade time based on historical information that we've kept and maintained.

Is this going to play a factor in our current backup and restore plan, our data recovery plan, our DR plan that we've already put in place for our AWS? And then we also had the data securities that IT has been talking about. Are we going to be able to meet and maintain these data security constraints and requirements that IT's given us?

Now, if we look at our transition to AWS, the first thing that we had to look at is user acceptance of our current AWS server. We haven't had much, if any, negative feedback from our users of this server in AWS. We had to estimate our upgrade time for this largest upgrade. And looking back, historically, we were significantly lower. I think we cut our upgrade time down by 2 complete days.

We already have a significant, robust backup and restore plan for our AWS environment. We're able to maintain this high-level data security that IT is pushing down on us that we need to make sure that we're maintaining. And then the other thing that it's going to give us is the ability to have this dedicated dev environment for any additional testing we might want to do.

So with that, we did decide to go to AWS. We were going to transition all of our on-prem VMs to AWS. But to do so, we did have to go through a security controls assessment. Carlos?

CARLOS CAMINOS: Yes. This is one of my favorite topics here. This was probably one of the most difficult experiences that I've had to go through, and I've gone through a couple in my life. I had implemented other concepts before.

But you have to understand the risks here, and the company clearly understands the risk. We are asking to load our most important data to the cloud, right? And AWS and cloud technology is relatively new to everyone.

It's not necessarily a comfortable thing to even speak about in some companies. So let's go through some of the things that we had in place already.

So we had covered already, of course, if you use Vault, you can control permissions, documented access to servers. We implemented a two-factor authentication. We managed our own vulnerabilities and patch management and so forth, the regular things that, to Vault users, are standards, right?

But there's several other items I'm going to mention here. The two highest on the list were-- one was encryption while in transit, and the other one was data scraping. So in these meetings, we talked about a lot of things-- many, many types of securities, you know, who's accessing, how is that being controlled.

We needed to prepare documentation. We needed to write documentation for a business requirement for us to have admin rights. So this meant we needed to control admin rights to Vault, very tight admin rights. So there's only very few people that have the ability to move files or even delete files. Deleting files is a no-no, right?

And then there was a lot of things there that we weren't aware of. But we were committed to it, right? So we wanted-- since we were going to be the first, highest-ranked data going up on the cloud, we wanted to make sure we met the requirements that cloud security asked of us.

And they knew and we knew that we weren't sure if we could achieve those. But luckily, there was a lot of collaboration and internal support-- there's a lot going on. Since we now have a cloud security team, that means we have an AWS team. That means we have other technologies being developed for the data that's being stored on the cloud.

So it turned out we have a service internally that does data scraping. It monitors everything that gets checked in, checked out, who it is, the size of the data, 100% of the time. It didn't affect us. It just meant that we needed to make sure we got on their list.

We gave them the information they needed, and that was really a flip of the switch. There was some testing, since we were one of the few. But this was a solution they provided already.

I do have to warn you, this process didn't happen overnight. It probably took us early Q1 to, like, mid-Q2 to get fully approved. We sat in several large meetings where it's like the gauntlet, you know, and people recognizing that Vault and what we provided was critical for the company. But also, that doesn't mean that we turn away or close our eyes to some other vulnerabilities that might exist.

So we addressed those upfront. And luckily, if there were 10 requirements, we met all 10, which is something that got published internally. It was a great, satisfying experience at the end of it.

And thanks to Josh, as my teammate with this-- I really just spoke more. He's really more the technical support person here. But we were able to get this through.

JOSH WILSON: So now let's talk a little bit about our transition to AWS for these remaining five servers. In doing so, we had to come up with a plan. We made our decision back in Q4 of 2022 that we were going to transition to AWS. But we had a plan B. We still had our on-prem servers, and we were going to keep them up until we were able to get all of our stuff to AWS.

And then, in 2023 Q1, we started our requirements definitions. This is where we started having these discussions with our internal IT and our AWS team, which led to our security controls assessment that Carlos just talked about. With this, our requirements were based off of the 2024 system requirements that came out from Autodesk in early April.

From there, we were able to have some more advanced discussions with our AWS team, and we started our testing in Q2 of 2023. We were able to get a dev environment set up in AWS for each one of these servers. We had a dedicated server for each business unit that we were able to install the ADMS and SQL on, these being a smaller subset of servers.

And then, from there, we were able to get some testing done internally within the data management team. And then we expanded that out a little bit to our ESS team, our entire department, to make sure that everybody in our department was seeing the same results we were.

Once we got some of that feedback, we expanded our testing out a little bit further to some individuals located in our manufacturing facilities across North America. We got some great feedback from them, and we decided it's time. We're ready to start creating our implementation strategy and what we were going to do.

From here, we were able to have some more discussions with our internal group and our IT team to develop this plan and how we were going to implement things. Now, we did come up with our dev testing server requirements. And this is what we defined to our team, to our internal IT and AWS team.

We wanted to go with a Windows Server 2022 data center OS. We ran the same 2.2-gigahertz AMD processor. We ran 32 gigs of RAM across all machines. And then we had disk space varying for each one of these servers, but with the same general configuration. We had four drives for dedicated information.

Our C drive is going to be our OS. Our D drive is going to be where we're going to install all of our apps. Our E drive is our data drive. And F would be backup.

But within these dev servers, there were some things that we installed and we wanted to test before we transitioned that to our production environment, first thing being our SSL configuration. We did not have SSL set up before. So we had to create our certificates.

And we initially created self-signed certificates. Then we also had some server performance questions we wanted to make sure we got answered, that the throughput was going to be fine, that the servers that we had specced out were going to be sufficient for the amount of users and data we were going to host on there.

And then lastly is our client connections. With this being AWS, we wanted to make sure that everybody was able to hit these servers, including the new SSL configuration. With those self-signed certificates for our dev testing environment, we had to go through and manually install these SSL certs.

How we have AWS laid out is-- I feel like it's pretty simple, but it's a complex, secure solution for us. We have our client connections on the left-hand side there.

Those clients connect directly to our internal WAN. That WAN connects to our firewall that's located in our data center. And then that firewall is set up with a direct connect to our private subnet hosted in AWS. So we are able to have this secure connection from internal network directly to our private subnet inside of AWS.

So now that we've got all of our dev testing out of the way, let's talk about our production migration plan. We have our testing complete. We were ready. We were comfortable.

We had our servers. Our dev servers were up and ready. They just needed to be transitioned over to production.

We had everything in place that we thought we needed. Then we had our final meeting with our AWS implementation team, and they threw a little curveball at us. They let us know that they couldn't just do a migration of these dev servers over to a production environment.

So they had to create and spin up new production-grade servers for us. And with that, we want to emphasize communication. Communication in any form, whether it's written or verbal, needs to be clear, effective, and efficient.

And not having this clear communication up front led us to this little bit of a miscommunication, which we could have avoided. So I just want to emphasize making sure that we are communicating with everybody involved within a project, within a migration, within an upgrade, anything, that everybody's aware of what's going on.

So now let's get to our actual production migration. We had our new production servers-- our AWS instances-- spun up. We had about two weeks to get the new ADMSs installed, to get the new SQL instances installed, to make sure that we got SSL set back up properly, and that we were able to get backups taken and then moved over.

The one thing I will note with our SSL configurations is after talking with some more IT groups within Bridgestone, we found a better solution as opposed to a self-signed certificate. So our IT was able to create a more advanced and secure SSL certificate for us, and they were able to push that out through group policy. That way, we didn't have to go through and push out these self-signed certificates anywhere.

But what we did for our backups was we had weekly full backups running. We ran a full backup on Saturday. And then throughout the week-- Monday, Tuesday, Wednesday, Thursday-- we ran incrementals.

Now, we took those after they completed and copied those to their respective AWS servers. And once we had all backups restored, or copied up to the production servers, the new AWS production servers, we needed to disable the on-prem VMs to make sure that nobody accidentally got into one of those legacy machines and made any changes to data that was no longer active.
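The cadence above-- one full backup on Saturday, incrementals the rest of the week, each copied up once it completes-- can be sketched as a tiny scheduling helper. The cadence is from the talk; the function itself is purely illustrative:

```python
from datetime import date

# Weekly full backup on Saturday, incrementals on all other days,
# matching the schedule described in the talk. Illustrative only --
# the actual jobs were ADMS backups copied to the AWS servers.
def backup_type(d: date) -> str:
    # date.weekday(): Monday == 0 ... Saturday == 5, Sunday == 6
    return "full" if d.weekday() == 5 else "incremental"
```

The practical point of the split is restore cost: a restore needs the last full backup plus every incremental since, so the weekly full bounds how long a worst-case restore chain can get.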

So what we did was we just disabled IIS, and we disabled the SQL instance. This gave us a little bit of security that the general user isn't going to be able to get in and make any changes. But if we needed to get in, we could jump back into these servers, re-enable these services, and get in and get any information we need to.

Now we're on to our AWS restore. We spent the whole week getting all of our backups copied up to the servers. And at 5 o'clock on a Friday, we decided, we're shutting everything down, and we're going to start our restore process.

With that, we have a great team with us here. And we were able to have all five servers restored, up and running, tested, and completely validated by noon on Sunday. I think that's a great feat for all of these, and that also includes our main production server's in-place upgrade to 2024.

Now, with that all being said, we have some key takeaways we want to go over. First is understand your current environment and needs. AWS might not be the right solution for you, but it was for us. So take and evaluate what you currently have in place. See if your security requirements fit your security needs.

Also, evaluate the cost versus benefit for your situation. AWS can be a pricey solution. So make sure that you are doing that evaluation yourself to make sure it's the right solution for you.

Another one here is going to be test your DR plan. If you don't have a Vault DR plan in place, please put one in place. If you do have one, make sure you're testing it regularly, because you never know when something will go wrong. And you want to make sure you understand what it is you have to do and what your company needs to do to get your data restored and be back in production as fast as possible.

Have clear, direct, and efficient communication with anybody involved within some of these migration projects. These can be very complex migrations. So make sure that you are communicating with everybody, letting everyone know what's going on, when it's going on, timelines, and when everything is needed by.

Last thing is Vault Professional works in AWS. It's working for us as this solution.

So let's connect. Up on the board here, we do have some QR codes that go directly to our LinkedIn. Feel free to connect with us. If you have any questions, reach out. We'll be more than happy to have additional conversations with you. Carlos, do you have anything to add?

CARLOS CAMINOS: Yeah, again, to support what Josh just said, communication is key here in all different aspects. Surround yourself with a good team. Make sure you have relationships with the resellers and with your vendor. Make sure that's a constant dialogue.

Reach out. Network. This is part of the reason why we're here, right? This is why we continue to come to AU. Feel free to send us an email or a message.

We're very active. Josh and I are both very active on social media. Feel free to reach out to us.

JOSH WILSON: And with that, thank you for joining us today. This has been "Vault Shining on Clouded Days."

我们通过 UserVoice 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. UserVoice 隐私政策
Clearbit
Clearbit 允许实时数据扩充,为客户提供个性化且相关的体验。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。Clearbit 隐私政策
YouTube
YouTube 是一个视频共享平台,允许用户在我们的网站上查看和共享嵌入视频。YouTube 提供关于视频性能的观看指标。 YouTube 隐私政策

icon-svg-hide-thick

icon-svg-show-thick

定制您的广告 – 允许我们为您提供针对性的广告

Adobe Analytics
我们通过 Adobe Analytics 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Adobe Analytics 隐私政策
Google Analytics (Web Analytics)
我们通过 Google Analytics (Web Analytics) 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Google Analytics (Web Analytics) 隐私政策
AdWords
我们通过 AdWords 在 AdWords 提供支持的站点上投放数字广告。根据 AdWords 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AdWords 收集的与您相关的数据相整合。我们利用发送给 AdWords 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AdWords 隐私政策
Marketo
我们通过 Marketo 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。我们可能会将此数据与从其他信息源收集的数据相整合,以根据高级分析处理方法向您提供改进的销售体验或客户服务体验以及更相关的内容。. Marketo 隐私政策
Doubleclick
我们通过 Doubleclick 在 Doubleclick 提供支持的站点上投放数字广告。根据 Doubleclick 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Doubleclick 收集的与您相关的数据相整合。我们利用发送给 Doubleclick 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Doubleclick 隐私政策
HubSpot
我们通过 HubSpot 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。. HubSpot 隐私政策
Twitter
我们通过 Twitter 在 Twitter 提供支持的站点上投放数字广告。根据 Twitter 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Twitter 收集的与您相关的数据相整合。我们利用发送给 Twitter 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Twitter 隐私政策
Facebook
我们通过 Facebook 在 Facebook 提供支持的站点上投放数字广告。根据 Facebook 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Facebook 收集的与您相关的数据相整合。我们利用发送给 Facebook 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Facebook 隐私政策
LinkedIn
我们通过 LinkedIn 在 LinkedIn 提供支持的站点上投放数字广告。根据 LinkedIn 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 LinkedIn 收集的与您相关的数据相整合。我们利用发送给 LinkedIn 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. LinkedIn 隐私政策
Yahoo! Japan
我们通过 Yahoo! Japan 在 Yahoo! Japan 提供支持的站点上投放数字广告。根据 Yahoo! Japan 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Yahoo! Japan 收集的与您相关的数据相整合。我们利用发送给 Yahoo! Japan 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Yahoo! Japan 隐私政策
Naver
我们通过 Naver 在 Naver 提供支持的站点上投放数字广告。根据 Naver 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Naver 收集的与您相关的数据相整合。我们利用发送给 Naver 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Naver 隐私政策
Quantcast
我们通过 Quantcast 在 Quantcast 提供支持的站点上投放数字广告。根据 Quantcast 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Quantcast 收集的与您相关的数据相整合。我们利用发送给 Quantcast 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Quantcast 隐私政策
Call Tracking
我们通过 Call Tracking 为推广活动提供专属的电话号码。从而,使您可以更快地联系我们的支持人员并帮助我们更精确地评估我们的表现。我们可能会通过提供的电话号码收集与您在站点中的活动相关的数据。. Call Tracking 隐私政策
Wunderkind
我们通过 Wunderkind 在 Wunderkind 提供支持的站点上投放数字广告。根据 Wunderkind 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Wunderkind 收集的与您相关的数据相整合。我们利用发送给 Wunderkind 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Wunderkind 隐私政策
ADC Media
我们通过 ADC Media 在 ADC Media 提供支持的站点上投放数字广告。根据 ADC Media 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 ADC Media 收集的与您相关的数据相整合。我们利用发送给 ADC Media 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. ADC Media 隐私政策
AgrantSEM
我们通过 AgrantSEM 在 AgrantSEM 提供支持的站点上投放数字广告。根据 AgrantSEM 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AgrantSEM 收集的与您相关的数据相整合。我们利用发送给 AgrantSEM 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AgrantSEM 隐私政策
Bidtellect
我们通过 Bidtellect 在 Bidtellect 提供支持的站点上投放数字广告。根据 Bidtellect 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bidtellect 收集的与您相关的数据相整合。我们利用发送给 Bidtellect 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bidtellect 隐私政策
Bing
我们通过 Bing 在 Bing 提供支持的站点上投放数字广告。根据 Bing 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bing 收集的与您相关的数据相整合。我们利用发送给 Bing 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bing 隐私政策
G2Crowd
我们通过 G2Crowd 在 G2Crowd 提供支持的站点上投放数字广告。根据 G2Crowd 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 G2Crowd 收集的与您相关的数据相整合。我们利用发送给 G2Crowd 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. G2Crowd 隐私政策
NMPI Display
我们通过 NMPI Display 在 NMPI Display 提供支持的站点上投放数字广告。根据 NMPI Display 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 NMPI Display 收集的与您相关的数据相整合。我们利用发送给 NMPI Display 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. NMPI Display 隐私政策
VK
我们通过 VK 在 VK 提供支持的站点上投放数字广告。根据 VK 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 VK 收集的与您相关的数据相整合。我们利用发送给 VK 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. VK 隐私政策
Adobe Target
我们通过 Adobe Target 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Adobe Target 隐私政策
Google Analytics (Advertising)
我们通过 Google Analytics (Advertising) 在 Google Analytics (Advertising) 提供支持的站点上投放数字广告。根据 Google Analytics (Advertising) 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Google Analytics (Advertising) 收集的与您相关的数据相整合。我们利用发送给 Google Analytics (Advertising) 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Google Analytics (Advertising) 隐私政策
Trendkite
我们通过 Trendkite 在 Trendkite 提供支持的站点上投放数字广告。根据 Trendkite 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Trendkite 收集的与您相关的数据相整合。我们利用发送给 Trendkite 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Trendkite 隐私政策
Hotjar
我们通过 Hotjar 在 Hotjar 提供支持的站点上投放数字广告。根据 Hotjar 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Hotjar 收集的与您相关的数据相整合。我们利用发送给 Hotjar 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Hotjar 隐私政策
6 Sense
我们通过 6 Sense 在 6 Sense 提供支持的站点上投放数字广告。根据 6 Sense 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 6 Sense 收集的与您相关的数据相整合。我们利用发送给 6 Sense 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. 6 Sense 隐私政策
Terminus
我们通过 Terminus 在 Terminus 提供支持的站点上投放数字广告。根据 Terminus 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Terminus 收集的与您相关的数据相整合。我们利用发送给 Terminus 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Terminus 隐私政策
StackAdapt
我们通过 StackAdapt 在 StackAdapt 提供支持的站点上投放数字广告。根据 StackAdapt 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 StackAdapt 收集的与您相关的数据相整合。我们利用发送给 StackAdapt 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. StackAdapt 隐私政策
The Trade Desk
我们通过 The Trade Desk 在 The Trade Desk 提供支持的站点上投放数字广告。根据 The Trade Desk 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 The Trade Desk 收集的与您相关的数据相整合。我们利用发送给 The Trade Desk 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. The Trade Desk 隐私政策
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

是否确定要简化联机体验?

我们希望您能够从我们这里获得良好体验。对于上一屏幕中的类别,如果选择“是”,我们将收集并使用您的数据以自定义您的体验并为您构建更好的应用程序。您可以访问我们的“隐私声明”,根据需要更改您的设置。

个性化您的体验,选择由您来做。

我们重视隐私权。我们收集的数据可以帮助我们了解您对我们产品的使用情况、您可能感兴趣的信息以及我们可以在哪些方面做出改善以使您与 Autodesk 的沟通更为顺畅。

我们是否可以收集并使用您的数据,从而为您打造个性化的体验?

通过管理您在此站点的隐私设置来了解个性化体验的好处,或访问我们的隐私声明详细了解您的可用选项。