AU Class

Water Project Lifecycle Connected Data with Civil 3D, GIS, Dynamo, Python, and IFC


Description

This class will showcase a connected data workflow used to deliver large water and sewer infrastructure projects from design to commissioning. This is accomplished with systematic use of Civil 3D software and ArcGIS integration, asset data governance, and Industry Foundation Classes (IFC) for digital-twin commissioning. The sewer and water designs are first developed in Civil 3D. Dynamo and Python are used to attach a class library as property set data and to generate a spreadsheet model database. The spreadsheet is linked to an Esri geodatabase through geoprocessing and Python. This geodatabase is updated with field-collected attribute data and as-built data via Esri Field Maps. The computational process is then reversed to update the spreadsheet and subsequently the Civil 3D model. The model is delivered as-built, and an IFC version containing the asset data is exported for the owner's digital-twin ecosystem.

Key Learnings

  • Learn how to combine Civil 3D, Dynamo, Python, ArcGIS, and Esri products.
  • See how effective project data governance can facilitate asset management delivery.
  • Learn about automation with Dynamo and Python, including automating model-data manipulation using computational design methods.
  • Learn how to facilitate collaboration and information sharing between stakeholders.

Speakers

  • Michael Desilets
    Michael Desilets is a geospatial professional who has been involved in the energy, environmental, and transportation sectors for the last 27 years. Most recently, Michael has been a technical director for large geospatial design and asset management projects across North America, including the Trans Mountain Pipeline, renewable zoning bylaw reviews for several municipalities, and the Long Island Rail Road (LIRR) Third Track Expansion. Most of Michael's career has focused on the design of management tools that support the integration of complex geospatial data into enterprise information systems. He has designed or been directly involved in the execution of several systems supporting regulatory compliance, engineering studies, and linear corridor analysis. Early in his career, Michael was involved in the design and execution of predictive ecological mapping programs that integrated first-generation machine learning techniques.
Transcript

JUSTIN RACELIS: Welcome, everyone, to CES600175. This is the water project lifecycle: connected data with Civil 3D, GIS, Dynamo, Python, and IFC. So I'm Justin Racelis. I'm an infrastructure digital practice manager with Stantec. I help drive a lot of the initiatives related to design automation, computational design, and emerging technologies within the infrastructure business at Stantec. And hey, Mike, do you want to introduce yourself real quick?

MICHAEL DESILETS: Thanks, Justin. Yeah, my name is Mike Desilets. I've been working with Stantec for the better part of 15 years. And I'm a geospatial business strategist. Basically, that's dealing with information systems and all types of geospatial technologies. Happy to be here today.

JUSTIN RACELIS: So safe harbor statement. I believe I'm supposed to say do not make any purchasing decisions based on any content that we provide here, et cetera. So customary at Stantec is to give something that we call a safety moment at the beginning of every presentation.

So I just want to tell everyone to be safe when booking rideshare or participating in any kind of rideshare service. You may be coming from some event and think that you're safe because you're in a vehicle, and that may not be the case. You may need to intervene and direct or implore the driver to do a certain thing. Otherwise, you may end up in a situation such as getting involved in a car accident while in a rideshare. Here in Vegas, people are going to be moving around all the time, so just stay safe out there.

All right, so to give a high-level overview of what we're going to be discussing in this presentation: we want to show how you can transmit data between various platforms to create a connected data system within a software ecosystem that is not inherently connected. We can do a lot of things to push and pull data and make programmatic, intelligent decisions with our model and attribution based on any number of mechanisms.

So firstly, with design data development, for this example project that we'll showcase, we will be utilizing an information model that was used for the design and documentation. And Mike, do you want to explain a little bit about the geodatabase?

MICHAEL DESILETS: Yeah, thanks, Justin. So we're going to go through the evolution of some of the change management scenarios that people often find in dealing with this lifecycle situation. As we have moved through this a couple of times on some projects, the structures of relational databases definitely have their place. But we've also found that there need to be more fluid ways of moving information through the lifecycle in addition to the actual design process, from the beginning all the way to the end in delivery. So we were looking at an enhanced strategy for using relational or geodatabase technology.

JUSTIN RACELIS: Yeah, thanks, Mike. So with that, and the information model that we're utilizing for the design, we're able to connect field data with the design model. And we're going to show how this can be done using any number of techniques. Most importantly, this streamlines a lot of the post-delivery operations and maintenance requirements that we tend to see on a lot of projects these days.

And in order to protect the information of some of our clients who may have security concerns, we're going to be showing a workflow that's derived from a number of things we're doing across multiple projects, but in this case applied to a sample project that was constructed some time ago and designed by Stantec. This is just to give an idea of how one could use these concepts to develop a connected data environment with a similar type of project, or maybe even a completely different type of project.

So yeah, just to give an idea of the technology stack that we're working with here. We have some of the Autodesk products-- Civil 3D, Navisworks, Dynamo-- as you'd expect from a design environment. Then for data management and attribution, we're using the ESRI suite and Excel tables when needed. And for collaboration, we're utilizing a lot of computational methods through Python and packaging information with GeoJSON and IFC.

So this is that sample project that I had mentioned. It's just a small segment of a very large sewer network, right in front of the World Trade Center. So this is something that, again, Stantec had worked on quite some time ago. You can see it's a very large chamber connected to some pipes and some manholes.

And right here on the right side, you can see a lot of attribution has been assigned to these models. So what we're going to be showcasing here is what you can do with, in this case, Civil 3D property sets. If you were working in Revit, this would be extensible storage.

The point is any way that you can use custom user-defined attributes to match the client requirements for their asset schema. And up here on the left side, that's the original plan set, alongside an image taken right from the model, so that we can see a little bit of the context here.

So what we're doing with Dynamo, essentially, is taking all of this information and packaging it in a way that it can be read as GeoJSON by platforms that require the GeoJSON schema to be followed to a certain extent. So the first thing that we're doing is grabbing the pipe objects. And then some user inputs are needed, such as the target location of the GeoJSON files and the list of the property names that were developed for the pipe network.

And then with Python, we're going to be doing a bunch of fun stuff with it. As I mentioned, GeoJSON performs a very central role in what we're doing because, effectively, as we're going to show over the next few minutes, we've developed a Python interpreter between Civil 3D pipe networks and GeoJSON. And that same concept could be applied to quite a number of things.

But in this case, we're working with pipe networks. And GeoJSON is used extensively within the industry. And Mike, do you want to give maybe a little bit of background on how this gets used on the geospatial side?

MICHAEL DESILETS: Absolutely, Justin. Yes, one of the good things that we had worked a lot on was looking at different types of proprietary formats. And there's been a lot of work between Autodesk and ESRI to deal with the integration there. So with doing that, looking at the kind of open data structure that JSON format had put out there, some of the more complex topology types that people depend on looked to be finally evolved within the GeoJSON format.

It was great because it's nice and lightweight. And kind of really works with a lot of the different platforms that we're dealing with. And instead of looking at how to fit and squeeze data from one design environment to the next, we ended up finding that this GeoJSON format, not only did it carry the rich attribution in a simple structure. The geometry was also able to be packaged. So very, very elegant solution. Yeah, thanks, Justin.

JUSTIN RACELIS: So yeah, as I mentioned before, we've effectively built a translator with Python. And it's very flexible. Logically speaking, there are any number of ways you can think of to take civil object data and instantiate it within the GeoJSON schema. So I'll just step through some of the functions that we had to build in order to get this workflow up and running.

So the first of the main things we needed to accomplish is just to identify the pipe network objects-- look at what's in the model, identify the objects that actually need to be transmitted via GeoJSON, and ignore the rest. So that's really just parsing through lots of information.

The next is to create the GeoJSON itself and then populate it with the geometric and property data. We did that through a number of functions that we've defined in Python. The first just needs to look through the object properties and values and collect them as dictionary keys and values.

And this portion right here was a little bit interesting because, while we are using property sets to represent the asset attributes, there are a number of object properties that are not property sets that need to be transmitted as well. So we needed to look through both of those and figure out which ones are necessary and need to be transmitted.

And next, following that GeoJSON schema, we needed to build the GeoJSON header, create the feature objects themselves, and then go through all of the Civil 3D objects that have been identified, format them, and populate them as objects within that GeoJSON feature collection.
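As a rough sketch of those steps: the packaging of one object's geometry and collected property dictionary into a GeoJSON Feature, and the wrapping of all features in the FeatureCollection header, might look like the following. The object name, coordinates, and property values here are illustrative, not taken from the project.

```python
import json

def make_feature(obj_name, coords, props):
    """Package one pipe object's polyline geometry and its collected
    property-set key/value pairs as a GeoJSON Feature."""
    return {
        "type": "Feature",
        "geometry": {"type": "LineString", "coordinates": coords},
        "properties": {"Name": obj_name, **props},
    }

def make_feature_collection(features):
    """Wrap the features in the GeoJSON header object."""
    return {"type": "FeatureCollection", "features": features}

# Example: one pipe with two property values attached (made-up data).
pipe = make_feature(
    "Pipe-(1)",
    [[583912.1, 4506920.4], [583940.7, 4506933.2]],
    {"Material": "RCP", "InnerDiameter": 1.5},
)
doc = make_feature_collection([pipe])
geojson_text = json.dumps(doc, indent=2)
```

In the actual workflow, the equivalent of `make_feature` would be called once per Civil 3D pipe or structure collected by the Dynamo graph.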

And the result is just between these two right here: you're selecting the model in Civil 3D, and that's the original set of properties that I've highlighted here on the left side. That's significant because we're going to keep revisiting these particular properties throughout the rest of the presentation. On the right side, this is just the GeoJSON opened up in a text editor.

So you can see the correlation between the two-- the property sets that we needed have been exported and are mirrored inside of that GeoJSON format. The end goal of doing this is collecting all this information in GeoJSON so we can go to the field and do something with it. That's our goal: we want to take the model and use it with the ESRI field apps and utilize some of the geospatial technologies that we have available to us. Mike, do you want to take over and talk about the geodatabase management?

MICHAEL DESILETS: Yeah, that sounds great. So as I mentioned before, we understand the power of relational database technology. ESRI relies on geodatabases for its rigid geometric management and attribute management. One of the things we had to do is respect a little bit of that while moving the information from the design environments up into the field collection environments, as well as looking at things within the GIS.

We decided to move things through REST services, and we were able to do that easily with the GeoJSON import functions from ESRI. If you want to move to the next slide, Justin. We found that we're really at a time where REST services, combined with the GeoJSON structure, have enabled accessibility to such a wide variety of technologies that, by moving data into the REST service from the GeoJSON, we're now able to put it in the hands of a wider group of users on a project-- handheld devices, desktop applications, web applications-- really a lot of independence and freedom there.

And this is just part of the evolution of client services and database technology into web-centric patterns, as the graphics are showing. What that really means is we're not using proprietary formats anymore, or one-size-fits-all applications or platforms. And that's increasing our collaboration and giving us efficiencies in the data movement. Again, that's what you've been detailing, Justin-- how we move the geometry and the attribution in the simplest, most lightweight format. If you can move to the next slide, I'll briefly touch on the quick functions available in the ESRI product.

The idea, if you haven't used REST services before, is that they've been around almost longer than the WFS standards, and they are a very powerful way to make sure rich geometry is usable in a variety of formats. ESRI's cloud services product, the ArcGIS Online environment, which runs some of these lightweight mobile applications, relies on REST services. So moving data from Civil 3D up to REST services is a very easy process with the GeoJSON format.

Yeah, so then what we do is we deal with web maps and web applications that allow a user to go out in the field and edit information based on observed measurements. And one of the things to be very aware of is that if you're doing spatial proximity measurements, make sure you're using the right type of instrumentation and the right type of survey professionals. Thanks, Justin.

JUSTIN RACELIS: Oh yeah, sorry. So in this case, what we're showing here with the survey team is emulating a workflow in the field where, for example, a surveyor would come out during the as-built data collection phase of a project and shoot the point here. For instance, if we needed to find a point that represented the insertion of a structure, and that varies from what was designed, the surveyor would traditionally have provided that as red lines on a sheet and then had somebody in the office draft up the changes in the plan.

But in this case, we're utilizing the ability to capture this information in the field. So rather than that information being handwritten into some type of notepad or on a piece of paper, here we can just input it directly into ESRI Field Maps. So yeah.

MICHAEL DESILETS: Maybe I'll just add on that as well: these devices had typically been used for more advanced survey-grade exercises. We can now put some of the data collection that's more relevant to a wider group of project team members onto smaller, more affordable devices, and still capture a lot of attribution that might not be in scope for survey-based operations. Thanks, Justin.

JUSTIN RACELIS: Yeah, sure. So here are just some of the points that get modified via the ESRI app. On the left side, that's the insertion point, the x and y, and then the updated values on the right side. This is, again, kind of emulating that surveyor as-built workflow of collecting the insertion points.

And then here, these are some attributes that, again, we highlighted earlier when we selected the structure-- or the chamber element-- in the model. In this case, these are some standard design criteria that are typical of New York City sewer chamber design. So anyone who's familiar with the area would kind of recognize this.

So the idea is that the as-built of the chamber may have some slight variance from the original design. So if you want to capture those new parameters that determine the shape-- or the geometry-- of the chamber, that can be done as well, collected effectively as metadata through ESRI Field Maps.

And here, moving on to modeling the as-built-- essentially bringing that information back from the field. So as we talked about before, we took all of that information in the field and updated the web map with it. And in this form, all we're doing is updating simple point and line geometry in the map.

But because we're utilizing the attribution to transmit the as-built changes to inform the designer, we're no longer dealing with heavy models and lossy conversions. It's just what changed. And then we're looking through what changed and using the Civil 3D functionality to update that.

So this is the JSON that we get back. And one of the interesting things here is that we encountered a limitation with the web services in ESRI in terms of the attribute field name character length. So even though a character limit of 255 is actually supported, in this case it's only 31. So we did have to adjust for that a little bit in our workflow.

So what we ended up doing was basically building that into the script, so it would understand that some of these attribute fields got truncated and correct for that. Now we're basically just doing the reverse of the workflow from the beginning: we took the pipe objects, identified the ones being used in this particular scenario, grabbed the updated GeoJSONs, and then used Python to look through them and understand what has actually changed.
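One way to sketch that truncation fix (the field names here are hypothetical, and this assumes the service simply cuts names at 31 characters and that the cut names remain unique, as they did in this schema):

```python
def build_truncation_map(full_names, limit=31):
    """Map each truncated field name back to its full property name."""
    mapping = {}
    for name in full_names:
        truncated = name[:limit]
        if truncated in mapping:
            raise ValueError(f"Ambiguous truncation: {truncated}")
        mapping[truncated] = name
    return mapping

def restore_properties(props, trunc_map):
    """Rename truncated keys in a returned feature's properties;
    names that were never truncated pass through unchanged."""
    return {trunc_map.get(k, k): v for k, v in props.items()}

# Hypothetical schema: one long name that gets cut, one short one.
full = ["StructureInsertionPointElevation_Design", "Material"]
trunc_map = build_truncation_map(full)

raw = {full[0][:31]: 12.5, "Material": "RCP"}   # as returned by the service
fixed = restore_properties(raw, trunc_map)
```

The map is built once from the known full property-name list, then applied to every feature that comes back from the field.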

So here, going through some of the functions in our Python workflow for this side of the process: first, loading the GeoJSONs and grabbing those feature dictionaries, which is the way they're contained in GeoJSON. Then it looks at every single structure. And this is something that's going to be specific to Civil 3D pipe network elements.

Even though, technically, when you change the insertion point of a structure it's going to move the pipe as well, inputting the updated coordinates for the structure in the field wouldn't require us to also add the changes to the pipe, because we're going to handle that on the Civil 3D side.

So it's going to look at all the structures and determine if the insertion point is different. It's going to grab that structure and then move it based on the updated coordinates listed in the attributes in the GeoJSON. And then it's going to do the same thing, but looking at the properties.

So it finds all the properties that have changed. And if they have changed, we assume in this case-- because we're going back to the model from GeoJSON-- that those are newer, and those are the ones that need to be revised. So once we identify those properties that need to be changed, we go through all the objects and apply those functions to all of those Civil 3D objects.
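The comparison logic described above can be sketched as two small checks: one for a moved insertion point, one for changed property values. The tolerance, coordinates, and property names below are made up for illustration; in the real workflow, the "model" side comes from the Civil 3D API and the "field" side from the returned GeoJSON.

```python
def insertion_changed(model_pt, field_pt, tol=1e-3):
    """True if the field-collected insertion point differs from the
    model's beyond a tolerance (units follow the model coordinates)."""
    return any(abs(m - f) > tol for m, f in zip(model_pt, field_pt))

def diff_properties(model_props, field_props):
    """Return only the properties whose field value differs. The
    GeoJSON coming back from the field is treated as newer, so these
    are the values to push back into the Civil 3D property sets."""
    return {k: v for k, v in field_props.items()
            if k in model_props and model_props[k] != v}

# Illustrative model-vs-field comparison for one structure.
model = {"insertion": (583912.10, 4506920.40),
         "props": {"Material": "RCP", "RimElevation": 12.10}}
field = {"insertion": (583912.42, 4506920.15),
         "props": {"Material": "RCP", "RimElevation": 12.35}}

moved = insertion_changed(model["insertion"], field["insertion"])
changes = diff_properties(model["props"], field["props"])
```

For this example the structure would be moved to the field coordinates, and only `RimElevation` would be rewritten; unchanged properties like `Material` are left alone.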

Yeah, so this is just a quick video showing the actual script running-- finding those changes and importing them. And those highlighted properties, again, are what we're using in our example here. You can see that those have been modified. So not only did the structure shift, but the properties that were captured in the field also came through.

So this is kind of indicative of what we can do with this type of workflow: just transmitting parameters through GeoJSON and then utilizing the capabilities of the native information modeling platform to make those changes for you, so that, again, you don't deal with lossy model file format conversions.

And lastly, what do we do with that information now? We have this updated model in Civil 3D. At this point in the project, you need to deliver either an as-built or archival level of model information for the client. The client may have requirements for IFC 4.3, or they may need a geodatabase, which a lot of our clients do.

And the useful thing about this workflow is that the geodatabase is inherently part of it. So just by instantiating this process in the first place, we have already effectively created that geodatabase deliverable. And if we need to export to IFC, that can be done from Civil 3D directly once the as-built conditions have been captured.

And also, if you, unfortunately, are still developing sheets like many of us are and you needed to update them: if your views of the sewer were cut into sheets with the Civil 3D sheet tools, those would update as well once the model has been updated.

So yeah, moving on, finally, to the post-delivery side: just transmitting those IFCs or geodatabases to the clients. And to recap what we did: we went over how we're taking design content from information models; we are capturing that in a format that is ideal for geospatial applications; we are taking changes in the field via the ESRI mobile apps; we're programmatically taking those changes and using Civil 3D to modify the original model to reflect the as-built conditions; and finally, we're taking that updated as-built model and delivering it in the formats needed by our clients.

So just side by side right here, this is showing the original and updated 3D model in the ESRI environment on the left side-- showing that there are some capabilities on the ESRI platform to view 3D models as well. And on the right side, this is just the IFC export, shown in the Open IFC Viewer. Again, looking at those properties, all of those property sets come through. So in this case, these are asset attributes.

So if you have a client that has dictated a certain set of asset templates you need to follow, you can use property sets as the foundation for that in your workflow, and then use a workflow like this to exchange information between different phases of the project-- construction, as-built, et cetera.

And as far as the business case for this: again, there are a lot of clients that do have IFC deliverables, where they want particular schemas followed and delivered as IFC models. There are geospatial deliverables that this would align with.

So especially with some of the ways that the ASCE 38 spec would require certain properties to be instantiated in the field for utility mapping and things like that-- all of those are directly applicable using this workflow. And of course, it streamlines your whole as-built process. So no more drawing red lines on a plan.

And if this were a different type of infrastructure project using different types of models, there are ways this workflow can be repurposed for that as well. I'll touch on that a little bit at the end. And yeah, Mike, do you want to give a little blurb about mixed reality?

MICHAEL DESILETS: Yeah, thanks a lot, Justin. So what we wanted to do is really push the envelope with some of the asset management deliverables that we had as well. As Justin just alluded to, there are standard requirements for some of those. They don't necessarily require the same level of geometric precision, or even the same structure, but rather the qualification of the assets themselves.

But with the process we've been detailing, what we're also finding is that something like an IFC is native to a lot of applications that work very well with virtual reality and augmented reality technology. So we can also introduce the idea of asset management or public engagement opportunities at any stage of the project with the designs. In this particular case, the picture on the side is just showing that you can utilize some pretty potent BIM or IFC deliverables to allow for reality comparisons.

So really, we've got a quiz here for everybody. Just to throw it out there: based on the picture that you've got here, can you tell us which one is the real one and which one is the augmented reality building? We can definitely go through which one is which, but we're just trying to highlight that the realism behind these models is now getting so great that this should help with a wide variety of model verification, as well as any number of purposes from marketing to public engagement. Yeah, that was exciting for us as well. Thanks, Justin.

JUSTIN RACELIS: Yeah, thanks, Mike. And lastly, again, I did mention that I wanted to show how you could apply this same type of workflow to a different type of infrastructure or a different scenario. So as a quick example, on the right side, this is a very simple parametric subassembly that is designed to identify the different retaining wall cases in this theoretical location where a retaining wall is needed.

It will design the foundation for you, essentially. So what would we want to do with this? Well, a lot of times we have transit and transportation owners that are developing GIS and asset requirements for even linear features like these, which are a little bit difficult to capture in just simple point and line geometry.

But if the attribution is really where we want to get smart about how we convey this, what we can do is take this, use the same workflow, and embed the attribution that represents the geometry and the type of foundation in a similar fashion, through a polygon entity within an ESRI web map.

So for that corridor, for instance, imagine running through the same workflow here: building out a process for grabbing the alignment of that wall, along with the attribution that the subassembly was tailored to output for readability within Dynamo. We can take that same linear feature and then, using a different function, take those points and collate them so that each segment captures the start and end point of the retaining wall design.
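That segment collation step might be sketched as follows-- pairing consecutive corridor points into wall segments and attaching the subassembly-reported attributes to each. The point coordinates and the `FoundationType` attribute are illustrative stand-ins for whatever the subassembly actually outputs.

```python
def collate_segments(points, props_per_segment):
    """Pair consecutive corridor points into wall segments, attaching
    the per-segment attributes, and wrap the result as GeoJSON.
    Expects len(points) == len(props_per_segment) + 1."""
    features = []
    for i, props in enumerate(props_per_segment):
        features.append({
            "type": "Feature",
            "geometry": {"type": "LineString",
                         "coordinates": [points[i], points[i + 1]]},
            "properties": {"Segment": i, **props},
        })
    return {"type": "FeatureCollection", "features": features}

# Three corridor points yield two wall segments (made-up data).
pts = [[0.0, 0.0], [10.0, 0.5], [20.0, 1.2]]
walls = collate_segments(pts, [{"FoundationType": "SpreadFooting"},
                               {"FoundationType": "Piles"}])
```

Each resulting segment carries its own attribution, so the wall reads in the web map as a chain of line features rather than one undifferentiated polyline.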

And then the same process: export to GeoJSON, and upload that to the web map. And here on the left side, it's just a pretty similar scenario. It's just that now you're out in the field, and you have the ability to take a look at a retaining wall that's represented in plan as just a simple line.

But if there's information that you want to capture about it during construction, you could do a similar workflow here. So for any CAD- or BIM-based system that supports scripting, especially with Python or any other language that makes it easy to communicate information via external text files, these are some of the things that you could do with that. And yeah, so that's all the content that we had. Thanks for listening. Thanks, everyone.

MICHAEL DESILETS: Thank you.

Snowplow
我们通过 Snowplow 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Snowplow 隐私政策
UserVoice
我们通过 UserVoice 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. UserVoice 隐私政策
Clearbit
Clearbit 允许实时数据扩充,为客户提供个性化且相关的体验。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。Clearbit 隐私政策
YouTube
YouTube 是一个视频共享平台,允许用户在我们的网站上查看和共享嵌入视频。YouTube 提供关于视频性能的观看指标。 YouTube 隐私政策

icon-svg-hide-thick

icon-svg-show-thick

定制您的广告 – 允许我们为您提供针对性的广告

Adobe Analytics
我们通过 Adobe Analytics 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Adobe Analytics 隐私政策
Google Analytics (Web Analytics)
我们通过 Google Analytics (Web Analytics) 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Google Analytics (Web Analytics) 隐私政策
AdWords
我们通过 AdWords 在 AdWords 提供支持的站点上投放数字广告。根据 AdWords 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AdWords 收集的与您相关的数据相整合。我们利用发送给 AdWords 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AdWords 隐私政策
Marketo
我们通过 Marketo 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。我们可能会将此数据与从其他信息源收集的数据相整合,以根据高级分析处理方法向您提供改进的销售体验或客户服务体验以及更相关的内容。. Marketo 隐私政策
Doubleclick
我们通过 Doubleclick 在 Doubleclick 提供支持的站点上投放数字广告。根据 Doubleclick 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Doubleclick 收集的与您相关的数据相整合。我们利用发送给 Doubleclick 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Doubleclick 隐私政策
HubSpot
我们通过 HubSpot 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。. HubSpot 隐私政策
Twitter
我们通过 Twitter 在 Twitter 提供支持的站点上投放数字广告。根据 Twitter 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Twitter 收集的与您相关的数据相整合。我们利用发送给 Twitter 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Twitter 隐私政策
Facebook
我们通过 Facebook 在 Facebook 提供支持的站点上投放数字广告。根据 Facebook 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Facebook 收集的与您相关的数据相整合。我们利用发送给 Facebook 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Facebook 隐私政策
LinkedIn
我们通过 LinkedIn 在 LinkedIn 提供支持的站点上投放数字广告。根据 LinkedIn 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 LinkedIn 收集的与您相关的数据相整合。我们利用发送给 LinkedIn 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. LinkedIn 隐私政策
Yahoo! Japan
我们通过 Yahoo! Japan 在 Yahoo! Japan 提供支持的站点上投放数字广告。根据 Yahoo! Japan 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Yahoo! Japan 收集的与您相关的数据相整合。我们利用发送给 Yahoo! Japan 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Yahoo! Japan 隐私政策
Naver
我们通过 Naver 在 Naver 提供支持的站点上投放数字广告。根据 Naver 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Naver 收集的与您相关的数据相整合。我们利用发送给 Naver 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Naver 隐私政策
Quantcast
我们通过 Quantcast 在 Quantcast 提供支持的站点上投放数字广告。根据 Quantcast 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Quantcast 收集的与您相关的数据相整合。我们利用发送给 Quantcast 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Quantcast 隐私政策
Call Tracking
我们通过 Call Tracking 为推广活动提供专属的电话号码。从而,使您可以更快地联系我们的支持人员并帮助我们更精确地评估我们的表现。我们可能会通过提供的电话号码收集与您在站点中的活动相关的数据。. Call Tracking 隐私政策
Wunderkind
我们通过 Wunderkind 在 Wunderkind 提供支持的站点上投放数字广告。根据 Wunderkind 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Wunderkind 收集的与您相关的数据相整合。我们利用发送给 Wunderkind 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Wunderkind 隐私政策
ADC Media
我们通过 ADC Media 在 ADC Media 提供支持的站点上投放数字广告。根据 ADC Media 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 ADC Media 收集的与您相关的数据相整合。我们利用发送给 ADC Media 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. ADC Media 隐私政策
AgrantSEM
我们通过 AgrantSEM 在 AgrantSEM 提供支持的站点上投放数字广告。根据 AgrantSEM 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AgrantSEM 收集的与您相关的数据相整合。我们利用发送给 AgrantSEM 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AgrantSEM 隐私政策
Bidtellect
我们通过 Bidtellect 在 Bidtellect 提供支持的站点上投放数字广告。根据 Bidtellect 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bidtellect 收集的与您相关的数据相整合。我们利用发送给 Bidtellect 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bidtellect 隐私政策
Bing
我们通过 Bing 在 Bing 提供支持的站点上投放数字广告。根据 Bing 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bing 收集的与您相关的数据相整合。我们利用发送给 Bing 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bing 隐私政策
G2Crowd
我们通过 G2Crowd 在 G2Crowd 提供支持的站点上投放数字广告。根据 G2Crowd 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 G2Crowd 收集的与您相关的数据相整合。我们利用发送给 G2Crowd 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. G2Crowd 隐私政策
NMPI Display
我们通过 NMPI Display 在 NMPI Display 提供支持的站点上投放数字广告。根据 NMPI Display 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 NMPI Display 收集的与您相关的数据相整合。我们利用发送给 NMPI Display 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. NMPI Display 隐私政策
VK
我们通过 VK 在 VK 提供支持的站点上投放数字广告。根据 VK 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 VK 收集的与您相关的数据相整合。我们利用发送给 VK 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. VK 隐私政策
Adobe Target
我们通过 Adobe Target 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Adobe Target 隐私政策
Google Analytics (Advertising)
我们通过 Google Analytics (Advertising) 在 Google Analytics (Advertising) 提供支持的站点上投放数字广告。根据 Google Analytics (Advertising) 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Google Analytics (Advertising) 收集的与您相关的数据相整合。我们利用发送给 Google Analytics (Advertising) 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Google Analytics (Advertising) 隐私政策
Trendkite
我们通过 Trendkite 在 Trendkite 提供支持的站点上投放数字广告。根据 Trendkite 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Trendkite 收集的与您相关的数据相整合。我们利用发送给 Trendkite 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Trendkite 隐私政策
Hotjar
我们通过 Hotjar 在 Hotjar 提供支持的站点上投放数字广告。根据 Hotjar 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Hotjar 收集的与您相关的数据相整合。我们利用发送给 Hotjar 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Hotjar 隐私政策
6 Sense
我们通过 6 Sense 在 6 Sense 提供支持的站点上投放数字广告。根据 6 Sense 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 6 Sense 收集的与您相关的数据相整合。我们利用发送给 6 Sense 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. 6 Sense 隐私政策
Terminus
我们通过 Terminus 在 Terminus 提供支持的站点上投放数字广告。根据 Terminus 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Terminus 收集的与您相关的数据相整合。我们利用发送给 Terminus 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Terminus 隐私政策
StackAdapt
我们通过 StackAdapt 在 StackAdapt 提供支持的站点上投放数字广告。根据 StackAdapt 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 StackAdapt 收集的与您相关的数据相整合。我们利用发送给 StackAdapt 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. StackAdapt 隐私政策
The Trade Desk
我们通过 The Trade Desk 在 The Trade Desk 提供支持的站点上投放数字广告。根据 The Trade Desk 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 The Trade Desk 收集的与您相关的数据相整合。我们利用发送给 The Trade Desk 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. The Trade Desk 隐私政策
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

是否确定要简化联机体验?

我们希望您能够从我们这里获得良好体验。对于上一屏幕中的类别,如果选择“是”,我们将收集并使用您的数据以自定义您的体验并为您构建更好的应用程序。您可以访问我们的“隐私声明”,根据需要更改您的设置。

个性化您的体验,选择由您来做。

我们重视隐私权。我们收集的数据可以帮助我们了解您对我们产品的使用情况、您可能感兴趣的信息以及我们可以在哪些方面做出改善以使您与 Autodesk 的沟通更为顺畅。

我们是否可以收集并使用您的数据,从而为您打造个性化的体验?

通过管理您在此站点的隐私设置来了解个性化体验的好处,或访问我们的隐私声明详细了解您的可用选项。