AU Class

Bringing Large-Scale PMC to the Next Level by Maximizing the Power of Autodesk Construction Cloud


Description

In this class, we'll discuss how Autodesk Construction Cloud can help a project management consultancy be more efficient by tracking project key performance indicators (KPIs) related to schedule and cost control through Autodesk Docs, Autodesk Takeoff, Autodesk Build, and Autodesk Construction Cloud Connect. We'll take a closer look at how users can save time and easily track deliverables in Autodesk Docs and Autodesk Build automatically. By combining Autodesk Construction Cloud (a cloud platform) with Autodesk Construction Cloud Connect (a ready-to-use integration platform), users can create their own custom applications using "recipes" and follow the delivered models and drawings. We'll also learn how to track the evolution of design quantity takeoffs based on Autodesk Takeoff to make strategic and efficient business decisions relying on real-time project data. By harnessing the power of Autodesk Construction Cloud connected to Autodesk Construction Cloud Connect, we'll learn how to store the extracted data in a SQL database and automatically connect it to a Power BI dashboard, streamlining reporting.

Key Learnings

  • Learn how a PMC saved time by tracking daily Autodesk Construction Cloud deliverables on a large-scale project.
  • Learn how to automate data extraction from Autodesk Docs, Autodesk Build, and Autodesk Takeoff to ensure data-driven insights.
  • Learn how to connect data from Autodesk Docs, Autodesk Build, and Autodesk Takeoff to a SQL database and Power BI to track and measure project KPIs.
  • Learn how to create customized "recipes" in Autodesk Construction Cloud Connect.

Speakers

  • Hafsa SADAKA
    Hafsa Sadaka is an experienced BIM manager with a civil engineering background. Hafsa manages BIM projects within the nuclear department of Egis Group, a worldwide multi-disciplinary engineering firm. She started working as a BIM manager on the Hinkley Point C project (a nuclear power station) with the design team at first; she then joined the site support team in the UK as a BIM manager lead. She is currently working on a major project in Saudi Arabia and is in charge of the BIM project owner advisory. Her experience in large-scale and complex projects has allowed her to strengthen her knowledge of BIM workflows and the digital delivery approach, to better serve projects' BIM requirements and go beyond following digital evolution trends.
      Transcript

      ARNOLD LEDAN: Hello and welcome, everybody, to this Autodesk University class about bringing large-scale PMC to the next level by maximizing the power of Autodesk Construction Cloud. This class will be held today by Hafsa Sadaka and me.

      SADAKA HAFSA: Let me introduce myself first. So I'm Sadaka Hafsa, BIM manager and civil engineer. I have been working at Egis Group in the nuclear department since 2019. I started working as a BIM manager on the HPC project, the Hinkley Point C project, with the design team at first. Then I joined the site support team, working from the UK as a BIM manager lead. And currently, I'm on a major project in the Middle East, still as a BIM manager.

      ARNOLD LEDAN: And me, I am Arnold Ledan. I am a BIM manager and civil engineer. Today, I'm working on large infrastructure projects in France and in the Middle East.

      So we will start by presenting Egis Group, of course. Our class will be divided into two main parts. In the first part, we will look at how to harness ACC as a PMC for rich data-driven insights. During this part, we will define what ACC is and each of its different modules, which will allow a better understanding of the second part of this class. And the second part will be about how to connect ACC to ACC Connect. So we will have a chance to talk about our general workflow and how we were able to extract the data from the different ACC modules.

      So let's start first by talking about Egis Group. Egis, who are we? Egis is an international engineering company, which is active in the consulting, construction, and mobility sectors. Here, we have some numbers which demonstrate our strength as a group, which is based in France. Last year, we had a turnover of more than 1 billion euros. And although we are based in France, more than 65% of our activity is international. We are positioned as the number one engineering firm in France and the eighth in Europe, and we have more than 18,000 employees worldwide.

      Our activity covers many different subjects, such as sustainable cities, transport, water, and energy, and goes through different fields of activity, such as environment, mobility, complex structures, and digital engineering. This slide illustrates our global reach. As a people-first company, we understand that the solutions to the global challenges that we face today, such as the climate emergency, must also be global. With our offices in many countries, we can work locally on these challenges.

      So now, let's go a little deeper into the subject and see how we can use ACC as a PMC to obtain data-rich insights. And I will leave the floor to Hafsa to present the rest of the class.

      SADAKA HAFSA: Hello, everyone. So first of all, let's start by setting up the context for our class. Why did we choose as a PMC to implement the solution that we are going to talk about today?

      To support the common data environment and digital construction on a major project, we were looking for a complete construction management platform with all data in one central location to streamline collaboration between the design and the construction teams and provide data-driven guidance to improve the quality of our delivered products. So that's why we decided to work with ACC and use its range of products-- Docs, Build, and Takeoff-- that we're going to see later. Also, given the huge amount of data that was shared throughout our project on ACC and that needed to be analyzed and managed each time, we tried to automate the operations using the ACC products. But also, we tried to integrate ACC Connect into our workflow to save time.

      So before getting to the core of our class, we will need to start by establishing some definitions. And we will start by defining the Autodesk Construction Cloud, ACC. So for anyone who hasn't had the chance to work with ACC before, here is a definition of what it is.

      The Autodesk Construction Cloud, known as ACC, is a cloud platform that unifies solutions throughout the project lifecycle, providing a synergy between parties to ensure that projects are delivered on time and on budget. In short, it's a fully unified platform that allows teams to connect data. And what I can add also, for those who have already worked with BIM 360: ACC is based on the BIM 360 CDE, Common Data Environment. And it will enable the whole team to collaborate around an integrated data set, as with BIM 360. And of course, it's still in line with the ISO 19650 requirements.

      As I mentioned at the beginning when talking about the context of our class, we use different ACC products. The first one is ACC Docs, which allows users to view, organize, distribute, and share files throughout the project lifecycle, of course, but within a single document management platform. The ACC Docs interface looks like this, as you can see in the picture on the right.

      The second product that we have used is ACC Build, which enables us to combine solid project management capabilities with powerful and simple field collaboration tools. The main functions of the software are sharing the construction drawings needed for the construction and thus tracking the work progress onsite. It also allows users to organize comments and decisions and to create checklists and forms that can also be used onsite.

      The third and last software that we have used is ACC Takeoff. This software enables us to create competitive bids more quickly by performing more accurate 2D takeoffs and also by generating automatic quantities from 3D models. The main function of the software is to generate quantities based on 2D drawings and/or 3D models. It may also help us to create, for some complex cases, custom formulas to generate the takeoff. And we can also use several predefined or customized classification systems across projects in this software.

      So for your information, there are other ACC products. But for this class, we'll just use these three products, which fall within our scope of work. So the first one, ACC Docs, was used to track all models and documents delivered by the designer. The second one, ACC Build, was used to track construction drawings delivered by the designer. And the third one, which is ACC Takeoff, was used to track quantities in delivered models.

      Again, in the same context as I mentioned at the beginning of this first part, we have integrated ACC Connect, in addition to the ACC products, into our workflow to save time but also to automate some data flows from ACC. So first, we will also need to define what ACC Connect is. Autodesk Construction Cloud Connect, also called ACC Connect, is an evolution of PlanGrid Connect. And it allows users to create flexible integrations without coding.

      ACC Connect is powered by Workato. And it supports integration of some Autodesk platforms, such as ACC, Assemble, and BIM 360. It allows users to connect their platforms with other software and also to automate ongoing or planned data workflows in order to provide solutions for some specific activities, such as sharing and tracking the progress of deliverables on the Autodesk Construction Cloud or BIM 360, for example.

      So now, let's move to some practical aspects. For us to use ACC Connect, it needs to be linked correctly to ACC first. And the first step is to activate the ACC Connect application directly on ACC using an administrator account. And that's exactly what we're going to see in this video.

      So to activate ACC Connect directly in ACC, we will need admin permissions first. We'll need to connect through an admin account, go into the Applications tab, look for the ACC Connect application, and activate it directly in the application. It's so easy.

      Now that we know how to activate the ACC Connect application, we will start learning a little more about using this tool. To create instructions that automate workflows using ACC Connect, the user will need to manipulate two main things-- connections and recipes. And we need to define what connections are in ACC Connect and what recipes are in ACC Connect. Connections secure access to applications, such as ACC, SharePoint, SQL servers, et cetera. Each connector includes authentication methods and will provide access to these applications and other services.

      Recipes are user-developed automated workflows that can cover multiple applications to accomplish a specific purpose, such as extracting data from ACC and storing it, for example, on a database server. So now, we'll look at two demonstrations: how to create a connection, and how to introduce an easy recipe in ACC Connect.

      To create a connection or even a recipe using ACC Connect, we'll need to start by opening the Workato platform and creating a working folder. Then, to introduce a connection, we'll need to click on Create and choose the Connection option. Then we'll search for the connector we want to use. In our case, we'll select the ACC connector.

      Then we'll click on the Connect button to establish the connection. And finally, you can check the connection status based on the message shown on the screen. Now we have just seen how to create a connection. Let's see how to create a recipe.

      So to create a recipe, we'll need first to navigate to the Assets and select our working folder. Then we will need to click on Create and select Recipe. After that, we have to enter the name of the recipe and then click on Start Building.

      The next step is to set up the trigger. In our case, we'll choose the ACC trigger New or Updated Objects in ACC. After that, we will have to fill in the necessary information to connect to the folder the user wants to track, like the account name, which is the hub, and the project name. We'll need to select which folder we want to track on ACC. And we'll also need to choose whether we want to track sub-folders or not. And finally, we'll need to establish a frequency for the automated recipe launch.
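As an illustration, the trigger settings just described could be captured as a small configuration object. This is only a sketch: the key names are illustrative, not the exact Workato field identifiers, and the hub/project/folder values are made-up examples.

```python
# Hypothetical representation of the "New or Updated Objects in ACC" trigger
# settings described above; key names and values are illustrative only.
trigger_config = {
    "account_name": "example-hub",       # the ACC hub (assumed example name)
    "project_name": "Sample Project",    # assumed example project
    "folder": "Project Files/Models",    # the ACC folder to track
    "include_subfolders": True,          # whether sub-folders are tracked too
    "frequency": "weekly",               # how often the recipe launches
}

def validate_trigger_config(cfg):
    """Return the list of required trigger settings that are missing or empty."""
    required = ("account_name", "project_name", "folder",
                "include_subfolders", "frequency")
    return [k for k in required if k not in cfg or cfg[k] in ("", None)]

assert validate_trigger_config(trigger_config) == []
```

A validation step like this mirrors what the Workato form enforces interactively: the recipe cannot start until every field is filled in.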

      So let's say that we are now quite familiar with how to create a connection and also with the principle of creating a recipe. In this class, we will look in detail at the creation and the use of four recipes. The first one is about uploading all ACC Docs data into a SQL database.

      The second one is about tracking all documents added or updated in ACC Docs and loading them into a SQL database. The third one is about uploading all ACC Build data into a SQL database. And the last one is about uploading the Takeoff inventory data into the SQL database.

      So we have just finished the first part of our class. And now, we'll move to the second part on how to achieve the four recipes that I have just mentioned. To address our needs, we have pre-established a workflow. The aim of this workflow was to exploit the data published by the designer in the various ACC applications-- Docs, Build, and Takeoff. Then we're going to use ACC Connect to extract the data automatically on a weekly basis, through the recipes and connections created on it, of course. This data will then be stored in a SQL server and displayed on a Power BI dashboard, which is also communicated to our client to monitor the design's progress.

      So let's start with the first recipe about uploading ACC Docs data into the SQL database. Briefly, to summarize the workflow, we will start by extracting ACC Docs data using ACC Connect, saving them in a SQL server database so that they can be easily exploited and analyzed directly on Power BI. You will see that, in general, all the other recipes will follow the same workflow.

      However, before starting the data extraction from ACC Docs, there are a few very important points to bear in mind. The first point is regarding the folder organization in ACC Docs. It's very important to properly organize the folder hierarchy and to ensure that there are as few sub-folders as possible in ACC Docs. Why? Because the number of sub-folders has an impact on the data extraction time and also on the performance of the ACC Connect recipe.

      The second point is about the file naming. It's also very important to respect a naming convention, which will later make the data processing in Power BI much easier. And the last point is that we need to define a frequency for updating files on ACC Docs, which will make it easier to set up the trigger on the recipe. It will also enable us to achieve reliable automatic extraction.
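To see why a naming convention pays off downstream, here is a sketch of splitting a deliverable's file name into the metadata fields a Power BI report could filter on. The hyphen-delimited, ISO 19650-style pattern used here is an assumption for illustration; the actual project convention is not given in the class.

```python
def parse_file_name(name):
    """Split a deliverable file name into metadata fields.

    Assumes a hypothetical ISO 19650-style convention of the form
    PROJECT-ORIGINATOR-ZONE-DISCIPLINE-NUMBER.EXT; the real project
    convention may differ.
    """
    stem, _, ext = name.rpartition(".")
    parts = stem.split("-")
    if len(parts) != 5:
        return None  # the name does not follow the convention
    project, originator, zone, discipline, number = parts
    return {
        "project": project,
        "originator": originator,
        "zone": zone,
        "discipline": discipline,
        "number": number,
        "extension": ext.lower(),
    }

meta = parse_file_name("HPC-EGIS-Z01-CIV-0042.ifc")
```

With every file name parseable this way, the per-discipline and per-file-type filters shown later in the dashboard become simple column operations rather than manual tagging.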

      This first recipe will consist of two main parts. The first part will enable us to extract all the data linked to the folders on ACC Docs. The main trigger that allows us to extract this data is the following: GET FOLDER CONTENT. This trigger was created based on the ACC APIs that are available in Autodesk Platform Services and, more precisely, by using the following GET request. We will take a closer look at Autodesk Platform Services, especially this GET request.

      So this is the GET request that is used. And here, in the APS platform, you will find all the details needed to create any ACC trigger. And you will see that APS was used for all the other recipes.
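For readers who want to relate the GET FOLDER CONTENT trigger to the underlying call, here is a sketch of how the APS Data Management request could be assembled. The endpoint shape follows the public Data Management API documentation; the project ID, folder URN, and token are placeholders, and no request is actually sent.

```python
# Sketch of the APS Data Management request behind a folder-contents trigger.
# The IDs and token below are placeholder values, not real credentials.
APS_BASE = "https://developer.api.autodesk.com"

def folder_contents_request(project_id, folder_id, token):
    """Build the (url, headers) pair for listing an ACC folder's contents."""
    url = f"{APS_BASE}/data/v1/projects/{project_id}/folders/{folder_id}/contents"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

url, headers = folder_contents_request(
    "b.example-project-id",
    "urn:adsk.wipprod:fs.folder:co.example",
    "PLACEHOLDER_TOKEN",
)
```

In the recipe itself, Workato handles the authentication and pagination; the sketch only shows the request the trigger is built on.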

      The second part of the recipe will check whether the folders contain files and then extract their respective data. So it will just extract the file data stored in each folder. The main trigger that allows us to extract this data is the following: GET FILES. It was also created based on the ACC API and, more precisely, using this GET request, which you can also find in APS with all its details.

      So the following video will summarize how the recipe works. For our demo, let's first check how many files we have on ACC. In our case, we have 44 files. And normally, we should find the same number after the extraction.

      Now, let's launch the ACC recipe to upload all data to the SQL database. To launch a recipe, we will need to click on Start Recipe. And when the recipe is finished, we will just stop it by clicking on Stop Recipe.

      And now, let's switch to the SQL database to check that all the information has been correctly uploaded. We can now see that all 44 files have been successfully uploaded. And we can later retrieve this information on the Power BI dashboard as well.
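The upload-and-verify step can be sketched with an in-memory SQLite database standing in for the project's SQL Server; the table and column names here are illustrative, not the actual project schema.

```python
import sqlite3

# SQLite stands in for the SQL Server database used on the project;
# the table and column names below are assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE acc_docs (
        file_name TEXT,
        version   INTEGER,
        folder    TEXT,
        modified  TEXT
    )
""")

# Sample rows, shaped like the file data a recipe might extract from ACC Docs.
extracted = [
    ("HPC-EGIS-Z01-CIV-0042.ifc", 3, "Models", "2023-06-01"),
    ("HPC-EGIS-Z01-STR-0007.nwd", 1, "Models", "2023-06-02"),
]
conn.executemany("INSERT INTO acc_docs VALUES (?, ?, ?, ?)", extracted)

# The same check performed in the demo: row count must match the file count.
count = conn.execute("SELECT COUNT(*) FROM acc_docs").fetchone()[0]
```

The count check is exactly the verification done in the video: the number of rows in the table should equal the number of files seen on ACC.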

      So here we will find the same number of files with all the data that we have retrieved. We won't go into too much detail about Power BI for lack of time, but just to give you an example of the dashboard we created. This dashboard will enable us to have some indicators and to track deliverables, so we can see the number of files that were delivered per date. You can see the timeline here.

      We can have specific information, like the number of IFC files that were delivered or the number of NWD files delivered. And we can also filter per discipline, so each team can have the information it needs separately.

      So once all the docs have been uploaded, it's time to track updates and new files in ACC Docs. The second recipe then will consist of tracking all the new and updated items. As with recipe number one, the data can be stored in a SQL Server database. And it will also be displayed in a Power BI dashboard.

      So this recipe is quite simple and consists of a first trigger available by default in Workato called New or Updated Document. This trigger is used to track any document added or updated in the selected ACC project. And then once the document has been detected, the second trigger will extract all the data linked to the added or updated document. And then the third trigger is used to store the data in the SQL database in the corresponding table. And the last one will just stop the recipe after that.
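The core of this recipe — acting only on documents that are new or carry a higher version than what is already stored — can be sketched as a simple comparison. In the real recipe this detection is done by the built-in Workato trigger; the function below just illustrates the logic on sample data.

```python
def new_or_updated(incoming, stored):
    """Return documents that are new or have a higher version than stored.

    `incoming` and `stored` map a document name to its version number.
    This mirrors (in pure Python) what the built-in "New or Updated
    Document" trigger detects in the selected ACC project.
    """
    changed = {}
    for name, version in incoming.items():
        if name not in stored or version > stored[name]:
            changed[name] = version
    return changed

stored = {"A-001.pdf": 2, "A-002.pdf": 1}
incoming = {"A-001.pdf": 3, "A-002.pdf": 1, "A-003.pdf": 1}
changes = new_or_updated(incoming, stored)
```

Only the changed documents then flow through the extract-and-insert steps, which is what keeps this recipe cheap to run continuously.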

      So the following video will also summarize how the recipe works. So let's start first by importing a new file on ACC. Then we'll open Workato and launch the recipe. And you will see that the recipe will automatically detect the new file and extract its data.

      So you can see here that the file has been detected automatically. And once the recipe has been completed, we can check in the database whether the information has been uploaded successfully or not. And you can see also here that we can find this same file here with all the corresponding data already extracted. So we have just finished now with the ACC Docs. Let's move now to ACC Build.

      The aim of this third recipe is to extract all the data from ACC Build in order to track the drawing delivery. As with the previous recipes, the data will be stored in a SQL database and then displayed in Power BI. And to extract the information from ACC Build, we'll use a recipe based mainly on a customized trigger that we have created based on the ACC API, this one.

      And this trigger is based on this GET request that we can check also on the APS platform right now. So this is the GET request that we have used. And here you can usually find a small description about the GET request used.

      So the recipe also contains another trigger which you may have seen in the previous recipes, but we will take a closer look at it here. It's a SQL server trigger, this one, that inserts a row into a table in the SQL database. It's a trigger that already exists by default in Workato.

      And as you can see here on the top right, this trigger will connect to the database using the connection previously created to access our SQL server. In addition to creating an ACC connector before starting this recipe, we have also created a SQL server connector. And once connected to our database, it will allow us to choose the table and the correspondence between the table columns and the ACC data to be inserted.
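The column correspondence set up in the SQL server trigger amounts to a field-to-column mapping. The sketch below shows such a mapping applied to one extracted record; both the ACC field names and the SQL column names are assumptions for illustration.

```python
# Hypothetical mapping between ACC data fields and SQL table columns,
# mirroring the correspondence configured in the SQL server trigger.
COLUMN_MAP = {
    "displayName":    "drawing_name",
    "versionNumber":  "version",
    "createUserName": "created_by",
    "createTime":     "created_at",
}

def to_row(acc_record):
    """Rename mapped ACC fields to their SQL column names, dropping extras."""
    return {col: acc_record[field]
            for field, col in COLUMN_MAP.items()
            if field in acc_record}

row = to_row({
    "displayName": "CIV-PLAN-0001",
    "versionNumber": 2,
    "createUserName": "h.sadaka",
    "createTime": "2023-05-12",
    "unmappedField": "ignored",   # fields without a column are dropped
})
```

Keeping the mapping in one place means a schema change touches a single definition rather than every insert step.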

      As always, and as with the other recipes, the final results can be visualized via a Power BI dashboard, which gives some indicators on construction drawings that we can see directly from here. So we can have information about the number of construction drawings that were delivered. We can also filter them by discipline, by date of creation, or by building-- it depends on the project. And here, in the list, we can have more information, such as the version of the drawing, et cetera.

      Moving on now to the last recipe, for extracting data from ACC Takeoff. The aim of this recipe is to extract all the data from the Takeoff inventory. And to achieve this, there are also a few points to bear in mind before starting the recipe, which you can find in detail in the handout. So it's important, before starting this recipe, to create packages on ACC Takeoff with a maximum of 10,000 objects per package.

      We also have to follow a classification and to integrate a classification system into the models before starting the takeoff. And the last point, which is also very important, is that you need to avoid having duplicate families with different names. And after that, so we can start our recipe: in order to extract all the information properly, we will need to follow a specific sequence.

      We need to start by extracting all package data. Then we'll extract the type data for each package and, finally, the item data for each type. So we are going to build a sort of linked recipes: this recipe four will involve two other integrated recipes-- the first one to extract package information and the second one to extract type information. And the principal one, which is recipe four, will extract item information based on those two recipes.
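The package → type → item sequence just described can be sketched as nested iteration over a small sample inventory. The data shape below is an assumption for illustration; in the real workflow each loop level corresponds to one of the three linked recipes calling the ACC Takeoff API.

```python
# Sample inventory shaped as package -> type -> items; the real data comes
# from the ACC Takeoff API through the three linked recipes described above.
sample_inventory = {
    "Package A": {
        "Wall type 1": [{"id": 1, "volume": 4.2}, {"id": 2, "volume": 3.8}],
        "Slab type 1": [{"id": 3, "volume": 10.5}],
    },
    "Package B": {
        "Column type 1": [{"id": 4, "volume": 1.1}],
    },
}

def extract_items(inventory):
    """Flatten the inventory into (package, type, item_id, volume) rows."""
    rows = []
    for package, types in inventory.items():      # recipe 1: packages
        for type_name, items in types.items():    # recipe 2: types per package
            for item in items:                    # main recipe: items per type
                rows.append((package, type_name, item["id"], item["volume"]))
    return rows

rows = extract_items(sample_inventory)
```

The flattened rows are exactly what gets inserted into the SQL table, which is why the extraction order matters: items cannot be fetched before their packages and types are known.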

      So as I have just said, to extract information from all the objects in a model, we'll need to start by extracting the package information. Packages are folders created by the user, as you can see in the top right-hand corner, which allow objects to be classified by any desired category. To do so, we are going to create a recipe that will be called by the main recipe later-- we'll see this later. And it will use a main trigger to extract the package information. This trigger was also created based on the ACC API and, more precisely, using the following GET request-- this one, which you will also find in the APS platform.

      And once we have extracted all the information related to the packages, we'll need to extract the type information created in each package. To do this, we are going to create a second recipe, which will also be called by the main recipe later, and which will allow us to extract all the type data created in each package. This recipe has a main trigger, which is used to extract data from types. It's a custom ACC trigger, also based on the ACC API, more precisely using this GET request.

      So moving on now to the most important part, which is the creation of the main recipe for extracting the data linked to all the elements in the model. This recipe will be made up of two parts. The first will enable us to call up the two recipes we have just created-- the first one to extract information from packages and the second one to extract type information.

      And once the information about the packages and the types containing the model elements is available, we can start extracting the data for the model objects. The objects in the model are listed in the inventory of the ACC Takeoff module by types and packages, as you can see in the image at the top right. And to do the extraction, we're going to use a custom ACC trigger that is also based on the ACC API and, more precisely, this GET request.

      So now, as with all the other recipes, the information can be used with Power BI more easily. And we can see different indicators, like those I will share with you right now: the number of objects in each deliverable. We can filter, for example, per package, and we can also filter per object type. And here, we can have a total volume per type and also per item, and we can have it per package as well. So it delivers a lot of important information about quantities, directly accessible from Power BI.

      So we are coming to the end of this class. That's all for me. I hope I haven't been too boring. And I'm going to give the floor to Arnold so that he can end this class with a conclusion.

      ARNOLD LEDAN: Yes, thank you. Thank you, Hafsa. So for the conclusion: we understand that ACC Connect allows you to automate many processes, and it also allows us to update the dashboard in minimal time to respond to customer requests. Thank you very much for your attention. Thank you.

      Yahoo! Japan
      我们通过 Yahoo! Japan 在 Yahoo! Japan 提供支持的站点上投放数字广告。根据 Yahoo! Japan 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Yahoo! Japan 收集的与您相关的数据相整合。我们利用发送给 Yahoo! Japan 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Yahoo! Japan 隐私政策
      Naver
      我们通过 Naver 在 Naver 提供支持的站点上投放数字广告。根据 Naver 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Naver 收集的与您相关的数据相整合。我们利用发送给 Naver 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Naver 隐私政策
      Quantcast
      我们通过 Quantcast 在 Quantcast 提供支持的站点上投放数字广告。根据 Quantcast 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Quantcast 收集的与您相关的数据相整合。我们利用发送给 Quantcast 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Quantcast 隐私政策
      Call Tracking
      我们通过 Call Tracking 为推广活动提供专属的电话号码。从而,使您可以更快地联系我们的支持人员并帮助我们更精确地评估我们的表现。我们可能会通过提供的电话号码收集与您在站点中的活动相关的数据。. Call Tracking 隐私政策
      Wunderkind
      我们通过 Wunderkind 在 Wunderkind 提供支持的站点上投放数字广告。根据 Wunderkind 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Wunderkind 收集的与您相关的数据相整合。我们利用发送给 Wunderkind 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Wunderkind 隐私政策
      ADC Media
      我们通过 ADC Media 在 ADC Media 提供支持的站点上投放数字广告。根据 ADC Media 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 ADC Media 收集的与您相关的数据相整合。我们利用发送给 ADC Media 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. ADC Media 隐私政策
      AgrantSEM
      我们通过 AgrantSEM 在 AgrantSEM 提供支持的站点上投放数字广告。根据 AgrantSEM 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AgrantSEM 收集的与您相关的数据相整合。我们利用发送给 AgrantSEM 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AgrantSEM 隐私政策
      Bidtellect
      我们通过 Bidtellect 在 Bidtellect 提供支持的站点上投放数字广告。根据 Bidtellect 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bidtellect 收集的与您相关的数据相整合。我们利用发送给 Bidtellect 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bidtellect 隐私政策
      Bing
      我们通过 Bing 在 Bing 提供支持的站点上投放数字广告。根据 Bing 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bing 收集的与您相关的数据相整合。我们利用发送给 Bing 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bing 隐私政策
      G2Crowd
      我们通过 G2Crowd 在 G2Crowd 提供支持的站点上投放数字广告。根据 G2Crowd 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 G2Crowd 收集的与您相关的数据相整合。我们利用发送给 G2Crowd 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. G2Crowd 隐私政策
      NMPI Display
      我们通过 NMPI Display 在 NMPI Display 提供支持的站点上投放数字广告。根据 NMPI Display 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 NMPI Display 收集的与您相关的数据相整合。我们利用发送给 NMPI Display 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. NMPI Display 隐私政策
      VK
      我们通过 VK 在 VK 提供支持的站点上投放数字广告。根据 VK 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 VK 收集的与您相关的数据相整合。我们利用发送给 VK 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. VK 隐私政策
      Adobe Target
      我们通过 Adobe Target 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Adobe Target 隐私政策
      Google Analytics (Advertising)
      我们通过 Google Analytics (Advertising) 在 Google Analytics (Advertising) 提供支持的站点上投放数字广告。根据 Google Analytics (Advertising) 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Google Analytics (Advertising) 收集的与您相关的数据相整合。我们利用发送给 Google Analytics (Advertising) 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Google Analytics (Advertising) 隐私政策
      Trendkite
      我们通过 Trendkite 在 Trendkite 提供支持的站点上投放数字广告。根据 Trendkite 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Trendkite 收集的与您相关的数据相整合。我们利用发送给 Trendkite 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Trendkite 隐私政策
      Hotjar
      我们通过 Hotjar 在 Hotjar 提供支持的站点上投放数字广告。根据 Hotjar 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Hotjar 收集的与您相关的数据相整合。我们利用发送给 Hotjar 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Hotjar 隐私政策
      6 Sense
      我们通过 6 Sense 在 6 Sense 提供支持的站点上投放数字广告。根据 6 Sense 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 6 Sense 收集的与您相关的数据相整合。我们利用发送给 6 Sense 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. 6 Sense 隐私政策
      Terminus
      我们通过 Terminus 在 Terminus 提供支持的站点上投放数字广告。根据 Terminus 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Terminus 收集的与您相关的数据相整合。我们利用发送给 Terminus 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Terminus 隐私政策
      StackAdapt
      我们通过 StackAdapt 在 StackAdapt 提供支持的站点上投放数字广告。根据 StackAdapt 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 StackAdapt 收集的与您相关的数据相整合。我们利用发送给 StackAdapt 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. StackAdapt 隐私政策
      The Trade Desk
      我们通过 The Trade Desk 在 The Trade Desk 提供支持的站点上投放数字广告。根据 The Trade Desk 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 The Trade Desk 收集的与您相关的数据相整合。我们利用发送给 The Trade Desk 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. The Trade Desk 隐私政策
      RollWorks
      We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

      是否确定要简化联机体验?

      我们希望您能够从我们这里获得良好体验。对于上一屏幕中的类别,如果选择“是”,我们将收集并使用您的数据以自定义您的体验并为您构建更好的应用程序。您可以访问我们的“隐私声明”,根据需要更改您的设置。

      个性化您的体验,选择由您来做。

      我们重视隐私权。我们收集的数据可以帮助我们了解您对我们产品的使用情况、您可能感兴趣的信息以及我们可以在哪些方面做出改善以使您与 Autodesk 的沟通更为顺畅。

      我们是否可以收集并使用您的数据,从而为您打造个性化的体验?

      通过管理您在此站点的隐私设置来了解个性化体验的好处,或访问我们的隐私声明详细了解您的可用选项。