AU Class

Single Source of Truth and Data Consistency Using Autodesk Construction Cloud, Revit, and API


Description

What is a single source of truth, and why is it important in the architecture, engineering, and construction (AEC) industry? In this class, we'll learn best practices for maximizing Autodesk Construction Cloud, Autodesk BIM Collaborate Pro, Revit software, and various other tools to create and maintain a single source of truth. We'll also go beyond the usual single-source-of-truth implementation by exploring an innovative data-focused approach that ensures data consistency and compliance by combining the cloud, a central project requirements repository, model authoring tools, and API automation capabilities. Whether you're involved in project planning, execution, monitoring, or closing, you'll find multiple interesting ideas, workflows, and implementation examples that use the latest technology to enhance your capabilities and achieve success.

Key Learnings

  • Learn about the single source of truth, its associated challenges, and the benefits of successful implementation.
  • Learn how to implement best practices to create and maintain a single source of truth using Autodesk Docs, Autodesk BIM Collaborate Pro, and Revit.
  • Go beyond the usual single-source-of-truth implementation by exploring an innovative data-compliance-focused approach.
  • Learn how to harness Autodesk Construction Cloud, Revit, APIs, and other tools to ensure data consistency and prevent BIM execution plan non-compliances.

Speaker

  • Mateusz Lukasiewicz
    Mateusz Lukasiewicz has over 12 years of experience in the AEC industry. Throughout his career, he has successfully led the digital delivery of large-scale projects and developed a number of modern digital engineering solutions by combining BIM expertise, computer programming skills, and project management principles. Mateusz plays a vital role in driving the company's clear vision toward a leading digital-innovator position in the market and its long-term digital capability goals.
      Transcript

      MATEUSZ LUKASIEWICZ: Hi, everyone. Thank you for attending Autodesk University 2022, and welcome to my class about Single Source of Truth and Data Consistency using Autodesk Construction Cloud, Revit, and API.

      I will start by introducing myself. My name is Mateusz Lukasiewicz. I'm a Digital Projects Analyst at KEO International Consultants, based in Dubai, United Arab Emirates. I have over 12 years of experience delivering large-scale projects using BIM, computer programming, and project management principles.

      Session introduction. This is the official description and the learning objectives, which also form the agenda for today's session.

      We start with understanding the definition, importance, challenges, and benefits of a single source of truth. Let's start by asking a question: why do we talk about a single source of truth and data consistency? Based on the Data Advantage in Construction report released by Autodesk in 2021, bad data is estimated to have cost the construction industry over $1.84 trillion in 2020. The number is an estimate only; however, it shows that the construction industry is facing a significant problem with data quality.

      Now that we understand the problem and the reason why data quality and a single source of truth are important, let's talk about definitions. I have identified two main descriptions that can help us understand the concept. The first, in plain language: the same approved data used by all stakeholders to make informed decisions. And this is why we implement the ISO 19650 standard's working practices, with shared and published areas in a common data environment, to control data exchange.

      The second, technical definition can be summarized as the ability to update by reference, where a single update affects all references to the object. For example, a change applied to a Revit model element, let's say wall thickness, affects the associated views, schedules, volumetric data, and tags.
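
      The "update by reference" idea can be illustrated outside Revit with a minimal plain-Python sketch (class and attribute names are hypothetical): several views hold references to the same wall object, so a single update is visible to every consumer.

```python
# Minimal illustration of "update by reference": two views reference the
# same wall object, so one update propagates to every consumer of it.
class Wall:
    def __init__(self, thickness_mm):
        self.thickness_mm = thickness_mm

class View:
    def __init__(self, wall):
        self.wall = wall  # a reference, not a copy

    def report(self):
        return f"wall thickness: {self.wall.thickness_mm} mm"

wall = Wall(200)
plan, schedule = View(wall), View(wall)
wall.thickness_mm = 250  # one change to the single source...
assert plan.report() == schedule.report() == "wall thickness: 250 mm"
```

      Had each view copied the thickness instead of referencing the wall, the two reports could silently diverge, which is exactly the inconsistency a single source of truth avoids.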

      Knowing the definition, let's move to the benefits, qualities, and challenges. The following benefits can be observed when using a single source of truth: reduced mistakes, errors, and rework; met schedules and budgets; improved communication and quality; better decision making; improved trust and transparency; and a reduced risk of acting on wrong data. In terms of qualities, which also pose challenges, a single source of truth should be accessible, up to date, standardized, and integrated with other systems. Also, most importantly, it should be trusted by the project team.

      We have discussed the basic theory behind a single source of truth. Now let's move to practical implementation. In the next few slides, I will talk about out-of-the-box Autodesk Construction Cloud and Revit functionality to create and maintain a single source of truth. This section contains rather entry-level information; however, these foundations will be essential for the custom solutions using the Revit and Forge APIs that will be shown later on.

      Use Autodesk Construction Cloud as the common data environment and apply an ISO 19650 folder structure. A cloud-based common data environment, a correctly created folder structure, and applied access permissions are the key components of a collaboration platform.

      Worksharing: central and local models. The worksharing functionality allows teams to work on the same model at the same time from any location. In the image on the right, we can see the relationship between central and local files. The upper-left image shows the published central model in Autodesk Construction Cloud, and the image below shows the local model cache and how the model's unique ID can be obtained via the Forge API, which will be used later in one of the examples.

      Using BIM Collaborate Pro and the Design Collaboration module is a great way to share, consume, and track data exchanges between design teams. It ensures transparency and streamlines the work-in-progress-to-shared-area data transfer within the common data environment.

      On this slide, I would like to highlight two points. The first is that project requirements should be stored in a common location in Autodesk Docs, accessible by the project team. And by project requirements, I mean the BIM execution plan, shared parameter files, and structured project BIM configuration data. This is the concept that will be explored later on.

      The second is the Parameter Service. This is a brand-new tool available as a tech preview in Revit 2023, used to store shared parameters in the cloud rather than in a shared parameter text file.

      Autodesk Construction Cloud offers multiple tools that can be used to support common project activities. Implementing the available tools enables teams to streamline processes, improve quality and transparency, and minimize work outside the common data environment. As you can see on the slide, there are a number of tools that can be used by the project team, and each of these tools could be a topic for a separate session. If you are interested in more information, please refer to the session handout for learning resources and references.

      Previously, we learned the foundations of using Autodesk Construction Cloud for creating and maintaining a single source of truth. Now we are ready to explore more advanced methods.

      First, let's have a quick look at the most common tools interacting with Autodesk Construction Cloud and Revit: Revit, Dynamo, Autodesk Forge, and Power Automate. All of these tools can be used to automate processes and manipulate the data in Revit and in Autodesk Construction Cloud.

      Typically, when speaking about automation tools for Revit, we imagine scripts running inside the authoring software after hitting a button. I would like to highlight alternative ways of executing scripts and the ways they interact with users.

      In terms of environment, scripts can run outside the model authoring software; execution may be scheduled or happen on a triggered event; and from a user perspective, the user can be alerted on non-compliance or be unaware of a process running in the background.

      Before we move to the showcase, I'm going to introduce a few techniques, methodologies, and concepts that will be used in the practical examples. The first is event-driven programming, which allows interaction with data in response to user activity; there are 70-plus available events in the Revit API. The second is structured-data project requirements.

      We use organized, structured data as the input to automation and validation tools. I will also show later how to create the structure of the project requirements. And the third is functionality blocking: in certain cases, quality can be achieved by blocking unwanted functionality in the project.
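
      The event-driven pattern can be sketched in plain Python (this is not the Revit API itself, just an analogue of it; the event name and handler below are hypothetical). Handlers subscribe to named events, and a handler may cancel the operation that fired the event, which is how the sync-cancellation examples later in this session behave conceptually.

```python
# Plain-Python sketch of the event-driven pattern: handlers subscribe to
# named events, and a handler may cancel the triggering operation. (The
# Revit API exposes 70+ real events, e.g. DocumentSynchronizingWithCentral.)
class CancelOperation(Exception):
    """Raised by a handler to abort the action that fired the event."""

class EventBus:
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def fire(self, event, payload):
        for handler in self._handlers.get(event, []):
            handler(payload)  # a handler may raise CancelOperation

def block_non_compliant(doc):
    if doc.get("non_compliant"):
        raise CancelOperation("sync cancelled: model is non-compliant")

bus = EventBus()
bus.subscribe("document_synchronizing", block_non_compliant)
bus.fire("document_synchronizing", {"non_compliant": False})  # passes silently
```

      Firing the same event with a non-compliant payload raises `CancelOperation`, which the caller treats as "abort the sync".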

      Finally, we are moving to the most interesting part of the presentation: seeing Autodesk Construction Cloud, Revit, and the APIs in action, and how the platform and the concepts introduced earlier can be used to ensure data consistency and prevent BIM execution plan non-compliances.

      The first short video will go through the setup required to run all the tools. You can see the Autodesk Construction Cloud project with the project requirements stored in a common location, and an ISO 19650-compliant folder structure; we can observe the work-in-progress, shared, and published areas within the CDE.

      Additional services are activated for the project members, such as Design Collaboration and Model Coordination. Design Collaboration teams are created for data exchange. A Forge application is created and registered in Account Admin. You can also see some custom Revit plugins installed, as well as Autodesk Desktop Connector.

      I'm now showing the BIM execution plan. This is the standard document that is key to each project delivery. The takeaway of this part of the video is that we would like to replace the unstructured data in the BIM execution plan with references to a structured project requirements database. In its simplest form, this database can be a spreadsheet stored in Autodesk Construction Cloud; however, it may be another solution of your choice.

      So in this particular example, we are trying to replace this image of the project location, base point, and survey point with references to a spreadsheet that contains the data in a more structured format, as this data will serve as the input for various tools.

      So this is an example of the structured data requirements captured in the project configuration file. It may contain any data that was agreed with the client and that needs to be validated throughout the project delivery. We can now see the project information requirements data, the list of models and sheets, the project location, the naming system requirements, and the required parameters.

      And again, this is just an example for the sake of the demonstration. Obviously, on large projects this table would be way, way longer.

      One thing that I would like to highlight on this slide is the data restriction column, which basically specifies the expected and allowed values for certain data. So we can see range values for sound transmission class, and specific values for fire rating.

      In a similar manner, we can capture any other data. In addition, this file can hold not only the project requirements but also the automation tools' configuration, so we can control the behavior of the tools based on the settings specified in this file.
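
      A data-restriction check like the one described above can be sketched as follows. The restriction schema (a `range` with min/max, or an explicit `values` list) and the sample limits for sound transmission class and fire rating are assumptions for illustration, not the session's actual configuration format.

```python
# Hedged sketch of validating a value against the "data restriction" column:
# a restriction is either a numeric range or an explicit set of allowed values.
def is_compliant(value, restriction):
    kind = restriction["type"]
    if kind == "range":           # e.g. sound transmission class
        return restriction["min"] <= value <= restriction["max"]
    if kind == "values":          # e.g. fire rating
        return value in restriction["allowed"]
    raise ValueError(f"unknown restriction type: {kind}")

# Illustrative rules only -- real limits come from the project configuration.
stc_rule = {"type": "range", "min": 25, "max": 65}
fire_rule = {"type": "values", "allowed": ["30 min", "60 min", "120 min"]}

assert is_compliant(50, stc_rule)
assert not is_compliant(70, stc_rule)
assert is_compliant("60 min", fire_rule)
assert not is_compliant("45 min", fire_rule)
```

      Keeping the restrictions as data rather than code means the same validator serves every parameter row in the spreadsheet.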

      We can see the project requirements structured data stored in a common location. This is the backbone of our project delivery: this is the data that is accessible to the entire project team, and this is the single source of truth for the project requirements that have to be fulfilled by the team.

      The last part shows the shared parameters loaded in the project. We can use either the Revit Parameter Service in Revit 2023 or a shared parameters text file.

      Once the setup is completed, we can now move to examples of using Revit API.

      We start with functionality blocking. During this demonstration, we would like to override the CAD import command to prevent bad modeling practices, and also validate the Revit family data source to block any non-compliant content.

      In the first part of the video, I'm demonstrating that the behavior of the scripts can be controlled from the project configuration file. Basically, what's happening now is that I'm specifying that family content can be loaded only from the approved KEO content library, and also that the CAD import command should be blocked.

      I'm saving this file. You can see that everything is happening directly in Autodesk Construction Cloud. We see an updated version of the document overwritten in the platform, and now we are attempting to import a CAD file.

      We can now observe that once the user attempts to import CAD, there is a notification that prevents the user from using this command, which is the expected behavior, as we want to block this functionality because it might not be allowed by the client.

      In the second example, we are navigating to a folder that contains non-compliant families: basically, we are trying to load a Revit family from outside the KEO-approved library. And again, in this case, we receive a notification that the family loading is cancelled, and the reason for that is an incorrect source location. So by using these tools, we can easily control the content in the model and also prevent bad modeling practices, such as using the import CAD command.
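
      The approved-library check at the heart of this example reduces to a path test, sketched below in plain Python (the library root and family paths are made-up placeholders, and the real tool reads the root from the project configuration file).

```python
# Sketch of the "approved content library" rule: a family may be loaded only
# if its source path sits under the approved library root.
from pathlib import PurePosixPath

def allowed_source(family_path, approved_root):
    family = PurePosixPath(family_path)
    root = PurePosixPath(approved_root)
    return root in family.parents  # True only if family lives under root

APPROVED = "/ACC/KEO-Content-Library"  # hypothetical library root
assert allowed_source("/ACC/KEO-Content-Library/Doors/D1.rfa", APPROVED)
assert not allowed_source("/Desktop/Downloads/RandomDoor.rfa", APPROVED)
```

      In the real plugin, a failed check cancels the family-load operation and shows the notification seen in the video.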

      In the second example, we'll talk about metrics export on the model-closing event, how to store the data, and how to visualize it.

      On the screen, I'm showing a metrics dashboard that is embedded directly in the Autodesk Construction Cloud Insight module. In this case, this is a model health dashboard, based on data extracted from cloud models using the benchmarks and factors specified in the project configuration file.

      On the right part of the screen, I'm showing the metrics stored in Autodesk Construction Cloud, and this is important because the data displayed in the Power BI dashboard comes from ACC; it is not offline data from someone's local machine.

      Now I'm going to attempt to close the model, and we receive a notification that the model health metrics have been exported successfully. Basically, the way it works is that there is an event that triggers the export of the data from Revit. We can now observe the data overwritten in Autodesk Construction Cloud. Based on the approved data, we are able to refresh the data source, publish the Power BI dashboard, and view it directly in ACC.

      We can now see the updated dashboard.

      Another note on model health metrics: we may not want to wait until models are closed, or until someone refreshes the data source in Power BI and publishes the report. In some cases, we would like live feedback. So there is actually functionality to display the data live, directly in Revit, not only for the currently open document but also for all documents, including linked files.

      The next example will be about parameters and naming compliance on the sync event.

      I'm navigating to the configuration file to show the expected naming convention and parameter requirements. We can now observe that the family naming system is well defined and is followed in the currently open model, and the same goes for the project requirements. In this and a few other examples, we'll focus on sound transmission class and fire rating, so we can stop for a second to look at the expected values for fire rating and sound transmission class. We can see that the requirements are followed in the model. Now we are attempting to synchronize the model, which happens without errors, and this is the expected behavior: a correct model can be synchronized.

      Now I'm going to simulate applying some incorrect changes in the model: I'm changing values outside the given ranges or acceptable values, and I'm swapping one of the approved families for one of the non-compliant families.

      Now we're trying to synchronize the model, and we actually receive a notification that there is a non-compliance. It provides detailed information about the non-compliant element, including the element ID and the non-compliant parameter values. So this is the first example, for a family that is not compliant in terms of parameter data. And the second example is a family naming non-compliance, for a family that was loaded from outside the KEO family library.

      The interesting part about this example is not only the notification shown to the user about the non-compliance, but also that the sync operation is automatically cancelled. So basically, we are preventing users from synchronizing non-compliant information to the central model, and this model cannot be synchronized with the central file until we fix these two errors. And again, just a side note: this is only a small demonstration covering these two parameters plus naming system compliance, but obviously we can validate any other data defined in the project configuration file.
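
      The sync-time report described above (element ID plus the offending parameter values) can be sketched like this; the requirements table, element IDs, and parameter values are illustrative stand-ins, not data from the session.

```python
# Sketch of the sync-event check: validate each element's parameters against
# the requirements and report (element_id, parameter, bad_value) tuples.
# An empty report allows the sync; a non-empty report cancels it.
REQUIREMENTS = {"Fire Rating": {"60 min", "120 min"}}  # illustrative rule

def validate(elements):
    issues = []
    for element in elements:
        for param, allowed in REQUIREMENTS.items():
            value = element["params"].get(param)
            if value not in allowed:
                issues.append((element["id"], param, value))
    return issues

elements = [
    {"id": 316079, "params": {"Fire Rating": "60 min"}},   # compliant
    {"id": 316204, "params": {"Fire Rating": "45 min"}},   # non-compliant
]
assert validate(elements) == [(316204, "Fire Rating", "45 min")]
```

      Wiring this into a sync event handler (as in the earlier event-driven sketch) gives exactly the behavior in the video: a detailed notification and a cancelled sync.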

      I'm now applying corrective actions in order to synchronize the model.

      So this was an example of model validation at the sync event, and we can also extract data through this event. If we go back to the previous example, where data metrics were extracted at model close, we can also do it at model sync, so we will have more up-to-date data exported daily.

      The next example is about model updates. We'll explore three ways of updating the models and ensuring the compliance.

      We start with the manual one, where the user is alerted. This is a very similar example to the previous one. In this case, we will focus on the project information details. Again, we can see the notification that there are certain non-compliances in the model. We can see the requirements on the left side and the actual data in the model, which we are now correcting. So you can see that although it's very helpful to receive this notification, it is still manual work to update the model.

      So is there a better way to update the models? The answer is yes. We can automatically update models at the sync event. And this is a silent update: it may occur without the user knowing about the wrong data in the model or the update that was done in the background.

      We saw incorrect data in the model, and we are now synchronizing it. The difference between this case and the previous one is that we do not receive any notification about non-compliance; rather, a transaction is triggered during the sync event to update the data if it is incorrect. We can now see that the data was automatically updated and the model is compliant with the requirements.

      So this is a very powerful mechanism, because if we imagine that there is, for example, some last-minute change in title block details, revisions, or any other data, we are able to update this data automatically without someone manually checking each model separately.
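
      The silent-update idea reduces to overwriting mismatched fields with the required values while keeping an audit trail of what changed. A minimal sketch, with hypothetical field names:

```python
# Sketch of the "silent update" at sync: project information fields are
# overwritten with the required values; the return value records what was
# corrected in the background (useful for logging).
def auto_correct(project_info, required):
    changed = {}
    for field, value in required.items():
        if project_info.get(field) != value:
            changed[field] = (project_info.get(field), value)
            project_info[field] = value
    return changed

info = {"Project Name": "Tower A", "Client Name": "Old Client"}
required = {"Project Name": "Tower A", "Client Name": "ACME Development"}
assert auto_correct(info, required) == {
    "Client Name": ("Old Client", "ACME Development")
}
assert info["Client Name"] == "ACME Development"
```

      In Revit, the overwrite would run inside a transaction during the sync event; logging the `changed` dictionary keeps the background update traceable.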

      However, there is one downside to the previous example: the need to open and synchronize each model individually. We know that on large projects there might be hundreds, sometimes thousands, of models, and this is why we would like to batch-update multiple models. This is basically the third way of updating models.

      Again, we are simulating some changes in the project information requirements. Now there is a lot going on on the screen. In the top part, we can see three models, architectural, structural, and mechanical, that will be updated automatically. And in the Revit window, we can see one of the models, and we can see that it contains incorrect data. These three models will be updated.

      I'm closing the model just to demonstrate that this tool can be run from a model that is open outside the project whose files we are trying to update. So I opened a sample test model, and you can see that in the application dialog we are able to select a project and one or more models to update. In this case, I selected all models, which are the three files.

      Now we can see that each model is opened, updated (it's not really shown on the screen, but it is updated), saved, and, most importantly, synchronized with central. In addition, we see that some extra data was exported for each model.

      So if you think again about the model health metrics: we don't have to close the models to export the data, and we don't have to synchronize the models to get the data. We can use this tool not only to apply updates to the models, but also to extract the latest metrics or, for example, to export a Navisworks model, which is quite a common task.

      We now receive a notification that the three models have been updated and synchronized with central. I'm going to refresh ACC to show that an additional text file containing the extracted data was created.

      Now we are opening one of the models to demonstrate that the change was applied automatically. And again, this concept can be used for far more than project information: we can create sheets, place views on sheets, add a title block, place families, do some validation, and export metrics or models in different formats. So this was the third and last way of updating models using the Revit API.

      The last video will demonstrate the potential of the Forge API.

      In the first example, we'll talk about Autodesk Construction Cloud and SharePoint project consistency.

      Now it's time for some background. It's quite common for organizations to use Autodesk Construction Cloud alongside other collaboration platforms, such as SharePoint, which is normally used to facilitate various internal processes within companies, and one of the challenges companies face is ensuring data consistency between two or more platforms. So in this case, we'll look at the project data in SharePoint and Autodesk Construction Cloud, and the table on the screen shows the possible compliant and non-compliant scenarios for entities in Autodesk Construction Cloud and SharePoint.
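
      The compliant/non-compliant matrix boils down to set arithmetic on the two project lists. A sketch with hard-coded project numbers (in practice the lists would come from the Forge API and a Power Automate flow):

```python
# Sketch of the cross-platform consistency check: classify projects by
# comparing the two platforms' project lists as sets.
acc = {"P-1001", "P-1002", "P-1003"}          # from the Forge API (mocked)
sharepoint = {"P-1001", "P-1002", "P-1004"}   # from Power Automate (mocked)

compliant = acc & sharepoint                   # present on both platforms
missing_in_sharepoint = acc - sharepoint       # in ACC only
missing_in_acc = sharepoint - acc              # in SharePoint only

assert compliant == {"P-1001", "P-1002"}
assert missing_in_sharepoint == {"P-1003"}
assert missing_in_acc == {"P-1004"}
```

      The two "missing" sets are exactly the actionable items: projects to create, archive, or remove on one of the platforms.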

      This is an example of the SharePoint projects list; a Power Automate cloud flow to extract data from SharePoint; and the list of ACC projects. We can see that certain projects are excluded using a built-in parameter; these are, for example, testing projects or template projects that we don't want to appear in any official project list in internal reports within the organization. Finally, there is the custom Forge integration.

      And now I'm showing a very minimalistic Forge application. All the tools presented today will be executed from a terminal, but obviously we could build a web application and work on the user interface to make it more user-friendly.

      You can see that in a matter of a few seconds, we obtain the full results about the non-compliant and compliant projects. We can improve readability by showing them in a Power BI dashboard. And based on the results, we are able to identify non-compliant projects and apply corrective actions: adding projects to SharePoint or BIM 360, removing or archiving them, and so on.

      One example of how this was very useful: if a company controls project access through forms based on data in SharePoint, it can identify any projects in BIM 360 or Autodesk Construction Cloud that cannot be accessed because they don't appear on the SharePoint list, or, vice versa, projects that appear on the SharePoint list but have not been created in BIM 360.

      In the second example, we talk about user management: we would like to add multiple users to an ACC project based on the project configuration file.

      On the left, you can see the project users with their specified company and role. On the right side are the actual members of the ACC project. The list on the left is very often provided by, for example, the project manager, who receives a long list of users and tries to add them manually, so this process can be automated via Forge.

      Again, we are opening our Forge application, and in a matter of a few seconds we are able to add the listed users to the BIM 360 and Autodesk Construction Cloud project.

      We can now see full compliance between the platform and the project users list.

      In the next example, we'll talk about task information delivery plan (TIDP) validation; in simple words, we'll be tracking models and submission progress.

      We are now looking at the model and sheet requirements. Very often, there is a separate document on projects called the task information delivery plan, which specifies responsibilities and contractual deliverables. We are showing these two tables, and we'll simulate a sample milestone submission on the project. All these files are expected to be submitted by the project delivery team.

      We can now see the created folder in the published area of the common data environment, and multiple files uploaded by the team, and we would like to identify the submission status as of this moment.

      We are now looking at the Submission Progress Dashboard, which is based on data extracted directly from Autodesk Construction Cloud. This dashboard allows us to identify actionable items: files that have to be either added or deleted in ACC or in the TIDP.
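
      The dashboard logic is another set comparison: expected deliverables from the TIDP versus files found in the published folder. A sketch with invented file names (the real lists come from the TIDP spreadsheet and an ACC folder listing via the Forge API):

```python
# Sketch of the TIDP submission check: compare expected deliverables with
# files uploaded to the published area to derive the actionable items.
expected = {"KEO-AR-001.pdf", "KEO-AR-002.pdf", "KEO-ST-001.pdf"}  # from TIDP
uploaded = {"KEO-AR-001.pdf", "KEO-ST-001.pdf", "KEO-ME-099.pdf"}  # from ACC

to_upload = expected - uploaded   # missing from the CDE
to_review = uploaded - expected   # not on the TIDP: delete, or add to the plan
progress = len(expected & uploaded) / len(expected)

assert to_upload == {"KEO-AR-002.pdf"}
assert to_review == {"KEO-ME-099.pdf"}
```

      The `progress` ratio is what feeds the dashboard's completion figure; `to_upload` and `to_review` become the actionable-item lists.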

      We are going back to our Forge application. But before that, we are simulating corrective actions: we are uploading the missing documents and reviewing the TIDP based on the dashboard results. Now we would like to rerun the application to obtain the latest data.

      We can now see the updated data uploaded in BIM 360.

      And now we can refresh our Power BI dashboard.

      Based on the updated dashboard, we are 100% confident of fulfilling the milestone deliverables. There is obviously an alternative way of achieving similar results: we can extract a file log from ACC, copy it to Excel, and run some formulas to obtain a similar result. However, this is, I believe, a smarter and more automated way.

      The very last example in this presentation is naming and parameters validation: we would like to validate parameters and naming compliance for multiple models without opening Revit.

      First, I'm going to demonstrate the Forge API functionality for obtaining model content properties without opening Revit. On the screen, we are browsing the project content, and you can see a lot of data related to the Revit elements inside the model: all the identity data, plus our custom parameters that will be verified later on, the sound transmission class and fire rating.

      Knowing that we can obtain the data outside the Revit environment, we'll now validate two additional models uploaded to ACC by the SC1 organization. This might be one of our sub-consultants who is not using the fancy tools demonstrated earlier, but we still want to validate data compliance for these two models without necessarily opening them or running any of our desktop-based tools.

      I'm opening these two models just to show that there is indeed non-compliant data, but I'm going to close the files, as it's not required to have Revit open. We don't even need a Revit license, or Revit installed on the PC, to run this tool.

      We can now see that there are two models being validated. And in a matter of a few seconds, without opening Revit, we are able to validate naming system and parameters compliance.

      So this is already good data. We can share this feedback with our sub-consultant to inform them about the high-level compliance status. But in addition, we can also expand the results further to show more details about non-compliant and compliant content, and this data will include element IDs and the results for each parameter or metric that was checked.

      And again, this is just the data displayed in a Visual Studio Code terminal window, but it can be exported to any other text file format, and using dashboards or reports we can easily inform the project team about any compliance issues in the file.

      So this concludes my presentation about single source of truth and data consistency using Autodesk Construction Cloud, Revit, and API. I hope you found the content insightful and inspiring. I also encourage you to have a look at the session handout for more information. Thank you for attending this class, and feel free to contact me if you have any questions.

      Optimizely
      我们通过 Optimizely 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Optimizely 隐私政策
      Amplitude
      我们通过 Amplitude 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Amplitude 隐私政策
      Snowplow
      我们通过 Snowplow 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Snowplow 隐私政策
      UserVoice
      我们通过 UserVoice 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. UserVoice 隐私政策
      Clearbit
      Clearbit 允许实时数据扩充,为客户提供个性化且相关的体验。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。Clearbit 隐私政策
      YouTube
      YouTube 是一个视频共享平台,允许用户在我们的网站上查看和共享嵌入视频。YouTube 提供关于视频性能的观看指标。 YouTube 隐私政策

      icon-svg-hide-thick

      icon-svg-show-thick

      定制您的广告 – 允许我们为您提供针对性的广告

      Adobe Analytics
      我们通过 Adobe Analytics 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Adobe Analytics 隐私政策
      Google Analytics (Web Analytics)
      我们通过 Google Analytics (Web Analytics) 收集与您在我们站点中的活动相关的数据。这可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。我们使用此数据来衡量我们站点的性能并评估联机体验的难易程度,以便我们改进相关功能。此外,我们还将使用高级分析方法来优化电子邮件体验、客户支持体验和销售体验。. Google Analytics (Web Analytics) 隐私政策
      AdWords
      我们通过 AdWords 在 AdWords 提供支持的站点上投放数字广告。根据 AdWords 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AdWords 收集的与您相关的数据相整合。我们利用发送给 AdWords 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AdWords 隐私政策
      Marketo
      我们通过 Marketo 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。我们可能会将此数据与从其他信息源收集的数据相整合,以根据高级分析处理方法向您提供改进的销售体验或客户服务体验以及更相关的内容。. Marketo 隐私政策
      Doubleclick
      我们通过 Doubleclick 在 Doubleclick 提供支持的站点上投放数字广告。根据 Doubleclick 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Doubleclick 收集的与您相关的数据相整合。我们利用发送给 Doubleclick 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Doubleclick 隐私政策
      HubSpot
      我们通过 HubSpot 更及时地向您发送相关电子邮件内容。为此,我们收集与以下各项相关的数据:您的网络活动,您对我们所发送电子邮件的响应。收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、电子邮件打开率、单击的链接等。. HubSpot 隐私政策
      Twitter
      我们通过 Twitter 在 Twitter 提供支持的站点上投放数字广告。根据 Twitter 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Twitter 收集的与您相关的数据相整合。我们利用发送给 Twitter 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Twitter 隐私政策
      Facebook
      我们通过 Facebook 在 Facebook 提供支持的站点上投放数字广告。根据 Facebook 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Facebook 收集的与您相关的数据相整合。我们利用发送给 Facebook 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Facebook 隐私政策
      LinkedIn
      我们通过 LinkedIn 在 LinkedIn 提供支持的站点上投放数字广告。根据 LinkedIn 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 LinkedIn 收集的与您相关的数据相整合。我们利用发送给 LinkedIn 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. LinkedIn 隐私政策
      Yahoo! Japan
      我们通过 Yahoo! Japan 在 Yahoo! Japan 提供支持的站点上投放数字广告。根据 Yahoo! Japan 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Yahoo! Japan 收集的与您相关的数据相整合。我们利用发送给 Yahoo! Japan 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Yahoo! Japan 隐私政策
      Naver
      我们通过 Naver 在 Naver 提供支持的站点上投放数字广告。根据 Naver 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Naver 收集的与您相关的数据相整合。我们利用发送给 Naver 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Naver 隐私政策
      Quantcast
      我们通过 Quantcast 在 Quantcast 提供支持的站点上投放数字广告。根据 Quantcast 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Quantcast 收集的与您相关的数据相整合。我们利用发送给 Quantcast 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Quantcast 隐私政策
      Call Tracking
      我们通过 Call Tracking 为推广活动提供专属的电话号码。从而,使您可以更快地联系我们的支持人员并帮助我们更精确地评估我们的表现。我们可能会通过提供的电话号码收集与您在站点中的活动相关的数据。. Call Tracking 隐私政策
      Wunderkind
      我们通过 Wunderkind 在 Wunderkind 提供支持的站点上投放数字广告。根据 Wunderkind 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Wunderkind 收集的与您相关的数据相整合。我们利用发送给 Wunderkind 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Wunderkind 隐私政策
      ADC Media
      我们通过 ADC Media 在 ADC Media 提供支持的站点上投放数字广告。根据 ADC Media 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 ADC Media 收集的与您相关的数据相整合。我们利用发送给 ADC Media 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. ADC Media 隐私政策
      AgrantSEM
      我们通过 AgrantSEM 在 AgrantSEM 提供支持的站点上投放数字广告。根据 AgrantSEM 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 AgrantSEM 收集的与您相关的数据相整合。我们利用发送给 AgrantSEM 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. AgrantSEM 隐私政策
      Bidtellect
      我们通过 Bidtellect 在 Bidtellect 提供支持的站点上投放数字广告。根据 Bidtellect 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bidtellect 收集的与您相关的数据相整合。我们利用发送给 Bidtellect 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bidtellect 隐私政策
      Bing
      我们通过 Bing 在 Bing 提供支持的站点上投放数字广告。根据 Bing 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Bing 收集的与您相关的数据相整合。我们利用发送给 Bing 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Bing 隐私政策
      G2Crowd
      我们通过 G2Crowd 在 G2Crowd 提供支持的站点上投放数字广告。根据 G2Crowd 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 G2Crowd 收集的与您相关的数据相整合。我们利用发送给 G2Crowd 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. G2Crowd 隐私政策
      NMPI Display
      我们通过 NMPI Display 在 NMPI Display 提供支持的站点上投放数字广告。根据 NMPI Display 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 NMPI Display 收集的与您相关的数据相整合。我们利用发送给 NMPI Display 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. NMPI Display 隐私政策
      VK
      我们通过 VK 在 VK 提供支持的站点上投放数字广告。根据 VK 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 VK 收集的与您相关的数据相整合。我们利用发送给 VK 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. VK 隐私政策
      Adobe Target
      我们通过 Adobe Target 测试站点上的新功能并自定义您对这些功能的体验。为此,我们将收集与您在站点中的活动相关的数据。此数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID、您的 Autodesk ID 等。根据功能测试,您可能会体验不同版本的站点;或者,根据访问者属性,您可能会查看个性化内容。. Adobe Target 隐私政策
      Google Analytics (Advertising)
      我们通过 Google Analytics (Advertising) 在 Google Analytics (Advertising) 提供支持的站点上投放数字广告。根据 Google Analytics (Advertising) 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Google Analytics (Advertising) 收集的与您相关的数据相整合。我们利用发送给 Google Analytics (Advertising) 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Google Analytics (Advertising) 隐私政策
      Trendkite
      我们通过 Trendkite 在 Trendkite 提供支持的站点上投放数字广告。根据 Trendkite 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Trendkite 收集的与您相关的数据相整合。我们利用发送给 Trendkite 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Trendkite 隐私政策
      Hotjar
      我们通过 Hotjar 在 Hotjar 提供支持的站点上投放数字广告。根据 Hotjar 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Hotjar 收集的与您相关的数据相整合。我们利用发送给 Hotjar 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Hotjar 隐私政策
      6 Sense
      我们通过 6 Sense 在 6 Sense 提供支持的站点上投放数字广告。根据 6 Sense 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 6 Sense 收集的与您相关的数据相整合。我们利用发送给 6 Sense 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. 6 Sense 隐私政策
      Terminus
      我们通过 Terminus 在 Terminus 提供支持的站点上投放数字广告。根据 Terminus 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 Terminus 收集的与您相关的数据相整合。我们利用发送给 Terminus 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. Terminus 隐私政策
      StackAdapt
      我们通过 StackAdapt 在 StackAdapt 提供支持的站点上投放数字广告。根据 StackAdapt 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 StackAdapt 收集的与您相关的数据相整合。我们利用发送给 StackAdapt 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. StackAdapt 隐私政策
      The Trade Desk
      我们通过 The Trade Desk 在 The Trade Desk 提供支持的站点上投放数字广告。根据 The Trade Desk 数据以及我们收集的与您在站点中的活动相关的数据,有针对性地提供广告。我们收集的数据可能包含您访问的页面、您启动的试用版、您播放的视频、您购买的东西、您的 IP 地址或设备 ID。可能会将此信息与 The Trade Desk 收集的与您相关的数据相整合。我们利用发送给 The Trade Desk 的数据为您提供更具个性化的数字广告体验并向您展现相关性更强的广告。. The Trade Desk 隐私政策
      RollWorks
      We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

      是否确定要简化联机体验?

      我们希望您能够从我们这里获得良好体验。对于上一屏幕中的类别,如果选择“是”,我们将收集并使用您的数据以自定义您的体验并为您构建更好的应用程序。您可以访问我们的“隐私声明”,根据需要更改您的设置。

      个性化您的体验,选择由您来做。

      我们重视隐私权。我们收集的数据可以帮助我们了解您对我们产品的使用情况、您可能感兴趣的信息以及我们可以在哪些方面做出改善以使您与 Autodesk 的沟通更为顺畅。

      我们是否可以收集并使用您的数据,从而为您打造个性化的体验?

      通过管理您在此站点的隐私设置来了解个性化体验的好处,或访问我们的隐私声明详细了解您的可用选项。