
Automating and Improving Revit Model Quality for Advanced Construction with the Cloud Autodesk Validation Tool


Description

Learn from Exyte about the critical role that Autodesk Validation Tool plays in orchestrating hundreds of Revit models with unfailing precision for high-tech factories. Autodesk Validation Tool is not just a tool, but a revolution that takes project delivery to a new level, ensuring accuracy and efficiency from the start. See from Exyte how Autodesk Validation Tool automates model validation, increases quality, minimizes errors, and speeds up the planning process, enabling teams to not only meet, but exceed, the ever-increasing demands of chip fabs. This advanced solution puts Exyte at the forefront of the industry. Discover the real-time dashboards that constantly pulse with the heartbeat of model integrity across megaprojects. This session will show how cloud-based validation comes together with precision engineering to set new paradigms in the design of innovative chip fabs. Join us and experience the transformative impact that Autodesk Validation Tool has on project delivery, operational efficiency, and quality benchmarking for complex designs.

Key Learnings

  • Discover Autodesk Validation Tool for automating model validation, quality assurance, error minimization, and the acceleration of planning processes.
  • Learn how Exyte successfully uses Autodesk Validation Tool to ensure the accuracy and efficiency of project implementation right from the start.
  • Learn more about Autodesk Validation Tool software's real-time dashboards, which provide a full overview of model integrity in megaprojects.

Speakers

  • Tineshwaran Sellathoroe
    Tinesh is a dedicated BIM enthusiast with over a decade of experience in the AEC industry. He currently serves as the BIM Regional Head at Exyte Malaysia and holds the position of BIM Lead for Exyte's Data Center (DTC) business unit in the South East Asia (SEA) region. With a profound passion for design, construction, and technology, Tinesh is recognized for his pioneering role in digital transformation, advocating innovative approaches in the industry. He began his career at one of Malaysia's largest construction firms, where he successfully developed and implemented a visual-based planning platform. This platform streamlined daily lean job cards and comprehensive reporting, marking early achievements in his career. Tinesh later advanced to champion virtual building performance analysis applications at an MEP consulting firm, significantly enhancing project efficiencies. There, he also developed a centralized repository for aggregating and managing engineering and design data from past projects, leveraging it to define standards and act as a dynamic reference tool for upcoming projects. In his current role, Tinesh has led BIM standardization across multiple high-profile projects for renowned clients. Notably, he has successfully overseen and delivered the BIM scope for a cloud/hyperscale data center project. He spearheads efforts in digital standardization and implementation across Exyte Malaysia and the Data Center (DTC) business unit in the SEA region. Tinesh is a chartered Professional Technologist in Malaysia and holds numerous certifications relevant to BIM. His expertise spans diverse project types, including residential, commercial, data centers, and specialized high-tech facilities. In summary, Tinesh brings extensive experience across the entire BIM lifecycle, from space planning (pre-tender) to facility management (post-construction). His strong skill set and passion for driving digital transformation underscore his commitment to advancing industry practices.
  • Martin Langolf
    Technical Account Specialist for the EMEA region, working at Autodesk for over 9 years. Certified for the 2011 AutoCAD product family, with over 10 years of professional experience in the architectural and engineering area and many projects throughout Northern Europe. My main goal is to ensure that the work is completed to the highest possible standards and to all parties' satisfaction, no matter what the challenges are. Specialties: English/German and Russian technical support, handling challenging and difficult situations. Product knowledge: Autodesk Live, Revit, Navisworks (basic), AutoCAD Architecture, AutoCAD MEP, AutoCAD, AutoCAD LT, DWG TrueView, Design Review, licensing, installation, deployment, Windows, Mac OS, customer service, customer success, presentations, webinars, engineering, architecture, CAD, software, hardware
Transcript

TINESHWARAN SELLATHOROE: Hello, guys. Welcome to our case study sharing. We are looking at automating and improving Revit model quality with Autodesk Validation Tool. Now, this is going to be by myself, Tinesh from Exyte, and my good friend, Martin, who's from Autodesk.

A quick introduction about myself. I'm Tinesh. I am from Malaysia, and I am the BIM Regional Head at Exyte Malaysia. I also lead the BIM efforts in Exyte's data center business unit for the Southeast Asia region. I have over 10 years of experience in the AEC industry. Currently, I am focused on digital standardization, R&D, and, of course, the implementation of BIM across Exyte Malaysia-- again, more focused on data centers currently.

I've done various types of projects throughout my career, ranging from residential and commercial to high-tech facilities and data centers. I've experienced a large part of the BIM lifecycle from various stakeholder perspectives, all the way from space planning to facility management. So that's pretty much about myself. I'll pass it to Martin to introduce himself.

MARTIN LANGOLF: Thank you, Tinesh. Hello, everyone. My name is Martin Langolf. I'm a technical account specialist, working for over 12 years at Autodesk, based in our Munich office in Germany. My main focus is to help our EBA customers adopt our AEC solutions and be much more efficient in their daily business. Thank you, Tinesh. Back over to you.

TINESHWARAN SELLATHOROE: Thanks, Martin. Right, a quick look at the agenda for today. We are going to look at an introduction to the company where I'm from, Exyte, and a specific project which made us look at how we can improve the way we do things. From that project, we will talk about the issues we encountered and how we solved them. We will, of course, talk about the solution itself, which is Autodesk Validation Tool. Martin will then talk about the check sets and the custom solution that was developed. And I'll top it off by talking about the results and a summary.

Right, about my company, Exyte. At Exyte, we believe that we create a better future by delivering high-tech facilities, which then enables our clients to enhance the quality of modern life with their products and services. So this is us. We have over 100 years of history. We recorded over €7.4 billion in sales in 2022. We are present in over 20 countries worldwide, and we have about 9,700 employees worldwide. This was recorded last year.

And this is our German engineering heritage. Over the past 100 years-- we started off as M+W in 1912, and we were rebranded as Exyte in 2018. And ever since, we have never stopped growing. Exyte worldwide: we are here in the US. We are in continental Europe-- we are headquartered in Stuttgart. We are also in Northern Europe and Israel, where the lead office is in Ireland. We are also in India. We are in Northeast Asia, with a lead office in Shanghai. And, of course, we are in Southeast Asia, where the lead office is in Singapore-- and, of course, where I'm from, Exyte Malaysia.

Now, let's get to the business. Have you been in this situation? Have you had a large number of models and revisions-- maybe an uncommonly large number? You have too many cooks in the kitchen, which is often the case. But maybe when the project is large, you just can't help having a large number of trade partners. You have stringent requirements from your client-- a smart client, I would say, which is very common these days. That is good, because then we know where the project is heading. But it can be bad when they just want everything. And lastly, have you been facing frequent quality issues with your models?

Now, we faced all the issues I mentioned earlier in this particular project, which is codenamed Project Bird. So I'm going to give you a quick introduction to Project Bird just to share the immense scale of it. Project Bird is a high-tech facility construction project.

This is a first overseas facility for advanced 3D chip packaging. It is one of the largest chip packaging facilities in the world, and it is the largest one in Malaysia so far. This particular project was valued at over $7 billion US, or about RM30 billion. We had over 150 trade partners in this project at the peak, and we also had over 700 models in this project. And all these models were coming from our suppliers, subcontractors, nominated subcontractors, and so on.

The issues that we faced. As I mentioned, the project involved numerous parties-- a large number of them. And of course, that means a large number of models, which made coordination and quality management complicated. Some of our trade partners or subcontractors were using and relying on downloaded content, which was contributing to the inconsistency. And it was proving to be a risk, given that the client was actually looking to take these models all the way to their facility management platform.

Our BIM team in the project, believe it or not, despite being big, was short-handed due to the number of models and trade partners to manage. And as we were looking into many things to do-- completing the construction model, ensuring clash coordination was being carried out, ensuring shop drawing production was on time, and many more things-- we were actually lacking the resources to implement a comprehensive quality control measure. And without a quality control measure, there is no quality control gate. So we had all these sorts of creeping quality issues, which were visible all the way to the client.

So at the early stage of the project, we were swamped with excessive quality issues. And it was proving to be a challenge to even identify, monitor, and rectify these issues. So we were at a stage where our models had data integrity problems. The models were carrying informational data that was inaccurate and outdated. And then, of course, all this led to declining trust in BIM as a single source of truth. And these issues were undermining stakeholder confidence in Exyte.

So we identified a solution which we hoped would help us solve the issues we were facing, because we wanted something that could help us without requiring much effort from the team-- something that could address the situation we were in and help resolve the ongoing issues. So we explored, and we looked at Autodesk Validation Tool as our solution.

Now, a quick introduction to what Autodesk Validation Tool, or AVT, is and what its capabilities are. AVT is a fully automated cloud solution. It takes advantage of cloud computing power to analyze Revit models very quickly as per your requirements. This mainly helps to check the informational data in your models.

In AVT, you have customizable or preset check sets. Check sets are basically what you are validating your models against. So this can be a preset check set which is already in the system-- a standard one that you can just use. Or you can further customize the checks and decide what you want to validate your models against. Martin is going to explain this in detail later.

Now, with AVT, you get to speed up your model error resolution process. Can you just imagine opening up a model, reviewing it, manually checking it for some informational data accuracy, eventually saving it, and then closing it? It could be quick if the model is very small. But what happens if the model is large? And what happens if you have a large number of models to do this for? You know that this consumes a lot of time. So we used AVT to cut this down, where we automated the checks we wanted. And we also used the data that was generated for visualization on custom dashboards. We're going to look into this later.

And when we used AVT, we had a sort of assurance of the models' performance and integrity. We knew that whichever models passed the checks were reliable and good to go. And for whichever didn't, we knew exactly what was wrong and what needed to be corrected.

And lastly-- you're going to hear this a lot-- AVT is a very scalable solution. We used it on a very large project, and we know it would function the same no matter what the project size is and no matter how many different types of projects you have. You set a template, and then you roll it out to all the projects that you want in one go. Now, let's take a step back for me to explain why we decided to implement AVT, why we thought AVT would be able to solve the issues we had, and what AVT was offering beyond that.

Now, to start off, AVT was repeatable. It is a fully automated workflow in the cloud. For the reasons I mentioned, and due to the large number of models, we wanted a solution that would do the work for us. AVT is scalable. We saw AVT as our quality control gate. We wanted a solution that would allow us to monitor the project metrics.

We wanted to use the data to look at trends over time, and we wanted to know whether the issues were on a downward trend now that they were being addressed and communicated. We wanted to increase the model performance-- again, going back to solving the issues we faced. And, of course, to save time and money, as we cut down the time needed for our resources to carry out manual jobs and as we identified errors ahead of time. So instead of hiring another person to purely carry out the QC work, we decided to just hire AVT.

Now, we had our specific requirements for the models themselves. We wanted to check very specific things. Initially, we were using the presets or standard check sets that came with AVT-- the common ones, called Revit Best Practices, which Martin will explain. These are the ones we were using to validate our models against.

But as we were seeing how much this was helping us, we were curious to see if the checks could be done against our specific requirements. And we believed there were some key areas which required attention. So I'm just going to quickly go through a few.

First was the file naming standard-- down to the basics. We wanted to know if the models were being named correctly. As basic as you may think this is, you all would know how important it is. We had a naming convention, complex to a certain extent, used to identify the models. And handling a large number of models, it was never easy to check them all one by one.

Next, the project location settings. As we have various stakeholders from different parts of the world, despite all sorts of information and templates being provided, this was still an issue. So we wanted to make sure that the project control points were accurate, as our shop drawings and construction drawings were actually being produced from the models, and the coordinates were being generated from the models. Hence, it was very crucial for us to have this correct.

And of course, there were many more project-relevant metrics and parameters that we wanted to check. Some include model composition-- imported content, in-place families, any sort of 2D content. We wanted to check the view composition: the number of views that exist but are not on sheets. Then there are the governance aspects-- we wanted to look at the errors, the warnings, whether the model can be purged, and even family-related information, such as names, system types, and others.
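
To make these metrics a little more concrete, here is a minimal sketch of how a few of them could be tallied manually with the Revit Python API (for example, run inside Revit via pyRevit). This is purely illustrative of the kind of check AVT automates in the cloud; the metric names and the pyRevit entry point are assumptions, not Exyte's or AVT's actual implementation.

```python
# Illustrative only: tally a few of the model-health metrics mentioned above
# with the Revit API (run inside Revit, e.g. via pyRevit; IronPython syntax).
from Autodesk.Revit.DB import (FilteredElementCollector, View, Viewport,
                               ImportInstance, FamilyInstance)

doc = __revit__.ActiveUIDocument.Document  # pyRevit convention

# Views that exist in the model but are not placed on any sheet
placed_view_ids = {vp.ViewId for vp in
                   FilteredElementCollector(doc).OfClass(Viewport)}
all_views = [v for v in FilteredElementCollector(doc).OfClass(View)
             if not v.IsTemplate]
views_not_on_sheets = [v for v in all_views if v.Id not in placed_view_ids]

# Open warnings carried by the model
warning_count = len(doc.GetWarnings())

# Imported CAD content and in-place families
import_count = FilteredElementCollector(doc).OfClass(ImportInstance).GetElementCount()
inplace_families = [fi for fi in
                    FilteredElementCollector(doc).OfClass(FamilyInstance)
                    if fi.Symbol.Family.IsInPlace]

print("Views not on sheets: {}".format(len(views_not_on_sheets)))
print("Warnings: {}".format(warning_count))
print("Import instances: {}".format(import_count))
print("In-place families: {}".format(len(inplace_families)))
```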

And lastly, our requirement was, again, to have something able to check all our models-- all 700 models, from all our subcontractors-- and, most importantly, to monitor them in terms of their model status trends.

So this is what we needed from Autodesk Validation Tool. We wanted to carry out model reviews-- some very specific checks, like I shared earlier. And we wanted these to be carried out at a fixed frequency, automatically. We didn't want to trigger the checks manually, and we didn't want to have to go through every single one of our models and trigger a check. And, of course, we wanted this to be done to all our models.

We knew the standard check sets, and we were using them. But after using them, we had the need for more. We wanted to customize them, and we also wanted to learn how to customize them. We wanted to generate reports. We wanted these reports to be centralized and accessible to all relevant stakeholders. But we also wanted a live dashboard to monitor the subcontractors' performance and to understand the model health, all from the data and the information being generated by AVT.

So we wanted to bring this data out from the reports and put it on a live dashboard that's updated frequently. And of course, we reached out to Autodesk for technical support and consultation. Now, I'm going to quickly touch on some basic practice with AVT, and then I'll pass it to Martin to share some of the customization and integration that was done at Exyte.

So the AVT interface can be accessed online from avt.com. Bear in mind you first have to authorize the app from your CDE, BIM 360 or ACC, which allows you to view the models that are stored there in the AVT interface. From there, you don't have to download the models-- you don't even need to have Revit. And you can run the checks on the models that you want from the selected CDE.

Again, here in AVT, you can schedule checks. You can run them now, or at a certain interval, or whenever someone publishes a model. In this platform, you are also able to download the reports directly or store them in your CDE.

The check sets that are available here you can customize, and you can create new ones in this interface. They will also be available in the Model Checker interface, which I'm going to show next-- you will have the same library of checks in the Model Checker. There are also some sample Power BI templates which you can download to visualize the data that you download from here. So if you think the report is boring, just take the information, use the Power BI template, and translate it into something that's more interesting.

And again, this allows you to run checks for all models at scale. And in a way, this is actually a very summarized way of using AVT. There are many resources online on how you can do this. So we can always have a separate course or session if you need to implement and use this solution at your workplace.

While the usage of AVT is mainly in the cloud interface, as I said, there is also a Model Checker plugin under the BIM Interoperability Tools, which allows you to carry out similar actions to what you can do in AVT online. From this plugin, you can, again, edit a check set, customize it, or create a check set from scratch using the Configurator. It's an easy wizard, which helps you through the process. Martin is going to explain it in detail after this.

And from here, you can also, again, automate and schedule the checks, similar to what you can do in AVT online. And once you have generated a report from a check, be it a manual or scheduled check, you can view the reports, or you can share them and let the owner of the model view the report locally by launching Revit, running the Model Checker plugin, and opening the shared report.

So this plugin then shares the report-- or certificate, as we call it-- of the model, detailing what check was done and what the errors were, and letting you view the relevant elements tied to each error. So you could choose to either do it on the online interface that I showed previously or via the Model Checker plugin. It is similar, and it offers the same functionality.

As mentioned in the previous slide, the reports generated are shareable. You can download the AVT file format in which the report is stored, and you can send it to anyone who could be working on the model. They can open the Model Checker plugin and view the report. This is very useful, as they can clearly see which elements require attention, and they can work on the correction based on that.

So apart from having a dashboard which tells them their model needs correction, this report actually tells them what needs to be corrected. Of course, you can't view the model itself on AVT online-- opening the model and seeing the affected elements-- but you can do that in the Model Checker, which becomes very useful. And trust me, from our experience, this makes communication with your trade partners much clearer.

So that summarizes my part for now. I'm going to pass it to Martin to show how we customized this solution further.

MARTIN LANGOLF: Thank you, Tinesh. So yes, as Tinesh already mentioned, the out-of-the-box check sets were not sufficient for the use case that Exyte had on this project. So we had to customize them even further. And based on the requirements that we received from Exyte, we helped them achieve certain customizations-- for example, to identify certain parameters that may not have a value set. And of course, we were also able to customize and integrate these custom check sets, and then visualize the results in reports that can be distributed to all the stakeholders involved, to all your designers.

And in order to easily identify the elements that have an issue or a missing parameter value, you simply select the little magnifying glass in the report, and within Revit you will be zoomed in to the right view, with the affected element already pre-selected. So it is much, much easier for the end user to correct the issue in the model. Next slide, please.

Also, AVT in practice. By default, as well as inside the Model Checker, we are able to output two main file formats: the HTML report, which is a very similar report to the one that can be distributed via the AVT file, and the Excel reports, which can be used to feed a Power BI dashboard in order to visualize all those results. But you can, of course, also do your analysis within Excel directly.

So out of the box, we already provide, with AVT, a large number of predefined check sets-- for example, the Revit best practices that I will touch on in just a moment-- and pre-configured Power BI dashboards that can help you visualize those reports. So you can have a very easy and understandable overview of the problematic areas, and in the process of correcting those areas within your Revit models, you will be much more efficient. So let's move on.

So let's speak about the check sets, which is basically an XML file that is used in order to create the rules that you are using to validate your models against.

So here, we can see a large list of the predefined and publicly available check sets. This is something that we provide with the Autodesk Validation Tool, and we are offering over 30 different check sets that you can basically use out of the box, so you can get started with AVT right away. If we look at the example check set of Revit best practices-- this particular check set focuses on identifying key areas, like the model performance and any potential issues with it. So let's say the number of warnings or the purgeable elements.

You can also identify any information about rooms, spaces, and areas within your Revit model-- for example, how many of each exist. You are also able to check any view-related information: how many views exist within your model, whether there are any schedules, sheets, or scope boxes created, and so on. And you can also validate element information: are there any elements placed on the wrong worksets, are there any duplicates present within your model, and this type of information.

So all this can be validated with the Revit best practices check set right away. If you are looking to get started with it, you may use this check set right out of the box and validate your models for this information.

And if you would like to create your own check sets, the Configurator within the Interoperability Tools offers you different options. You can use the wizard, which is basically an option that will guide you through the process of creating a custom check. Take this particular example: let's say you would like to validate the custom parameter Fire Rating within doors, and you would also like to validate its value-- in this example, we are using the value 30.

And at the same time, if you would like to compare it to the host element, you would be able to include, by following the step-by-step guide, this additional condition in your check set to also validate the Fire Rating parameter of the host element. If those values do not match, it will show up in the report as a result-- positive, negative, or even neutral, like a simple list of results. If multiple issues are identified, the report can, again, be easily distributed to your end users and the issues corrected within the model.
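
As a rough illustration of what this wizard-built rule expresses, the same door-versus-host comparison could be scripted directly against the Revit API. AVT evaluates this from the check-set definition in the cloud, so the sketch below is only an analogy; the parameter name "Fire Rating", its placement on the instances, and the expected value "30" are the assumptions taken from the slide example.

```python
# Illustrative sketch: compare a door's Fire Rating value against its host wall.
# Assumes a text parameter named "Fire Rating" is readable on both elements.
from Autodesk.Revit.DB import (FilteredElementCollector, FamilyInstance,
                               BuiltInCategory)

doc = __revit__.ActiveUIDocument.Document

doors = (FilteredElementCollector(doc)
         .OfCategory(BuiltInCategory.OST_Doors)
         .OfClass(FamilyInstance))

for door in doors:
    door_param = door.LookupParameter("Fire Rating")
    host_param = door.Host.LookupParameter("Fire Rating") if door.Host else None

    door_value = door_param.AsString() if door_param else None
    host_value = host_param.AsString() if host_param else None

    # Fail the check when the door value is not "30" or does not match its host
    if door_value != "30" or door_value != host_value:
        print("Check failed for door {}: door={}, host={}".format(
            door.Id, door_value, host_value))
```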

And for the more advanced users who have potentially already created some custom check sets, you can also use the Advanced Check Builder, which allows you to work with operators, criteria, properties, and different conditions in order to build a check rule that can help you validate certain information.

So in this particular example, we would be validating a custom parameter-- let's call it Catalog Item Number. And we want to validate that its value is set according to a specific custom code. So here, you have a little snippet of a custom code that is based on a naming convention.

And if the parameter value is set to a value that doesn't correspond to this custom code, it will basically generate a negative result that will affect your Revit model health score. And via the report, you can, again, distribute those results to your end users, who can easily identify the elements with parameters set to wrong values and easily correct them within the Revit model.
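
The actual Exyte naming code was not shown in the session, so the pattern below is purely hypothetical. It only illustrates the kind of rule an Advanced Check Builder condition can encode: a parameter value either matches a convention and produces a positive result, or it does not and produces a negative one.

```python
# Purely hypothetical naming convention for "Catalog Item Number".
# The real project code/pattern was not shared, so this regex is only an
# example of the kind of condition a custom check rule can express.
import re

CATALOG_PATTERN = re.compile(r"^EXY-[A-Z]{2,4}-\d{4}-[A-Z0-9]{3}$")

def catalog_number_is_valid(value):
    """Return True if the parameter value matches the assumed convention."""
    return value is not None and bool(CATALOG_PATTERN.match(value))

# Example outcomes the check would report back:
print(catalog_number_is_valid("EXY-MEP-0042-A01"))  # True  -> positive result
print(catalog_number_is_valid("catalog-42"))        # False -> negative result
```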

So at Exyte, we had to develop a custom solution in order to deal with the very large number of frequent Revit model updates, and also with the large number of reports that were generated by AVT. For this, we actually developed, in the customer's domain, a fully automated solution where all the different project stakeholders frequently update the Revit models in the Autodesk Construction Cloud-- the CDE where all the project-relevant data was hosted. And every time a Revit model update occurred, AVT would automatically become active and validate those models against the custom check sets that were developed for Exyte.

Once the models have been validated, this of course generates a very large number of different reports. And in order to deal with this volume of reports, we integrated them into Microsoft Azure Data Factory using an ACC Connect recipe, where we were able to connect to the AVT webhook and transfer all those reports into the Azure Data Factory.
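
In Exyte's case this handoff was done with an ACC Connect recipe, so the following is only a rough stand-in for the same idea: a minimal HTTP endpoint (sketched here as an Azure Function) that would accept a validation-report notification and drop it into the storage account the downstream pipeline reads from. The payload fields, container name, and setting name are assumptions for illustration, not the actual AVT webhook schema or the project's configuration.

```python
# Minimal stand-in for the report handoff: an Azure Functions HTTP endpoint
# that accepts a validation-report notification and stores it in blob storage.
# Payload fields ("fileName", "report") are assumed, not the real AVT schema.
import json
import os

import azure.functions as func
from azure.storage.blob import BlobServiceClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()

    # Hypothetical fields; a real integration would map the actual payload.
    blob_name = payload.get("fileName", "unnamed-report") + ".json"
    report_body = json.dumps(payload.get("report", payload))

    blob_service = BlobServiceClient.from_connection_string(
        os.environ["REPORT_STORAGE_CONNECTION_STRING"])
    blob_client = blob_service.get_blob_client(
        container="avt-reports", blob=blob_name)
    blob_client.upload_blob(report_body, overwrite=True)

    return func.HttpResponse("Report stored for processing.", status_code=200)
```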

In the Azure Data Factory, the reports were manipulated and modified in order to create a single report that could be fed into the Power BI dashboard. And this Power BI dashboard can also be integrated back into the Autodesk Construction Cloud in Insights, so every stakeholder is able to use this dashboard to identify the problematic areas.

At the same time, they could consume the AVT file inside Revit and interact with that report directly within Revit. And of course, as more model updates occur, the Autodesk Validation Tool re-evaluates all those models. So basically, we are closing the cycle of updating the existing reports, feeding Power BI with the new information, and making it available on the project for all the stakeholders. This was basically the quality gate that allowed us to deal with this large amount of information and speed up the construction process.

So how did we do it within the customer domain? We used the Azure environment to deal with all the incoming Excel sheets, which are stored in the customer's storage account. And we used Data Studio to move all this data into SQL tables and generate a single report that can then be used in Power BI to publish all this information and make it available to all the different stakeholders.

So we used the datasets in the Data Studio of Azure. There, you can basically define the datasets that you need, and you are able to manage many different file formats and file types. You are also able to create a data flow that allows you to perform certain manipulations of the data-- like joining the data, splitting it up, or even aggregating it-- using the many schemas that are available.
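
For readers who want to see what that join-and-aggregate step amounts to, here is a small local prototype of the same idea using pandas instead of a Data Factory data flow. The folder layout and column names ("Model", "Check", "Result") are assumptions, since the actual report schema was not shown.

```python
# Local prototype of the join/aggregate step performed by the data flow:
# merge the per-model Excel reports into one summary table for Power BI.
# File locations and column names are assumptions for illustration only.
import glob

import pandas as pd

report_files = glob.glob("avt-reports/*.xlsx")
frames = [pd.read_excel(path) for path in report_files]
combined = pd.concat(frames, ignore_index=True)

# Count results per model and per check, ready to plot on a dashboard.
summary = (combined
           .groupby(["Model", "Check", "Result"], as_index=False)
           .size()
           .rename(columns={"size": "Count"}))

summary.to_csv("avt-summary.csv", index=False)  # single file fed to Power BI
```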

And you are also able to define a pipeline that can be used with the different datasets and data flows, and that provides you with many possible actions, loops, and decision possibilities that are very useful when working with a large amount of data. You can also set up triggers, so that whenever something happens, the trigger fires based on different events, time, or even storage changes. Also, when you are developing a solution like this, you can trigger it on demand in order to test your solution and see what is currently in the pipeline.

And after your pipeline or workflow has been created, you are, of course, also able to monitor what is happening with the pipeline. This can help ease any analysis that you may need to perform when errors occur with the data. So this is a very useful and helpful environment for dealing with a large amount of data.

And the result of all this was basically the custom-developed dashboard-- Tinesh will speak more about this in just a moment-- that allowed everyone to visualize this data and have, basically, calls to action generated in order to address the problematic areas that were identified. And, Tinesh, back over to you to speak more about the efforts and results that you are getting from this solution.

TINESHWARAN SELLATHOROE: All right. Thanks, Martin. Now, allow me to quickly run through some of the results of our effort. So we generated a dashboard, as Martin mentioned, using Power BI. This is a very simplified version for sharing's sake.

The purpose was to create a dashboard that allows us to filter the results of checks on a weekly basis, or at the interval that we select, by supplier. This helps us to understand the state of a model-- whether the version that has been shared or published is any good. And of course, we also get to do a sort of benchmarking between contractors, or between the trade partners who are involved. Yeah, we are able to review and monitor each and every subcontractor involved in the project.

So we were also looking at trends to see if the issues were on a downward trend now that they were being communicated, or if they were not being addressed and the issues were increasing. That then allows us to set up specific clinic sessions with particular subcontractors to understand what the issue is and work together to ensure the model has the proper quality. This information also allows us to use the data as a performance indicator over the long run. So by the end of the project, we actually know which subcontractors were improving, which did not, and which actually got worse.

And another option was also to filter and create a detailed view by subcontractors. Again, this is more useful to understand what exactly the error is. We often use this in meetings, in workshops, or in the clinics that I've mentioned earlier.

We've also embedded the dashboard into our ACC Insight. So this is among the first things that everyone in the project will see when they enter ACC. So yeah, it's more about being transparent and about saying that there are certain issues that you have to address, and addressing them is very important. And yeah, you have to work on it.

Right, and some of the results. This is a quote from our client, who was with us throughout the journey, who witnessed the improvement, and who was also very supportive. They shared how it supported their plan to further use the models that we have created, and the information that we are compiling into the models, in their FM platform. So now you understand how severe it is to have information that is not relevant, and how important it is to have updated and relevant information tagged in the models.

I'm going to give a very quick summary of the solution that we've discussed-- the solution that we have been practicing and, of course, the solution that Martin showed. In general, from our perspective, AVT can be used in two different workflows. If you would like to keep it simple-- you're a basic user, and you're happy to just have the tool remove the manual work and reduce the time being spent to QC models-- you can use the first workflow.

It consists of using AVT or the Model Checker for manual or automated scheduled checks. You can rely on the standard or preset check sets, which are pretty good, to be honest-- we were relying on them very early on. And you can choose to view the report either online or in the Model Checker, or download it and use it in Excel as you wish, to carry out any sort of correction or to convey this information to the relevant parties to do it.

The second workflow, I would say, is slightly more detailed. It would, again, involve carrying out the checks on AVT or the Model Checker for manual or automated scheduled checks. You can use presets, or you can now take it a step higher and create custom check sets-- either you take a standard check set and customize it, or you build one from scratch, as Martin has shared.

Then, next, you can view the reports or download them. But you can also take it another step, where you use the available templates that have been shared by Autodesk to view these reports in Power BI-- of course, better visualization, which means better understanding for the people or stakeholders who are involved.

But now, let's say you want to harness the data further, and you also have a large number of models to review and monitor, so you don't have time to go through them one by one. What you do is what we did: you use the Connect recipe and the webhook to pull the data from the ACC environment to your very own domain, as shown earlier. And you can use the data to create a custom dashboard that suits your needs the most. It can be any sort of dashboard-- it's all up to how you manipulate and display the data. And yeah, that pretty much is the summary of how you can use AVT and how you can capitalize on AVT's capabilities.

Now, to recap our overall presentation today, it all started with a challenge, which was that the Revit models provided to us by the subcontractors did not meet the expectations or the requirements. This then led to extensive revalidation effort by our BIM team, and there was no capability to validate specific parameters and their values automatically. Of course, this could be done manually, but it was consuming a lot of time.

So, the solution. We started using AVT's basic functions to QC models. We were using the preset check sets, downloading the reports, and then sharing them with the relevant parties to fix. And all we wanted was to automate this and to customize the checks that we could do.

We had consultation and support sessions with Autodesk, and we had coaching sessions with the entire team. We then further collaborated with Autodesk, which allowed the creation of a fully automated solution that validates the Revit models that have been published and then supports customized dashboards.

We also developed custom check sets as per our specific requirements, which I shared earlier. The Power BI dashboards were set up to visualize the data validation results, and we also embedded them into Insights. And the result: with AVT, we were able to detect the model status in one go for all subcontractors and for all the models-- that's the most important part of it. And these custom check sets actually allowed validation of specific parameters within Revit models, which is what we couldn't do earlier and had been doing manually.

And the results are now visualized in a custom Power BI dashboard, as shown earlier, and communicated to all subcontractors. All the identified issues are made known to everyone who's involved. Yeah, and this helped expedite the resolutions.

We also had real-time access to project metrics in the dashboard in ACC Insight, like I said. We were able to ensure model integrity now meets what was expected in the project. And this scalable solution also allows us to maintain and deploy the workflow on any other projects.

Now, I'm going to share an approximate return on investment from the AVT implementation in Project Bird. Bear in mind, this was done with a certain level of assumption. Right, so we cataloged over 2,000 issues, and we priced each and every issue at, say, about $1,000 on average per issue. We then estimated over 70% savings, which brought us a saving of $1.4 million USD.

We reduced manual labor for the reasons I mentioned earlier, and we saved a lot-- estimated at between 10,000 and 20,000 hours, which comes up to about $1 million US dollars. And with delays and inefficiencies estimated at, say, $10 million, we were modestly saving, say, 20% of that, which was about $2 million. That brings the total estimated savings to approximately $4 million US dollars.
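
Restating the figures just quoted as a simple calculation (only the numbers given in the talk are used; the final rounding to roughly $4 million is the speakers' own):

```python
# The figures quoted above, combined into the total estimated saving.
issue_savings = 2_000 * 1_000 * 0.70   # ~2,000 issues x $1,000 x 70% = $1.4M
manual_effort_savings = 1_000_000      # 10,000-20,000 hours, stated as ~$1M
delay_savings = 10_000_000 * 0.20      # 20% of an estimated $10M = $2M

total = issue_savings + manual_effort_savings + delay_savings
print(total)  # 4,400,000 -> presented as approximately $4 million
```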

So this is, of course, an estimated ROI with a few assumptions here and there. But we look forward to you trying it, and maybe capturing a more accurate ROI. And hopefully you could then share it with us one day.

I'm going to summarize by sharing the future of Autodesk Validation Tool at Exyte. What is our future plan? We're looking at all upcoming projects in the Southeast Asia region to first adopt AVT as a standard, with plans for a global rollout. The AVT system, again, is designed to be scalable, so it allows seamless implementation across projects in one go. And we actually have a standardized and well-documented AVT implementation. So for us, it's almost as simple as a plug-and-play solution that can be integrated into any project.

But believe me, using AVT is very straightforward. At the end of the day, it will be a plug-and-play solution whether or not you have a standard-- though of course, it's always good to have one. And AVT will now be utilized across all our data center projects. This aligns with the demand for high-speed project delivery. So it's not only for complex projects, or projects with a high number of trade partners or a high number of models-- it also becomes very useful for projects that are expedited.

As all of you may know, data center projects are always accelerated, fast-paced projects. So we believe AVT fits perfectly in this sort of project. And of course, we will not stop customizing what we can check and how we can exploit the data that we can get from AVT.

Right, and that wraps it up. Thank you again for watching this, and I hope you guys learned something useful. We look forward to you implementing AVT in your very own workspace, your very own projects. And reach out to us if you have any questions or queries-- we will help however we can. Thank you.
