Description
Key Learnings
- Discover how Autodesk Validation Tool automates model validation and quality assurance, minimizes errors, and accelerates planning processes.
- Learn how Exyte successfully uses Autodesk Validation Tool to ensure the accuracy and efficiency of project implementation right from the start.
- Learn more about Autodesk Validation Tool software's real-time dashboards, which provide a full overview of model integrity in megaprojects.
Speakers
- Tineshwaran Sellathoroe: Tinesh is a dedicated BIM enthusiast with over a decade of experience in the AEC industry. He currently serves as the BIM Regional Head at Exyte Malaysia and holds the position of BIM Lead for Exyte's Data Center (DTC) business unit in the South East Asia (SEA) region. With a profound passion for design, construction, and technology, Tinesh is recognized for his pioneering role in digital transformation, advocating innovative approaches in the industry. He began his career at one of Malaysia's largest construction firms, where he successfully developed and implemented a visual-based planning platform. This platform streamlined daily lean job cards and comprehensive reporting, marking early achievements in his career. Tinesh later advanced to champion virtual building performance analysis applications at an MEP consulting firm, significantly enhancing project efficiencies. There, he also developed a centralized repository for aggregating and managing engineering and design data from past projects, leveraging it to define standards and act as a dynamic reference tool for upcoming projects. In his current role, Tinesh has led BIM standardization across multiple high-profile projects for renowned clients. Notably, he has successfully overseen and delivered the BIM scope for a cloud/hyperscale data center project. He spearheads efforts in digital standardization and implementation across Exyte Malaysia and the Data Center (DTC) business unit in the SEA region. Tinesh is a chartered Professional Technologist in Malaysia and holds numerous certifications relevant to BIM. His expertise spans diverse project types, including residential, commercial, data centers, and specialized high-tech facilities. In summary, Tinesh brings extensive experience across the entire BIM lifecycle, from space planning (pre-tender) to facility management (post-construction). His strong skill set and passion for driving digital transformation underscore his commitment to advancing industry practices.
- Martin Langolf: Technical Account Specialist for the EMEA region, working at Autodesk for over 9 years. Certified for the 2011 AutoCAD Product Family, with over 10 years of professional experience in the Architectural and Engineering area and many projects throughout Northern Europe. My main goal is to ensure that the work is completed to the highest possible standards and to all parties' satisfaction, no matter what the challenges are! Specialties - English/German and Russian technical support, challenging difficult situations. Product knowledge - Autodesk Live, Revit, Navisworks (basic), AutoCAD Architecture, AutoCAD MEP, AutoCAD, AutoCAD LT, DWG TrueView, Design Review, Licensing, Installation, Deployment, Windows, Mac OS, Customer Service, Customer Success, Presentations, Webinars, Engineering, Architecture, CAD, Software, Hardware
TINESHWARAN SELLATHOROE: Hello, guys. Welcome to our case study sharing. We are looking at automating and improving Revit model quality with Autodesk Validation Tool. Now, this is going to be by myself, Tinesh from Exyte, and my good friend, Martin, who's from Autodesk.
A quick introduction about myself. I'm Tinesh. I am from Malaysia, and I am the BIM Regional Head at Exyte Malaysia. I also lead the BIM efforts in Exyte's data center business unit for the Southeast Asia region. I have over 10 years of experience in the AEC industry. Currently, I am focused on digital standardization, R&D, and, of course, the implementation of BIM across Exyte Malaysia, again, more focused on data centers currently.
I've done various types of projects throughout my career, ranging from residential, commercial, and high-tech facilities to data centers. I've experienced a large part of the BIM lifecycle from various stakeholder perspectives, all the way from space planning to facility management. So that's pretty much about myself. I'll pass it to Martin to introduce himself.
MARTIN LANGOLF: Thank you, Tinesh. Hello, everyone. My name is Martin Langolf. I'm a technical account specialist, working for over 12 years at Autodesk, based in our Munich office in Germany. And my main focus is to help our EBA customers adopt our AEC solutions and be much more efficient in their daily business. Thank you, Tinesh. Back over to you.
TINESHWARAN SELLATHOROE: Thanks, Martin. Right, a quick look at the agenda for today. We are going to look at the introduction of the company where I'm from, Exyte, and a specific project, which made us look at how we can improve the way we do things. From that project, we will talk about issues we encountered and how we solved it. We will, of course, talk about the solution itself, which is Autodesk Validation Tool. Martin will then talk about the check sets and custom solution that was developed. And I'll top it off by talking about the result and summary.
Right, about my company, Exyte. So at Exyte, we believe that we create a better future by delivering high-tech facilities, which then enable our clients to enhance the quality of modern life with their products and services. So this is us. We have over 100 years of history. We recorded over 7.4 billion euros of sales in 2022. We are in over 20 countries worldwide, and we have about 9,700 employees worldwide. This was recorded last year.
And this is our German engineering heritage. Over the past 100 years: we started off as M+W in 1912, and we were rebranded as Exyte in 2018. And ever since, we have never stopped growing. Exyte worldwide: we are here in the US. We are in continental Europe; we are headquartered in Stuttgart. We are also in Northern Europe and Israel; the headquarters there is in Ireland. We are also in India. We are in Northeast Asia, with a lead office in Shanghai. And, of course, we are in Southeast Asia; the lead office is in Singapore. And, of course, there is where I'm from, Exyte Malaysia.
Now, let's get to the business. Have you been in this situation? Have you had a large number of models and revisions, maybe uncommonly large? You have too many cooks in the kitchen, which is often the case. But maybe when the project is large, you just can't help but have a large number of trade partners. You have stringent requirements from your client-- I would say a smart client, which is very common these days, which is good because then, in the project, we know where we are heading. But it could be bad when they just want everything. And lastly, have you been facing frequent quality issues with your models?
Now, we faced all the issues that I mentioned earlier in this particular project, which is codenamed Project Bird. So I'm going to give you a quick introduction to Project Bird just to share the immense scale of it. So this Project Bird, it's a high-tech facility construction project.
This is the first overseas facility for advanced 3D chip packaging. It is one of the largest chip packaging facilities in the world, and it is the largest one in Malaysia so far. This particular project was valued at over $7 billion, or 30 billion Malaysian ringgit. We had over 150 trade partners in this project during the peak, and we also had over 700 models in this project. And all these models were coming from our suppliers, subcontractors, nominated subcontractors, and so on.
The issues that we faced. As I did mention, the project involved numerous parties, a large number of them. And of course, that means a large number of models, which caused us complicated coordination and quality management. Some of our trade partners or subcontractors were using and relying on downloaded content, which was contributing to the inconsistency. And it was proving to be a risk, given that the client was actually looking to take these models all the way to their facility management platform.
Our BIM team in the project, believe it or not, despite being big, was shorthanded due to the number of models and trade partners to manage. And as we were looking into many things to do-- in terms of completing the construction model, ensuring clash coordination was being carried out, ensuring shop drawing production was on time, and many more things-- we were actually lacking resources to implement a comprehensive quality control measure. And again, without a quality control measure, there is no quality control gate. And we had all these creeping quality issues, which were visible all the way to the client.
So at the early stage of the project, we were swamped with excessive quality issues. And it was proving to be a challenge to even identify, monitor, and rectify these issues. So we were at a stage where our models had data integrity problems. The models were carrying data-- informational data that was inaccurate and outdated. And then, of course, all this led to declining trust in BIM as a single source of truth. And these issues were undermining stakeholder confidence in Exyte.
So we identified a solution which we hoped would help us solve the issues we were facing, because we wanted something that could help us without requiring much effort from the team-- something that could address the situation that we were in and help to resolve the issues that were ongoing. So we explored, and we looked at Autodesk Validation Tool as our solution.
Now, a quick introduction of what Autodesk Validation Tool, or AVT, is, and what its capabilities are. So AVT is a fully automated cloud solution. It takes advantage of cloud computing power to analyze Revit models very quickly, as per your requirements. This mainly helps to check the informational data of your model.
In AVT, you have customizable or preset check sets. So check sets are basically what you are validating your models against. This can be against a preset check set which is already in the system, the standard one that you can just use. Or you can further customize these checks and then decide what you want to validate your model against. So Martin is going to explain this in detail later.
Now, with AVT, you get to speed up your model error resolution process. Can you just imagine you're opening up a model, reviewing it, and then manually checking it for some informational data accuracy, eventually saving it, and then closing it? It could be quick if the model is very small. But what happens if the model is large? And what happens if you have a large number of models to do this? So you know that this consumes a lot of time. So we used AVT to cut this down, where we automated the checks we wanted. And we also use the data that was generated for visualization on custom dashboards. We're going to look into this later.
And when we used AVT, we had a sort of assurance of the models' performance and integrity. We knew that whichever models passed the checks were reliable and good to go. And for whichever didn't, we knew exactly what was wrong and what needed to be corrected.
And lastly-- you're going to hear this a lot-- AVT is a very scalable solution. We used it on a very large project, and we know this would function the same no matter what the project size is and no matter how many different types of projects you've had. You set a template, and then you roll it out to all the projects that you want in one go. Now, let's take a step back for me to explain to you why we decided to implement AVT, why we thought AVT would be able to solve the issues we had, and what AVT was offering beyond that.
Now, to start off, AVT was repeatable. It is a fully automated workflow in the cloud. For the reasons I mentioned, and due to the large number of models, we wanted a solution that would do the work for us. AVT is scalable. We saw AVT as our quality control gate. We wanted a solution that would allow us to monitor the project metrics.
We wanted to use the data to look at trends over time, and we wanted to know if the issues were on a downward trend now that they were being addressed and communicated. We wanted to increase the model performance, again going back to solving the issues we faced. And, of course, to save time and money, as we cut down the time needed for our resources to carry out manual jobs and as we identified errors ahead of time. So instead of hiring another person to purely carry out the QC work, we decided to just hire AVT.
Now, we had our specific requirements for the models themselves. We wanted to check very specific things. Initially, we were using the preset or standard check sets that came with AVT, the common ones, which are called Revit Best Practices-- Martin will explain these to you. These were the ones that we were using to validate our models against.
But as we were seeing how much this was helping us, we were curious to see if the checks could be done against our specific requirements. And we believed these were some key areas which required attention. So I'm just going to quickly go through a few.
First was the file naming standard, down to the basics. We wanted to know if the models were being named correctly. As basic as you think this may be, you all would know how important this is. We had a naming convention, which was complex to a certain extent, to identify the models. And handling a large number of models, it was never easy to check them all one by one.
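To make that concrete, here is a minimal sketch of what a file naming check boils down to. The naming convention in the pattern is purely hypothetical (the actual Project Bird convention is not shown in this class), and on the project this kind of rule lived in an AVT check set rather than a script.

```python
import re

# Hypothetical naming convention, for illustration only -- the real project
# convention is not shown here. Assume model files are named like:
#   EXY-BRD-ME-B01-L03-M3D-0001.rvt
NAMING_PATTERN = re.compile(
    r"^[A-Z]{3}-[A-Z0-9]{3}-[A-Z]{2}-[A-Z0-9]{3}-[A-Z0-9]{3}-[A-Z0-9]{3}-\d{4}\.rvt$"
)

def check_model_names(file_names):
    """Split a batch of Revit model file names into compliant and non-compliant."""
    passed, failed = [], []
    for name in file_names:
        (passed if NAMING_PATTERN.match(name) else failed).append(name)
    return passed, failed

if __name__ == "__main__":
    ok, bad = check_model_names(
        ["EXY-BRD-ME-B01-L03-M3D-0001.rvt", "mechanical_model_final_v2.rvt"]
    )
    print("Compliant:", ok)
    print("Non-compliant:", bad)
```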
Location-- project location settings. As we had various stakeholders from different parts of the world, despite all sorts of information and templates being provided, this was still an issue. So we wanted to make sure that the project control points were accurate, as our shop drawings and construction drawings were actually being produced from the models, and the coordinates were being generated from the models. Hence, it was very crucial for us to have this correct.
And of course, there were many more project-relevant metrics and parameters that we wanted to check. Some include the model composition: imported contents, in-place families, any sort of 2D contents. And we wanted to check the view composition. We wanted to know the number of views that are there but not on sheets, and where they are. Then there are the governance aspects of it: we wanted to look at the errors, the warnings, whether you can purge the model, and even family-related information such as names, system types, and others.
And lastly, our requirement was, again, to have something that would be able to check all our models-- all 700 models, across all our subcontractors-- and, most importantly, monitor them in terms of their model status trend.
So this is what we needed from Autodesk Validation Tool. We wanted to carry out model reviews-- some very specific checks, like I shared earlier. And we wanted this to be carried out at a very fixed frequency, automatically. We didn't want to prompt the checks manually, and we didn't want to have to go through every single model of ours and then prompt a check. And, of course, we wanted this to be done for all our models.
We knew the standard check sets, and we were using them. But after using them, we had the need for more. We wanted to customize them, and we wanted to learn how to customize them. We also wanted to generate reports. We wanted these reports to be centralized and accessible to all relevant stakeholders. But we also wanted a live dashboard to monitor the subcontractors' performance and to understand the model health, all from the data and the information being generated by AVT.
So we wanted to bring this data out from the reports and put it on a live dashboard that's being updated frequently. And of course, we reached out to Autodesk for technical support and consultation. Now, I'm going to quickly touch on some basic practice with AVT, and then I'll pass it to Martin to share some customization and integration that was done in Exyte.
So the AVT interface can be accessed online from avt.com. Bear in mind, you first have to authorize the app from your CDE, BIM 360, or ACC, which allows you to view the models stored there in the AVT interface. From there, you don't have to download the models; you don't even need to have Revit. And you can run the checks on the models that you want from the selected CDE.
Again, here in AVT, you can schedule checks. You can do it now, you can do it on a certain interval, or you can do it as someone publishes the model. In this platform, you are also able to download the reports directly or store them in your CDE.
So the check sets that are available here, you can customize and create in this interface. And they will also be available in the Model Checker interface, which I'm going to show next. You will have the same library of checks from Model Checker. There are some sample Power BI templates which you can also download and use to visualize the data that you download from here. So if you think the report is boring, just take the information, use the Power BI template, and translate it into something that's more interesting.
And again, this allows you to run checks for all models at scale. And in a way, this is actually a very summarized way of using AVT. There are many resources online on how you can do this. So we can always have a separate course or session if you need to implement and use this solution at your workplace.
While the usage of AVT is mainly in the cloud interface, as I said, there is also a Model Checker plugin under the BIM Interoperability Tools, which allows you to carry out similar actions to what you could do from AVT online. So from this plugin, you can, again, edit a check set, customize it, or create a check set from scratch using the Configurator. It's an easy wizard, which helps you through the process. Martin is going to explain this in detail after this.
And from here, you can also, again, automate and schedule the checks, similar to what you can do in AVT online. And once you have generated a report from a check, be it a manual or scheduled check, you can view the reports, or you can share them and let the owner of the model view the report locally by launching Revit, running the Model Checker plugin, and opening the shared report.
So this plugin then shares the report-- or certificate, as we call it-- of the model, detailing what check was done and what the errors were, and letting you view the relevant element that is tied to each error. So you could choose to either do it on the online interface that I showed previously or via the Model Checker plugin. It is similar, and it offers the same functionality.
As mentioned in the previous slide, the reports generated are shareable. You can download a file in the AVT format and send it to anyone who could be working on the model, and they can open the Model Checker plugin and view the report. This is very useful, as they can clearly view which elements require attention, and they can work on the correction based on that.
So apart from having a dashboard which tells them, "Your model needs correction," this report actually tells them what needs to be corrected. Of course, viewing the model itself-- opening the model and seeing the affected element-- is something you can't do on AVT online, but you can do it in Model Checker, where it becomes very useful. And trust me, from our experience, this really helps in having clear communication with your trade partners.
So that summarizes my part for now. I'm just going to pass it to Martin to show how we customized this solution further.
MARTIN LANGOLF: Thank you, Tinesh. So yes, as Tinesh already mentioned, the out-of-the-box check sets were not sufficient for the use case that Exyte had on this project. So we had to customize them even further. And based on the requirements that we received from Exyte, we helped them achieve certain customizations-- to identify certain parameters that may not have a value set. And of course, we were also able to customize or integrate these custom check sets, and then visualize the results in reports that can be distributed to all the stakeholders involved, to all your designers.
And to easily identify the elements that have this issue or the missing parameter values: by simply selecting one from the report-- the little magnifying glass-- you will be zoomed in to the right view within Revit, and the affected element will already be pre-selected. So it is much, much easier for the end user to correct the issue in the model. Next slide, please.
Also, AVT in practice. By default, as well as inside the Model Checker, we are able to output two main file formats. One is the HTML report, which is very similar to the report that can be distributed via the AVT file. The other is the Excel report, which can be used to feed a Power BI dashboard in order to visualize all those results. But you can, of course, also do your analysis within Excel directly.
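As a quick illustration of working with the Excel export directly, here is a minimal sketch using pandas. The column names used below (Check Name, Result) are assumptions for illustration only; inspect your actual AVT export to confirm its layout.

```python
import pandas as pd

# Minimal sketch of consuming an AVT Excel export outside Power BI.
# Column names are assumed for illustration; the real export may differ.
report = pd.read_excel("avt_report.xlsx")

# List only the checks that did not pass, as a quick triage view.
failed = report[report["Result"] != "Pass"]
print(failed[["Check Name", "Result"]].to_string(index=False))
```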
So out of the box, we already provide, with AVT, a large number of predefined check sets-- for example, the Revit best practices that I will touch on in just a moment-- and already pre-configured Power BI dashboards that can help you visualize those reports. So you can have a very easy and understandable overview of the problematic areas, and in the process of correcting those areas within your Revit models, you would be much more efficient. So, we can go on.
So let's speak about the check sets. A check set is basically an XML file that is used to create the rules that you are validating your models against.
So here, we can see a large list of the predefined and publicly available check sets. This is something that we provide with the Autodesk Validation Tool, and we are offering over 30 different check sets that you can basically use out of the box, so you can get started with AVT right away. So if we look at the example check set of Revit best practices-- this particular check set focuses on identifying key areas, like the model performance and any potential issues with it. So let's say the number of warnings or the purgeable elements.
You can also identify information about rooms, spaces, and areas within your Revit model-- for example, how many of each exist. You are also able to check any view-related information: how many views exist within your model, whether there are any schedules, sheets, or scope boxes created, and so on. And you can also validate element information: are there any elements placed on the wrong worksets, are there any duplicates present within your model, and this type of information.
So all this can be validated with the Revit best practices check set right away. So if you are looking to get started with it, you may use this check set right out of the box and validate your models for this information.
And also, if you would like to create your own check sets, the Configurator within the Interoperability Tools offers you different options. So you can use the wizard, which is basically an option that will guide you through the process of creating a custom check. And you can use this particular example if, for instance, you would like to validate the custom parameter Fire Rating within doors, and you would also like to validate its value. In this example, we are using the value 30.
And at the same time, if you would like to compare it to the host element, you would be able to include-- by following the step-by-step guide-- this additional condition in your check set, to also validate the Fire Rating parameter within the host element. And if those values do not match, it will show up in the report as a result-- positive, negative, or even neutral, like a simple list of results. If you have multiple issues identified, that can, again, be easily distributed to your end users and corrected within the model.
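To make the logic of that rule concrete, here is a small illustrative sketch in Python of what the check expresses. This is not the Configurator's XML and not the Revit API; the dictionaries simply stand in for a door, its host wall, and their Fire Rating parameters.

```python
# Illustrative only: stand-in data, not the Revit API or the Configurator's XML.
# The rule: a door's Fire Rating must be "30" and must match its host wall's value.
def check_fire_rating(door, host_wall, required="30"):
    door_rating = door.get("Fire Rating")
    host_rating = host_wall.get("Fire Rating")
    if door_rating != required:
        return f"FAIL: door {door['id']} Fire Rating is {door_rating!r}, expected {required!r}"
    if door_rating != host_rating:
        return f"FAIL: door {door['id']} does not match host wall rating {host_rating!r}"
    return f"PASS: door {door['id']}"

samples = [
    ({"id": "D-101", "Fire Rating": "30"}, {"id": "W-01", "Fire Rating": "30"}),
    ({"id": "D-102", "Fire Rating": "60"}, {"id": "W-02", "Fire Rating": "30"}),
]
for door, wall in samples:
    print(check_fire_rating(door, wall))
```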
And for the more advanced users, who have potentially already created some custom check sets, you can also use the Advanced Check Builder, which allows you to work with operators, criteria, properties, and different conditions in order to build a check rule that can help you validate certain information.
So in this particular example, we would be validating a custom parameter called-- let's call it Catalog Item Number. And we want to validate its value, that it is set to a specific custom code. So here, you have a little snippet of a custom code that is based on a naming convention.
And if the parameter value is set to a value that doesn't correspond to this custom code, it will basically generate a negative result that will affect your Revit model health score. And via the report, you can, again, distribute those results to your end users, who can easily identify the elements with parameters set to wrong values and easily correct them within the Revit model.
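As an illustration of that kind of value check, here is a minimal sketch. The Catalog Item Number pattern is hypothetical (the project's actual custom code is not shown here), and the simple pass rate at the end only stands in for a health score-- it is not AVT's actual scoring formula.

```python
import re

# Hypothetical pattern for "Catalog Item Number" values, for illustration only.
# Assume valid values look like "CAT-ME-0042-A".
CODE_PATTERN = re.compile(r"^CAT-[A-Z]{2}-\d{4}-[A-Z]$")

elements = [
    {"id": 1001, "Catalog Item Number": "CAT-ME-0042-A"},
    {"id": 1002, "Catalog Item Number": "misc item 42"},
    {"id": 1003, "Catalog Item Number": ""},
]

failures = [e for e in elements if not CODE_PATTERN.match(e.get("Catalog Item Number", ""))]

# A simple pass rate, standing in for a model health score (not AVT's formula).
score = 100 * (len(elements) - len(failures)) / len(elements)
print(f"Health score: {score:.0f}%")
for e in failures:
    print(f"Element {e['id']}: invalid Catalog Item Number {e['Catalog Item Number']!r}")
```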
So at Exyte, we had to develop a custom solution in order to deal with the very large number of frequent Revit model updates, and also with the large number of reports that were generated by AVT. For this, we actually developed, in the customer domain, a fully automated solution where all the different project stakeholders frequently update the Revit models in the Autodesk Construction Cloud-- the CDE where all the project-relevant data was hosted. And every time a Revit model update occurred, AVT would automatically become active and validate those models against the custom check sets that were developed for Exyte.
So once the models have been validated, this of course generates a very large number of different reports. And in order to deal with this amount of reports, we actually had to integrate those reports into Microsoft Azure Data Factory using an ACC Connect recipe, where we were able to connect to the AVT webhook and transfer all those reports into the Azure Data Factory.
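As a purely illustrative sketch of that hand-off, here is what a minimal "report ready" receiver could look like. The real integration used an ACC Connect recipe rather than custom code, and the endpoint path and payload fields below are assumptions for illustration, not the AVT webhook's actual schema.

```python
# Illustrative sketch only: receive a "report ready" notification over HTTP and
# land the report in a folder for the downstream data pipeline to pick up.
# The endpoint path and payload fields ("reportId", "downloadUrl") are assumed.
import json
from pathlib import Path

import requests
from flask import Flask, request

app = Flask(__name__)
LANDING_ZONE = Path("landing_zone")
LANDING_ZONE.mkdir(exist_ok=True)

@app.post("/avt-report-ready")
def on_report_ready():
    payload = request.get_json(force=True)
    report_id = payload.get("reportId", "unknown")
    download_url = payload.get("downloadUrl")
    if download_url:
        data = requests.get(download_url, timeout=60).content
        (LANDING_ZONE / f"{report_id}.xlsx").write_bytes(data)
    # Keep the raw notification alongside the report for traceability.
    (LANDING_ZONE / f"{report_id}.json").write_text(json.dumps(payload, indent=2))
    return {"status": "received"}, 200

if __name__ == "__main__":
    app.run(port=8080)
```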
In the Azure Data Factory, the reports have been manipulated, modified in order to create a single report that can be fed into the Power BI dashboard. And this Power BI dashboard can also be integrated back into the Autodesk Construction Cloud in Insights. And every stakeholder would be able to use this dashboard in order to identify the problematic areas.
At the same time, they could consume the AVT file inside Revit and interact with that report directly within Revit. And of course, as more model updates occur, the Autodesk Validation Tool re-evaluates all those models. And basically, we are closing the cycle of updating the existing reports, feeding Power BI with the new information, and making it available on the project for all the stakeholders. So this was basically the quality gate that allowed us to proceed and deal with this large amount of information in order to speed up the construction process.
So how did we do it within the customer domain? So we used the Azure environment in order to deal with all the incoming Excel sheets, which are stored in the customer's storage account. And we used the Data Studio in order to move all this data into SQL tables and to generate a single report that can then be used in Power BI in order to publish all this information and make it available for all the different stakeholders.
So we have used the data sets in the Data Studio of Azure. There, you can basically define the data sets that you need, and you are able to manage many different file formats and file types. You are also able to create a data flow that allows you to perform certain manipulations of the data, like joining the data, splitting it up, or even aggregating it, by using the many schemas that are available.
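Conceptually, the data flow does something like the following sketch, shown here with pandas purely for illustration. The column names (Model, Check Name, Result) are assumptions, and the real pipeline ran inside Azure Data Factory rather than as a script.

```python
from pathlib import Path

import pandas as pd

# Conceptual stand-in for the Data Factory data flow: take the many per-model
# Excel reports and aggregate them into one dashboard-ready table.
frames = []
for report_file in Path("landing_zone").glob("*.xlsx"):
    df = pd.read_excel(report_file)
    df["Model"] = report_file.stem  # tag each row with the model it came from
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# One row per model and check, counting failures -- the kind of single,
# consolidated table that was fed into Power BI.
summary = (
    combined[combined["Result"] == "Fail"]
    .groupby(["Model", "Check Name"])
    .size()
    .reset_index(name="Failures")
)
summary.to_csv("combined_report.csv", index=False)
```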
And you would also be able to define a pipeline that can be used with the different data sets and data flows, and that provides you many possible actions, loops, and decision possibilities that are very useful for working with a large amount of data. You can also set up triggers, so the pipeline would run based on different events-- time-based events, or even storage changes. Also, when you are developing a solution like this, you can trigger it on demand in order to test your solution and see what is currently in the pipeline.
And after the data, or after your pipeline or your workflow has been created, you, of course, are also able to monitor what is happening with the pipeline. This can also help you to ease up any analysis that you may need to perform when errors occur with the data. So this is a very useful and helpful environment in order to deal with a large amount of data.
And the result of all this was basically the custom-developed dashboard-- Tinesh will speak more about this in just a moment-- that allowed everyone to visualize this data and have, basically, calls to action generated in order to address the problematic areas that were identified. And, Tinesh, back over to you to speak more about the efforts and results that you are getting from this solution.
TINESHWARAN SELLATHOROE: All right. Thanks, Martin. Now, allow me to just quickly run through some of the results of our effort. So we generated dashboards, as Martin mentioned, using Power BI. This is a very simplified version for the sake of this sharing.
The purpose was to create a dashboard that allows us to filter the results of checks on a weekly basis, or on the interval that we select, by supplier. So this helps us to understand the state of the model-- if the version that has been shared or published is any good. And of course, we also get to do a sort of benchmarking between contractors, or between the trade partners who are involved. Yeah, we are able to review and monitor each and every subcontractor involved in the project.
So we were also looking at trends, to see if the issues are on a downward trend now that they are being communicated, or if they're not being addressed and the issues are increasing. That then allows us to create specific clinic sessions with particular subcontractors, to understand what the issue is and to work together on ensuring the model has the proper quality. And this information also allows us to use the data as a performance indicator over the long run. So by the end of the project, we actually know which subcontractors were improving, which did not, and which actually got worse.
And another option was also to filter and create a detailed view by subcontractors. Again, this is more useful to understand what exactly the error is. We often use this in meetings, in workshops, or in the clinics that I've mentioned earlier.
We've also embedded the dashboard into our ACC Insight. So these are among the first things that everyone in the project will see when they enter ACC. So yeah, it's more about being transparent and more about saying that there are certain issues that you have to address, and addressing them is very important. And yeah, you have to work on it.
Right, and some of the results. This is a quote from our client, who was with us throughout the journey, who witnessed the improvement, and who was also very supportive. They were sharing how it supported their plan to further use the models that we created-- and the information that we compiled into them-- in their FM platform. So now you understand how severe it is to have information that is not relevant, and how important it is to have updated and relevant information tagged in the models.
I'm going to give a very quick summary of the solution that we've discussed-- the solution that we were practicing and, of course, the solution that Martin showed. In general, from our perspective, AVT can be used in two different workflows. If you would like to keep it simple-- you're a basic user, and you're happy to just have the tool remove the manual work and reduce the time being spent on QC-ing models-- you can use the first workflow.
It consists of using AVT or Model Checker for manual or automated scheduled checks. You can rely on the standard or preset check sets, which are pretty good, to be honest; we were relying on those very early on. And you can choose to view the report either online or in the Model Checker, or download it and use it in Excel as you wish, to carry out any sort of correction or to convey this information down to the relevant parties to do it.
So the second workflow, I would say, is slightly more detailed. It would, again, be carrying out the checks on AVT or on Model Checker for manual or automated scheduled checks. You can use presets, or you can now take it a step higher and create custom check sets-- either you take a standard check set and customize it, or you build it from scratch, as Martin has shared.
Then, next, you can view the reports or download them. But you can also take it another step, where you use the available templates that have been shared by Autodesk to view these reports in Power BI-- of course, a better visualization, which means better understanding for the people or stakeholders involved.
But now, let's say you want to harness the data further, and you also have a large number of models to review and monitor, so you don't have time to go through them one by one. What you do is what we did: you use the Connect recipe and the webhook to pull the data from the ACC environment into your very own domain, as shown earlier. And you can use the data to create a custom dashboard that suits your needs the most. It can be any sort of dashboard; it's all up to how you're going to manipulate and display the data. And yeah, that pretty much is the summary of how you can use AVT and how you can capitalize on AVT's capability.
Now, to recap our overall presentation today: it all started with a challenge, which was that the Revit models provided to us by the subcontractors did not meet the expectations or the requirements. This then led to extensive revalidation effort by our BIM team, and there was no capability to validate specific parameters and their values automatically. Of course, this could be done manually, but it was consuming a lot of time.
So, the solution: we started using AVT's basic functions to QC models. We were using the preset check sets, we were downloading the reports, and then we were sharing them with the relevant parties to fix. And all we wanted was to automate this, and we wanted to customize the checks that we could do.
We had consultation and support sessions with Autodesk, and we had coaching sessions with the entire team. We then further collaborated with Autodesk, which allowed the creation of a fully automated solution that validates the Revit models that have been published and then feeds customized dashboards.
So then we also developed custom check sets as per our specific requirements, which I shared earlier. The Power BI dashboards were set up to visualize the data validation results, and we also embedded them into ACC Insight. And the result: with AVT, we were able to detect the model status in one go for all subcontractors and for all the models. That's the most important part of it. And all these custom check sets actually allowed validation of specific parameters within Revit models-- this is what we couldn't do earlier, which we were doing manually.
And the results are now visualized in custom Power BI dashboards, as shown earlier, and are being communicated to all subcontractors. All these identified issues are being made known to everyone who's involved. Yeah, and this has helped to expedite the resolutions.
We also had real-time access to project metrics in the dashboard in ACC Insight, as well, like I said. We were able to ensure model integrity now meets what was expected in the project. And this scalable solution also allows us to maintain and deploy the workflow to any other projects.
Now, I'm going to share an approximate return on investment from the AVT implementation in Project Bird. Bear in mind, this was actually done with a certain level of assumption. Right, so we cataloged over 2,000 issues, and we priced each and every issue at, say, about $1,000 on average per issue. We then estimated over 70% savings, which brought us a saving of $1.4 million USD.
We reduced manual labor for the reasons I mentioned earlier, and we saved a lot-- estimated at between 10,000 and 20,000 hours, which comes up to about $1 million US dollars. And with estimated delay and inefficiency costs of, say, $10 million, we were modestly saving, say, 20% of that, which was about $2 million. That brings the total estimated savings to approximately $4 million US dollars.
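For reference, here is the arithmetic behind that estimate, restated with the figures just mentioned-- all of which are assumptions, as noted.

```python
# Restating the ROI estimate from the slide, using the stated assumptions.
issues = 2000
avg_cost_per_issue = 1_000                            # USD, assumed average per issue
rework_savings = issues * avg_cost_per_issue * 0.70   # ~70% of issue cost avoided -> $1.4M
labour_savings = 1_000_000                            # ~10,000-20,000 hours of manual QC, ~$1M
delay_savings = 10_000_000 * 0.20                     # 20% of ~$10M delay/inefficiency -> $2M

total = rework_savings + labour_savings + delay_savings
print(f"Rework savings:  ${rework_savings/1e6:.1f}M")
print(f"Labour savings:  ${labour_savings/1e6:.1f}M")
print(f"Delay savings:   ${delay_savings/1e6:.1f}M")
print(f"Total estimated: ~${total/1e6:.1f}M")          # quoted in the talk as roughly $4 million
```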
So this is, again-- of course, it's an estimated ROI with a few assumptions here and there. But we look forward to you trying it, and maybe capturing a more accurate ROI. And hopefully you could then share it with us one day.
I'm going to summarize by sharing the future of Autodesk Validation Tool in Exyte. What is our future plan? We're looking at all upcoming projects in the Southeast Asia region to first adopt AVT as a standard, with plans for a global rollout. The AVT system, again, is designed to be scalable, so it allows seamless implementation across projects in one go. And we actually have a standardized and well-documented AVT implementation. So for us, it's almost as simple as a plug-and-play solution that can be integrated into any project.
But believe me, using AVT is very straightforward. And at the end of the day, it will be a plug-and-play solution whether or not you have a standard-- though of course, it's always good to have one. AVT will now be utilized across all our data center projects. This aligns with the demand for high-speed project delivery. So it's not only for complex projects, or projects with a high number of trade partners or a high number of models; it also becomes very useful for a project that is expedited.
So, as all of you may know, data center projects are always accelerated, fast-paced projects. So we believe AVT fits in perfectly in this sort of project. And of course, we will not stop customizing what we can check and how we can exploit the data that we can get from AVT.
Right, and that wraps it up. Thank you again for watching this, and I hope you guys learned something useful. We look forward to you implementing AVT in your very own workspace, your very own projects. And reach out to us if you have any questions or queries, and we will help however we can. Thank you.