Description
Key Learnings
- Learn about single source of truth, its associated challenges, and the benefits of successful implementation.
- Learn how to implement best practices to create and maintain single source of truth using Autodesk Docs, Autodesk BIM Collaborate Pro, and Revit.
- Go beyond the usual single source of truth implementation by exploring the innovative data-compliance-focused approach.
- Learn about harnessing Autodesk Construction Cloud, Revit, APIs, and other tools to ensure data consistency and prevent BIM execution plan non-compliances.
Speaker
- Mateusz Lukasiewicz: Mateusz Lukasiewicz has over 12 years of experience in the AEC industry. Throughout his career, he has successfully led the digital delivery of large-scale projects and developed a number of modern digital engineering solutions by combining BIM expertise, computer programming skills, and project management principles. Mateusz plays a vital role in driving the company's clear vision towards achieving a leading digital-innovator position in the market and its long-term digital capability goals.
MATEUSZ LUKASIEWICZ: Hi, everyone. Thank you for attending Autodesk University 2022, and welcome to my class about Single Source of Truth and Data Consistency using Autodesk Construction Cloud, Revit, and API.
I will start by introducing myself. My name is Mateusz Lukasiewicz. I'm a Digital Projects Analyst at KEO International Consultants, based in Dubai, United Arab Emirates. I have over 12 years of experience delivering large-scale projects using BIM, computer programming, and project management principles.
Session introduction. This is the official description and learning objectives which also formulate the agenda for today's session.
We start with understanding the definition, importance, challenges, and benefits of a single source of truth. Let's begin by asking: why do we talk about a single source of truth and data consistency? Based on the Data Advantage in Construction report released by Autodesk in 2021, bad data is estimated to have cost the construction industry over $1.84 trillion in 2020. The number is an estimate only; however, it shows that the construction industry is facing a significant problem with data quality.
Now that we understand the problem and the reason why data quality and a single source of truth are important, let's talk about definitions. I have identified two main descriptions that can help us understand the concept. The first, in plain language: the same approved data used by all stakeholders to make informed decisions. This is why we implement ISO 19650 standards, with work-in-progress, shared, and published areas in a common data environment to control data exchange.
The second, technical definition can be summarized as the ability to update by reference: a single update affects all references to the object. For example, a change applied to a Revit model element, let's say wall thickness, affects the associated views, schedules, volumetric data, and tags.
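The update-by-reference idea can be illustrated with a minimal Python sketch. The `WallElement` class and the view dictionaries below are hypothetical stand-ins for illustration only, not Revit API objects:

```python
# Minimal illustration of "update by reference": one shared element object
# is referenced by several "views"; a single update is visible everywhere.

class WallElement:
    def __init__(self, thickness_mm):
        self.thickness_mm = thickness_mm

# A single source element...
wall = WallElement(thickness_mm=200)

# ...referenced (not copied) by a plan view, a schedule row, and a tag.
plan_view = {"element": wall}
schedule_row = {"element": wall}
tag = {"element": wall}

# One update to the element...
wall.thickness_mm = 250

# ...is immediately reflected in every reference.
assert plan_view["element"].thickness_mm == 250
assert schedule_row["element"].thickness_mm == 250
assert tag["element"].thickness_mm == 250
```

The key point is that the views hold references to one object rather than copies of its data, which is exactly why a single source of truth stays consistent.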
Knowing the definition, let's move to benefits, qualities, and challenges. The following benefits can be observed while using a single source of truth: reduced mistakes, errors, and rework; met schedules and budgets; improved communication and quality; better decision making; improved trust and transparency; and reduced risk of acting on wrong data. In terms of qualities, which also pose challenges, a single source of truth should be accessible, up to date, standardized, and integrated with other systems. Also, most importantly, it should be trusted by the project team.
We discussed the basic theory behind a single source of truth. Now let's move to practical implementation. In the next few slides, I will talk about out-of-the-box Autodesk Construction Cloud and Revit functionality to create and maintain a single source of truth. This section contains rather entry-level information; however, these foundations will be essential for the custom solutions using the Revit and Forge APIs that will be shown later on.
Use Autodesk Construction Cloud as the common data environment and apply an ISO 19650 folder structure. A cloud-based common data environment, a correctly created folder structure, and applied access permissions are the key components of a collaboration platform.
Worksharing: central and local models. The worksharing functionality allows teams to work on the same model at the same time from any location. In the image on the right, we can see the relation between central and local files. The upper-left image shows the published central model in Autodesk Construction Cloud, and the image below shows the local model cache and how a model's unique ID can be obtained via the Forge API, which will be used later on in one of the examples.
Using BIM Collaborate Pro and the Design Collaboration module is a great way to share, consume, and track data exchanges between design teams. It ensures transparency and streamlines work-in-progress-to-shared-area data transfers within the common data environment.
On this slide, I would like to highlight two points. The first is project requirements, which should be stored in a common location accessible by the project team in Autodesk Docs. By project requirements, I mean the BIM execution plan, shared parameter files, and structured project BIM configuration data. This is a concept that will be explored later on.
The second thing is the Parameter Service. This is a brand-new tool available as a tech preview in Revit 2023, used to store shared parameters in the cloud rather than in a shared parameter text file.
Autodesk Construction Cloud offers multiple tools that can be used to support common project activities. Implementing the available tools enables teams to streamline processes, improve quality and transparency, and minimize work outside the common data environment. As you can see on the slide, there are a number of tools that can be used by the project team, and each of these tools could be a topic for a separate session. If you are interested in more information, please refer to the session handout for references to learning resources.
Previously, we learned the foundations of using Autodesk Construction Cloud for creating and maintaining a single source of truth. Now we are ready to explore more advanced methods.
First, let's have a quick look at the most common tools interacting with Autodesk Construction Cloud and Revit: Revit itself, Dynamo, Autodesk Forge, and Power Automate. All of these tools can be used to automate processes and manipulate data in Revit and in Autodesk Construction Cloud.
Typically, when speaking about automation tools for Revit, we imagine scripts running inside the authoring software after hitting a button. I would like to highlight alternative ways of executing scripts and the ways they interact with users.
In terms of environment, scripts can run outside the model authoring software; execution may be scheduled or happen on a triggered event; and from a user perspective, the user can either be alerted about non-compliance or be unaware of a process running in the background.
Before we move to the showcase, I'm going to introduce a few techniques, methodologies, and concepts that will be used in the practical examples. First, event-driven programming, which allows interaction with data in response to user activity; there are 70-plus available events in the Revit API.
The second is structured-data project requirements: we use organized, structured data as the input to automation and validation tools. It will also be shown later on how to create the structure of project requirements. The third is functionality blocking: in certain cases, quality can be achieved by blocking unwanted functionality in the project.
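The event-driven and functionality-blocking concepts can be sketched together in plain Python. The `EventBus` below is a hypothetical stand-in, not the actual Revit API (which exposes its events through .NET handlers); the idea is simply that handlers subscribe to named events and any handler can veto the operation:

```python
# A generic sketch of event-driven validation: handlers subscribe to named
# events, and any handler returning False cancels the operation.

class EventBus:
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event_name, handler):
        self._handlers.setdefault(event_name, []).append(handler)

    def fire(self, event_name, payload):
        """Run all handlers; return False (cancel) if any handler rejects."""
        return all(h(payload) for h in self._handlers.get(event_name, []))

bus = EventBus()

# Functionality blocking: reject CAD imports when the config forbids them.
def block_cad_import(payload):
    return not payload.get("config", {}).get("block_cad_import", False)

bus.subscribe("import_cad", block_cad_import)

config = {"block_cad_import": True}
allowed = bus.fire("import_cad", {"config": config})
print(allowed)  # False -> the command is cancelled
```

The same pattern covers the later demos: a sync or close event fires, validators run against the project configuration file, and the operation either proceeds or is cancelled with a notification.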
Finally, we are moving to the most interesting part of the presentation: seeing Autodesk Construction Cloud, Revit, and APIs in action, and how the platform and the concepts introduced earlier can be used to ensure data consistency and prevent BIM execution plan non-compliances.
The first short video will go through the setup required to run all the tools. You can see an Autodesk Construction Cloud project with project requirements stored in the common location and an ISO 19650-compliant folder structure; we can observe the work-in-progress, shared, and published areas within the CDE.
Additional services are activated for project members, such as Design Collaboration and Model Coordination. Design collaboration teams are created for data exchange, and a Forge application is created and registered in Account Admin. You can also see some custom Revit plugins installed, as well as Autodesk Desktop Connector.
I'm now showing the BIM execution plan. This is the standard document that is key to each project delivery. The takeaway of this part of the video is that we would like to replace unstructured data in the BIM execution plan with references to a structured project requirements database. In its simplest form, this database can be a spreadsheet stored in Autodesk Construction Cloud; however, it may be another solution of your choice.
So in this particular example, we are trying to replace this image of the project base point and survey point with references to a spreadsheet that contains the data in a more structured format, as this data will serve as the input for various tools.
So this is an example of structured data requirements captured in the project configuration file. It may contain any data that was agreed with the client and that needs to be validated throughout the project delivery. We can now see project information requirements data, a list of models, sheets, project location, naming system requirements, and required parameters.
And again, this is just an example for the sake of the demonstration. Obviously, on large projects, this table would be way, way longer.
One thing I would like to highlight on this slide is the data restriction column, which specifies the expected and allowed values for certain data. We can see range values for sound transmission class, and specific values for fire rating.
In a similar manner, we can capture any other data. In addition, this file can capture not only the project requirements but also serve as a configuration file for the automation tools, so we can control the behavior of the tools based on the settings specified in this file.
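A minimal sketch of how such a data restriction column could drive validation. The restriction formats and the example values (the STC range and the fire-rating list) are hypothetical; a real project would agree them in the BIM execution plan:

```python
# Validate parameter values against "data restriction" rules: either a
# numeric range or a list of allowed values.

def validate(value, restriction):
    """restriction: {"type": "range", "min": .., "max": ..}
                 or {"type": "allowed", "values": [..]}"""
    if restriction["type"] == "range":
        return restriction["min"] <= value <= restriction["max"]
    if restriction["type"] == "allowed":
        return value in restriction["values"]
    raise ValueError("unknown restriction type")

# Hypothetical restrictions, as they might appear in the configuration file.
restrictions = {
    "Sound Transmission Class": {"type": "range", "min": 25, "max": 65},
    "Fire Rating": {"type": "allowed", "values": ["30 min", "60 min", "120 min"]},
}

element = {"Sound Transmission Class": 70, "Fire Rating": "60 min"}
issues = [name for name, rule in restrictions.items()
          if not validate(element[name], rule)]
print(issues)  # ['Sound Transmission Class'] -> outside the 25-65 range
```

Because the rules live in the shared configuration file rather than in code, updating a requirement updates every tool that reads it, which is the single-source-of-truth point being made here.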
We can see project requirements structured data stored in the common location. This is the backbone of our project delivery, this is the data that is accessible to the entire project team, and this is the single source of truth of project requirements that has to be fulfilled by the team.
The last part shows shared parameters loaded in the project. We can use either the Parameter Service in Revit 2023 or a shared parameters text file.
Once the setup is completed, we can now move to examples of using Revit API.
We start with functionality blocking. During this demonstration, we would like to override the CAD import command to prevent bad modeling practices, and also validate the Revit family data source to block any non-compliant content.
In the first part of the video, I'm demonstrating that the behavior of the scripts can be controlled from the project configuration file. Basically, what's happening now is that I'm specifying that family content can be loaded only from the approved KEO content library, and that the CAD import command should be blocked.
I'm saving this file. You can see that everything is happening directly in Autodesk Construction Cloud; we can see an updated version of the document overwritten in the platform. Now we are attempting to import CAD.
We can now observe that once the user attempts to import CAD, there is a notification that prevents the user from using this command. This is the expected behavior, as we want to block this functionality because it might not be allowed by the client.
In the second part, we are navigating to a folder that contains non-compliant families; basically, we are trying to load a Revit family from outside the KEO approved library. And again, in this case, we receive a notification that the family loading is cancelled, and the reason for that is an incorrect source location. So by using these tools, we can easily control the content in the model and also prevent bad modeling practices, such as using the CAD import command.
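The family-source check described above can be sketched as a simple path test. The library root and the file paths below are hypothetical examples:

```python
# Allow a family to load only if its file path sits under the approved
# content library root.

from pathlib import PurePosixPath

# Hypothetical approved content library root.
APPROVED_ROOT = PurePosixPath("/KEO/ApprovedContentLibrary")

def is_from_approved_library(family_path):
    path = PurePosixPath(family_path)
    # True when APPROVED_ROOT is an ancestor directory of the family file.
    return APPROVED_ROOT in path.parents

print(is_from_approved_library("/KEO/ApprovedContentLibrary/Doors/Door_A.rfa"))  # True
print(is_from_approved_library("/Users/me/Downloads/RandomDoor.rfa"))            # False
```

In the real plugin, this decision would run inside the family-loading event handler, cancelling the load and notifying the user when the check fails.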
In the second example, we'll talk about metrics export on the model-closing event, how to store the data, and how to visualize it.
On the screen, I'm showing a metrics dashboard embedded directly in the Autodesk Construction Cloud Insight module. In this case, it is a model health dashboard, based on data extracted from cloud models using the benchmarks and factors specified in the project configuration file.
On the right part of the screen, I'm showing the metrics stored in Autodesk Construction Cloud, and this is important because the data displayed in the Power BI dashboard comes from ACC; it is not offline data from someone's local machine.
Now I'm going to attempt to close the model, and we receive a notification that the model health metrics have been exported successfully. The way it works, there is an event that triggers the data export from Revit. We can now observe the data overwritten in Autodesk Construction Cloud. Based on the approved data, we are able to refresh the data source, publish the Power BI dashboard, and view it directly in ACC.
We can now see the updated dashboard.
Another note on model health metrics: we may not want to wait until models are closed or until someone refreshes the data source in Power BI and publishes the report. In some cases, we would like to see live feedback. So there is actually functionality to display data live directly in Revit, not only for the currently open document, but also for all documents, including linked files.
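A minimal sketch of the metrics-export step, assuming hypothetical metric names. In the real workflow the counts would come from the Revit API on the close or sync event, and the JSON would be uploaded to ACC:

```python
# Collect simple model-health counts into a dict and serialize to JSON,
# ready for upload to a common data environment.

import json

def collect_metrics(model):
    # Metric names are illustrative; real benchmarks would come from the
    # project configuration file.
    return {
        "model_name": model["name"],
        "warnings": len(model["warnings"]),
        "in_place_families": model["in_place_families"],
        "unplaced_rooms": model["unplaced_rooms"],
    }

model = {"name": "ARC-Model", "warnings": ["w1", "w2"],
         "in_place_families": 3, "unplaced_rooms": 0}
payload = json.dumps(collect_metrics(model), indent=2)
print(payload)
```

Keeping the export as a plain serialized snapshot is what lets a Power BI dashboard read the data straight from ACC rather than from someone's machine.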
The next example will be about parameters and naming compliance on the sync event.
I'm navigating to the configuration file to show the expected naming convention and parameter requirements. We can now observe that the family naming system is well defined and followed in the currently open model, and the same goes for the project requirements. In this and a few other examples, we'll focus on sound transmission class and fire rating, so we can stop for a second to look at the expected values for fire rating and sound transmission class. We can see that the requirements are followed in the model. Now we are attempting to synchronize the model, which happens without errors, and this is the expected behavior: a correct model can be synchronized.
Now I'm going to simulate applying some incorrect changes in the model, so I'm changing values to outside the given ranges or acceptable values, and I'm changing one of the approved families to one of the non-compliant families.
Now we're trying to synchronize the model, and we actually receive a notification that there is a non-compliance. It provides detailed information about the non-compliant element, including the element ID and the non-compliant parameter values. So this is the first example, for a family that is not compliant in terms of parameter data. And the second example is a family naming non-compliance, for a family that was loaded from outside the KEO family library.
The interesting part about this example is not only the notification shown to the user about the non-compliance, but also that we automatically cancel the sync operation. So basically, we are preventing users from synchronizing non-compliant information to the central model, and this model will not synchronize with the central file unless we fix these two errors. And again, just a side note: this is only a small demonstration covering these two parameters plus naming system compliance, but obviously we can validate any other data that is defined in the project configuration file.
I'm now applying corrective actions in order to synchronize the model.
So this was the example of model validation at the sync event, and we can also extract data through this event. If we go back to the previous example, where data metrics were extracted at model close, we can also do it at model sync, so we will have more up-to-date data exported.
The next example is about model updates. We'll explore three ways of updating the models and ensuring compliance.
We start with the manual one, where a user is alerted. This is a very similar example to the previous one; in this case, we will focus on project information details. Again, we can see the notification that there are certain non-compliances in the model. We can see the requirements on the left side and the actual data in the model, which we are now correcting. So you can see that although it's very helpful to receive this notification, it's still manual work to update the model.
So is there a better way to update the models? The answer is yes: we can automatically update models at the sync event. This is a silent update; it may occur without the user knowing about the wrong data in the model or about the update that was done in the background.
We saw incorrect data in the model, and we are now synchronizing it. The difference between this case and the previous one is that we are not receiving any notification about non-compliance; rather, a transaction is triggered during the sync event to update the data if it's incorrect. We can now see that the data was automatically updated and the model is compliant with the requirements.
So this is a very powerful mechanism, because if we imagine that there is, for example, some last-minute change in title block details, revisions, or any other data, we are able to update this data automatically without someone manually checking each model separately.
However, there is one downside to the previous example: the need to open and synchronize each model individually. And we know that on large projects there might be hundreds, sometimes thousands, of models, and this is why we would like to batch update multiple models. This is basically the third way of updating models.
Again, we are simulating some changes in the project information requirements. Now we have a lot going on on the screen. In the top part, we can see three models: architectural, structural, and mechanical. These will be updated automatically. In the Revit window, we can see one of the models, and we can see that it contains incorrect data. These three models will be updated.
I'm closing the model only for the sake of demonstrating that this tool can be run from any open model, not necessarily from the project whose files we are trying to update. So I just opened a sample test model, and you can see that in the application dialog we are able to select a project and one or more models to update. In this case, I selected all models, which are the three files.
Now we can see that each model is opened, updated (it's not really shown on the screen, but it is updated), saved, and most importantly, synchronized with central. In addition, we see that there was some extra data exported for each model.
So if you think again about the model health metrics: we don't have to close the models to export the data, and we don't have to synchronize the models to get the data. We can use this tool not only to apply the update in the model, but also to extract the latest metrics or, for example, to export a Navisworks model, which is a quite common task.
We now receive a notification that the three models have been updated and synchronized with central. I'm going to refresh ACC to show that there was an additional text file containing the extracted data.
Now we are opening one of the models to demonstrate that the change was applied automatically. And again, this concept can be used for way more than project information: we can create sheets, place views on sheets, place a title block, place families, do some validation, and export metrics or models in different formats. So this was the third and last way of updating the models using the Revit API.
The last video will demonstrate the potential of the Forge API.
In the first example, we'll talk about Autodesk Construction Cloud and SharePoint project consistency.
Now it's time for some background. It's quite common for organizations to use Autodesk Construction Cloud alongside other collaboration platforms, such as SharePoint, which is normally used to facilitate various internal processes within companies. One of the challenges companies face is ensuring data consistency between two or more platforms. In this case, we'll look at project data in SharePoint and Autodesk Construction Cloud; the table shown on the screen presents the possible compliant and non-compliant scenarios for entities in Autodesk Construction Cloud and SharePoint.
This is an example SharePoint projects list, a Power Automate cloud flow to extract data from SharePoint, and a list of ACC projects. We can see that certain projects are excluded using a built-in parameter; these are, for example, testing or template projects that we don't want to appear in any official project list in internal reports within the organization. Finally, a custom Forge integration.
And now I'm showing a very minimalistic Forge application. All the tools presented today will be executed from the terminal, but obviously we could build a web application and work on the user interface to make it more user friendly.
You can see that in a matter of a few seconds, we obtain the full results about the non-compliant and compliant projects. We can improve readability by showing them in a Power BI dashboard. Based on the results, we are able to identify non-compliant projects and apply corrective actions, such as adding projects to SharePoint or BIM 360, removing or archiving them, and so on.
One example of how this was very useful: if a company controls project access through forms based on data in SharePoint, it can identify whether there are any projects in BIM 360 or Autodesk Construction Cloud that cannot be accessed because they don't appear on the SharePoint list, or vice versa, projects that appear in the SharePoint list but are not created in BIM 360.
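The cross-platform consistency check reduces to a set comparison. The project names below are hypothetical, and in practice the two lists would come from the Forge API and the SharePoint/Power Automate flow described above:

```python
# Compare project lists from two platforms and classify each project as
# compliant (present in both) or missing from one side.

acc_projects = {"P-1001", "P-1002", "P-1003"}
sharepoint_projects = {"P-1001", "P-1003", "P-1004"}

compliant = sorted(acc_projects & sharepoint_projects)        # in both platforms
missing_in_sharepoint = sorted(acc_projects - sharepoint_projects)
missing_in_acc = sorted(sharepoint_projects - acc_projects)

print(compliant)              # ['P-1001', 'P-1003']
print(missing_in_sharepoint)  # ['P-1002'] -> add to SharePoint or archive
print(missing_in_acc)         # ['P-1004'] -> create in ACC or remove
```

The two "missing" lists are exactly the actionable items the dashboard surfaces: projects to add, archive, or remove on one of the platforms.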
In the second example, we talk about user management: we would like to add multiple users to an ACC project based on the project configuration file.
On the left, you can see the project users with a specified company and role. On the right side, the actual members of the ACC project. The list on the left is very often provided by, for example, the project manager. The manager receives a long list of users and tries to add them manually, so this process can be automated via Forge.
Again, we are opening our Forge application, and in a matter of a few seconds we are able to add the missing users to the BIM 360 and Autodesk Construction Cloud project.
We can now see full compliance between the platform and the project users list.
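The user-management step can be sketched as a diff between the configuration file and the current ACC members. The emails, companies, and roles below are hypothetical, and the actual additions would go through the Forge (ACC Admin) API rather than this in-memory example:

```python
# Diff the desired user list (from the project configuration file) against
# the current project members to find who still has to be added.

desired = [
    {"email": "a@example.com", "company": "KEO", "role": "Engineer"},
    {"email": "b@example.com", "company": "KEO", "role": "Architect"},
]
current_members = {"a@example.com"}  # emails already on the ACC project

to_add = [u for u in desired if u["email"] not in current_members]
print([u["email"] for u in to_add])  # ['b@example.com']
```

Driving the diff from the configuration file keeps the member list itself under the single source of truth, instead of relying on a manager's ad hoc spreadsheet.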
In the next example, we'll talk about task information delivery plan (TIDP) validation. In simple words, we'll be tracking models and submission progress.
We are now looking at the models and sheets requirements. Very often, there is a separate document on projects called the task information delivery plan, which specifies responsibilities and the contractual deliverables. We are showing these two tables, and we'll simulate a sample milestone submission on the project. All these files are expected to be submitted by the project delivery team.
We can now see the created folder in the published area of the common data environment, and multiple files uploaded by the team. We would like to identify the submission status as of this moment.
We are now looking at the Submission Progress dashboard, which is based on data extracted directly from Autodesk Construction Cloud. This dashboard allows us to identify actionable items: files that have to be either added or deleted in ACC or in the TIDP.
We are going back to our Forge application. But before that, we are simulating corrective actions: uploading missing documents and revising the TIDP based on the dashboard results. Now we would like to rerun the application to obtain the latest data.
We can now see updated data uploaded in BIM 360.
And now we can refresh our BI dashboard.
Now, based on the updated dashboard, we are 100% confident of fulfilling the milestone deliverables. There is obviously an alternative way of achieving similar results: we can extract a file log from ACC, copy it to Excel, and run some formulas to obtain a similar result. However, this is, I believe, a smarter and more automated way.
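The TIDP check itself is another set comparison plus a progress figure. The file names below are hypothetical examples:

```python
# Compare contractual deliverables (from the TIDP) against the files
# actually uploaded to the published area, and compute submission progress.

expected = {"ARC-DWG-001.pdf", "STR-DWG-001.pdf", "MEP-DWG-001.pdf"}
uploaded = {"ARC-DWG-001.pdf", "STR-DWG-001.pdf", "EXTRA-FILE.pdf"}

outstanding = sorted(expected - uploaded)  # upload to ACC, or remove from TIDP
unexpected = sorted(uploaded - expected)   # delete from ACC, or add to TIDP
progress = len(expected & uploaded) / len(expected)

print(outstanding)        # ['MEP-DWG-001.pdf']
print(unexpected)         # ['EXTRA-FILE.pdf']
print(f"{progress:.0%}")  # 67%
```

The two lists map directly onto the dashboard's actionable items, and the progress figure onto the milestone status shown in Power BI.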
The very last example in this presentation is naming and parameters validation. We would like to validate parameters and naming compliance for multiple models without opening Revit.
First, I'm going to demonstrate the Forge API functionality for obtaining model content properties without opening Revit. On the screen, we are browsing the project content, and you can see a lot of data related to the Revit elements inside the model: all the identity data, and also our custom parameters that will be verified later on, the sound transmission class and fire rating.
Knowing that we can obtain data outside the Revit environment, we'll now validate two additional models uploaded to ACC by the SC1 organization. This might be one of our consultants who is not using the fancy tools we demonstrated earlier, but we still want to validate data compliance for these two models without necessarily opening them and running any of our desktop-based tools.
I'm opening these two models just to show that there is indeed non-compliant data, but I'm going to close the files, as it's not required to have Revit open. We don't even have to have a Revit license or Revit installed on the PC to run this tool.
We can now see that there are two models being validated. And in a matter of a few seconds, without opening Revit, we are able to validate naming system and parameters compliance.
So this is already good data. We can share this feedback with our subconsultant to inform them about the high-level compliance status. In addition, we can also expand the results further to show more details about non-compliant and compliant content; this data will include element IDs and the results for each parameter or metric that was checked.
And again, this is just the data displayed in a Visual Studio Code terminal window, but it can be exported to any other text file format, and using dashboards or reports we can easily inform the project team about any compliance issues in the file.
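The naming-system part of this validation can be sketched as a regular-expression check. The ISO 19650-style pattern and the model names below are hypothetical examples; the real convention would come from the project configuration file:

```python
# Validate model names against a naming-convention pattern without Revit:
# names could come from the Forge (Data Management) API file listing.

import re

# Hypothetical convention:
# <Project>-<Originator>-<Volume>-<Level>-<Type>-<Role>-<Number>
PATTERN = re.compile(r"^[A-Z0-9]+-[A-Z0-9]+-[A-Z0-9]+-[A-Z0-9]+-M3-[A-Z]+-\d{4}$")

names = ["PRJ1-KEO-ZZ-XX-M3-A-0001", "bad model name.rvt"]
results = {name: bool(PATTERN.match(name)) for name in names}
print(results)
```

Because this runs on metadata pulled from the cloud, the same check can be applied to subconsultant models that were never opened on our machines, which is the point of the demo.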
So this concludes my presentation about single source of truth and data consistency using Autodesk Construction Cloud, Revit, and API. I hope you found the content insightful and inspiring. I also encourage you to have a look at the session handout for more information. Thank you for attending this class, and feel free to contact me if you have any questions.