Description
Key Learnings
- Learn how to create workflows using the AEC Data Model API to best use data from a Revit model.
- Discover opportunities in extracting and using data from Civil 3D, Revit, and other civil-architectural applications.
- Learn how to greatly improve the efficiency of Revit model data extraction to allow for further analysis and opportunities.
Speaker
- Marie-Aline Hunter: B.A.Sc. in Civil Engineering from the University of Ottawa. 27 years of construction experience in California with a focus on 3D technology since 2005. Founded Avixi Inc in 2014 to provide solutions for real estate owners in leveraging 3D models built for their construction projects. Delivering virtual reality experiences, carbon calculations, automated workflows, and data quality for maintenance and operations of buildings has been my primary focus.
MARIE HUNTER: Welcome to QA/QC Workflows: From Minutes to Seconds with the AEC Data Model. This is class BLD2542, an Avixi use case. My name is Marie Hunter. I'm the president and founder of Avixi. Safe harbor statement. It's got to be there.
So a little bit about me. My name is Marie Hunter. I have a bachelor's in civil engineering from the University of Ottawa. I have 27 years of construction experience in California with a focus on 3D technology since 2005. I founded Avixi with the goal of leveraging the 3D models that are built for real estate owners, and of making sure they get the value out of the money they are spending for these models.
My BIM experience stems from different aspects of working for a general contractor and working as a consultant. I've done modeling, virtual reality, carbon calculations, estimating, trade coordination, development of BIM standards, automations of workflows, data quality and maintenance of operations for buildings. And that's been my primary focus in the last 10 years. I have a process patent, the building information modeling for operational management of facilities from TD Commons. I also speak French and English.
So a little bit about what we're going to talk about today-- I'll talk about my company. But also, what got me to present this today was this use case challenge. The solution of using APS, APIs. A little demo about how the API is used and what we got out of the outcomes and results. And some further workflow and time savings that we've benefited from. The participation also with the Autodesk Beta program. I will walk through what our experience was as well.
So about Avixi-- we believe in the power of BIM technology to foster innovation within the AECO industry. And our focus has been, let's make sure that these models do not collect dust. We see them as part of the building life cycle.
We want to foster data accuracy and accessibility for all stakeholders, not just the people who are BIM savvy with Revit experience. Our accomplishments in the last 10 years include quality control of facility maintenance data assets for 5 million square feet of office space, and 3 million square feet of virtual reality to help owners understand what their building will be like before it gets built.
We've also developed Revit plugins focused on quality control and automation. And just recently, we've developed web applications for the extraction, analysis, and reporting of asset information and quality control. What we are most proud of is the on-time delivery of our facility management data in tranches, in a way that makes it feasible for an operating vendor to create their preventative maintenance program. And we are located in the San Francisco Bay Area.
So our use case challenge was we have a large web browser tech company for our client. And they wanted to understand what it would take to be able to extract data from the Revit models-- this is the granular data of elements and their parameters-- and be able to extract it in a way that the quality control can be done in an automated fashion on a ground-up office building project.
So as you see in this diagram, I've outlined the multiple trades that we often deal with on these projects. And the manual process before was tedious, because we have all the stakeholders working off of the owner's ACC account. We have a strict policy of not opening models straight off of the collaborative site. And so our work requires us to download models, which is pretty expensive, depending on the size of the model.
Then that process was followed by opening the Revit models one by one to find the schedules that are of importance to us, the schedules of values, and the parameters that we deem necessary to quality control. Then these schedules need to be exported using different tools, different plugins. And finally, you've got spreadsheets that you need to further analyze using other plugins or different scripts.
So the manual data extraction on large Revit models is often very time consuming. And it requires powerful hardware. It requires also good internet bandwidth.
So the solution to that was to approach Autodesk. Our consultants there and also our representative at Autodesk led us to take a look at the AEC Data Model API. And this would provide an efficient way to extract data without opening the Revit model itself.
The API is really a set of GraphQL APIs all in one, and it keeps track of who made data changes, what changed, and when. It works off of files on Autodesk Docs within ACC. Right now it gives developers read-only access, and it exposes subsets of model data through cloud-based workflows, without the need to write custom plugins or use any kind of desktop authoring application.
And this API works with Civil 3D, with Revit, and with other civil and architectural design applications. So like I mentioned before, it requires your models to be on ACC. And at a minimum, you need to be in Revit 2024 or later, which in our case is what we used for our data extraction.
The other APIs you can leverage along with this Data Model API are the Viewer SDK, in order to see the 3D elements, and the Webhooks API. We initially developed our own webhooks functionality because at the time that we beta tested, it was a private beta and the Webhooks API was not yet available. But it is there now for people to leverage.
So the initial setup for the API requires activating the app through the ACC account administrator. And there are really good documents now to get you started. One of them is the API Onboarding guide, as well as the Developer's Guide, which will step you through the process of the connections, sending out the queries, the authentication, et cetera.
So first, you've got to create an app through the APS Developer Platform. And then, after that, you need to enable both APIs-- not just the AEC Data Model API but also the Data Management API. For the endpoint, the GraphQL queries are sent to a developer site. This is also really well spelled out in the Developer's Guide.
You will need three-legged access tokens to authenticate the queries. And then lastly, to execute the queries, you use GraphQL to create prefiltered queries that you can then call at a future date, or perhaps trigger through the webhooks as well.
So the manual process and the more automated process using the AEC Data Model API look like this diagram. At first, for the manual process, you are having to download the models. Then you've got to open the Revit models and find the schedules. We use the RushForth plugin for that, pretty easily.
We then export the schedules to these Excel spreadsheets. And then we do some quality control on that. That's a 14-minute per model minimum. So this is given that we have really good internet bandwidth and we have good hardware to handle larger models.
Now, if you use the AEC Data Model API, you're looking at more like 3 minutes per model, because everything is tied together-- not only the API but also the development that we've done using predetermined queries. The data extraction is then done in a much more seamless fashion. And it is the data that you truly want-- nothing else from within the Revit model.
The focus, especially with this AEC Data Model, is that you want to look at data-- you want to quality control data that you feel is useful to your stakeholders. And later, the results don't have to stay in spreadsheets; they can be sent to databases for further analysis and reporting.
So a little demo on the API. I'll first start with some of these screenshots. This is the Data Model API platform, where first, you will get the hubs. The hub is the account that you're pulling your Revit models from.
And then you'll get the projects. Projects are essentially the building projects, in this case. And then you get the designs-- designs are what you would also call the Revit models. And beyond that, you get the elements by category. So this is the hierarchy of how the AEC Data Model API works: you first step through these four levels-- hubs, projects, designs, and elements by category. Mechanical equipment, for example, would be an element category.
And after that, you create your customizable queries. And this is where you can really, truly automate things. You can filter further-- perhaps, within mechanical equipment, a set of elements that fits within an OmniClass number, for example. Or it could be a query through keywords, et cetera.
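To make that concrete, here is a hedged Python sketch of what such a customizable query can look like. The schema field names (`elementsByDesign`, the `property.name.*` paths, and the filter syntax) are placeholders modeled on the hierarchy just described; the real names evolved during the beta, so check the live GraphQL schema before using them.

```python
def category_filter(category: str, omniclass: str = "", keyword: str = "") -> str:
    """Build a filter string for an element query.

    The property paths used here are illustrative assumptions,
    not guaranteed schema names.
    """
    clauses = [f"property.name.category=={category}"]
    if omniclass:
        clauses.append(f"property.name.OmniClass Number=={omniclass}")
    if keyword:
        clauses.append(f"property.name.name=contains={keyword}")
    return " and ".join(clauses)

# A parameterized query following the hub -> project -> design ->
# elements-by-category hierarchy. Field names are placeholders to be
# checked against the current schema documentation.
ELEMENTS_QUERY = """
query ($designId: ID!, $filter: String!) {
  elementsByDesign(designId: $designId, filter: {query: $filter}) {
    results {
      id
      name
      properties { results { name value } }
    }
  }
}
"""
```

A mechanical equipment pull narrowed to one classification would then pass something like `category_filter("Mechanical Equipment", omniclass="...")` as the `$filter` variable (the OmniClass value being whatever your standard prescribes).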
So I'm going to show a video of the difference between the manual extraction, which is on the top part of the video, and the automated version using the AEC Data Model extraction.
So for the sake of time, we've accelerated the manual extraction video timeline. And you see, at the very top, we first are having to download a model. That's usually the biggest portion of this exercise, just the downloading, because of our policies of not opening models directly on ACC-- usually, they are collaborative models, and we don't want to disrupt or potentially damage models that other folks are working on.
So on the bottom part, you'll see the automated version-- we are choosing our hub and the project. We then come up with a list of models to choose from. And we have to do that in order to then call out the Revit element categories-- for example, your mechanical equipment category.
And soon, you'll then see these predefined filter queries. So this is a data set that's just been filtered. You then export it to wherever you want it to be exported-- for example, an Excel sheet.
And that can be further enhanced to be sent over to a database. So in our tool sets, we've developed this a little bit further, where we are versioning data-- as soon as we extract it, it gets a timestamp. And so we can go back and find out if perhaps our teams are doing better on the quality aspect of the data over time.
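That versioning step is simple to sketch. Below is an illustrative Python helper, not our actual production code, that stamps each extracted row with its source model and a UTC extraction timestamp before writing it out; the column names are made up for the example.

```python
import csv
import datetime

def version_rows(rows: list, source_model: str) -> list:
    """Stamp each extracted element row so successive pulls of the same
    model can be compared over time. Column names are illustrative."""
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return [
        {**row, "source_model": source_model, "extracted_at": stamp}
        for row in rows
    ]

def write_csv(rows: list, path: str) -> None:
    """Write versioned rows to a spreadsheet-friendly CSV file; the same
    rows could just as easily be inserted into a database table."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

Because every pull carries its own timestamp, comparing two extractions of the same model becomes a plain query over the versioned table rather than a manual diff of spreadsheets.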
So on the manual extraction side-- I'll go back to that, on the top-- right now, we're exporting schedules using the RushForth plugin. That can also take a lot of time, depending on how big your data sets are. So you can see we're running now at about 11 minutes versus 2 minutes. And on the AEC DM side, we're still exporting one by one with filtered queries.
This is a workflow that we've further automated by using webhooks. As soon as the model is changed on ACC, the webhooks trigger another filtered query that then takes this data and versions it. So as you can see, the savings-- 13 minutes manually versus 2 minutes with AEC DM-- are extraordinary, especially if you are running multiple extractions a day, depending on how fast paced your project is.
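The webhook side of that loop can be sketched as a small handler. This is a simplified illustration: the event name `dm.version.added` comes from the APS Webhooks documentation for Data Management events, but the payload fields shown are trimmed placeholders rather than the full callback schema.

```python
def handle_webhook(payload: dict, run_extraction) -> bool:
    """Decide whether an incoming webhook callback should trigger a
    fresh prefiltered extraction of the changed model.

    `run_extraction` is a callback taking the changed model's URN.
    Returns True if an extraction was triggered. Payload fields are
    simplified placeholders for this sketch.
    """
    event = payload.get("hook", {}).get("event", "")
    urn = payload.get("resourceUrn", "")
    if event == "dm.version.added" and urn:
        # e.g. run the filtered query, then timestamp and store the rows
        run_extraction(urn)
        return True
    return False
```

Wired up behind a small web endpoint, this turns QA/QC from a scheduled chore into something that simply follows the model: every publish produces a fresh, versioned extraction.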
So the outcomes and the results of using this tool have been great. We saved 11 minutes per model on average. So you've got projects that sometimes can go to 50, 60 models. You're saving quite a bit of time per week if that's the cadence that you're looking at data at.
So we're getting higher efficiency, obviously. And depending on, again, your hardware and your internet speed, you could actually be saving much more than 11 minutes. We're just lucky here in the Bay Area to have good internet.
So this process also removes a lot of the risk of opening up a collaborative model. You're no longer having to open it. You're just reading data, without the risk of overwriting any data within Revit.
And because of this efficiency, it also allows for more frequent rounds of data extraction. Earlier on in the process of model development, there can be some quick checks at higher levels-- checks that, before, we really didn't want to spend the time to do. This opens up different avenues for quality, accuracy, et cetera.
So the Data Model API can filter data of interest. And this is really important, because we all know that Revit models have a lot of objects that come from different sources, often with data that no one's really looked at. It's just there-- it just happens to be there within the family.
And so the Data Model API allows you to really focus on the data that you feel needs to be accurate, without overwhelming yourself with too much data. The further development from our team has enabled this versioning of data. And this is pretty critical in understanding how well our teams are doing over time, and seeing if there are any red flags or teams falling behind, before we wind up with a delay in the data delivery for our client.
So for the extended benefits and opportunities, I wanted to look at what lies beyond the actual AEC Data Model usage for our purposes. With the development of our quality control tool on top of the extraction, we're looking at a 2-hour-per-model savings. And this is pretty significant.
This obviously allows us to be a lot more competitive for that reason. Because we can efficiently perform these tasks, we have more confidence that the accuracy of our digital twin is there. And eventually, this reputation of quality will follow us.
Our clients have this earlier data access. And I think it's going to be more and more important as people look at different ways to use this data over time-- sustainability especially is one. And what we've noticed is that when you start having automated methods to extract data, it promotes standardization: people have this need to standardize the parameter sets that they utilize on every project, or to make improvements in the parameter sets on every project thereafter.
And it allows for the BIM standards to be more narrowed into Revit categories, which will enhance that efficiency. For example, Revit categories are often used interchangeably, depending on who the model author is. And so this will eventually drive some standardization in the way that people model.
Other opportunities that we see at Avixi, and for clients seeking to analyze this data in different ways-- bill of materials is one that we often get asked for, in order for the bidding process to be verified. We've also got carbon calculations. There are different software tools out there, of course, that are useful for carbon calculations. But I see AEC DM perhaps being used for the very early onset of the calculations for concrete, for example, or steel-- these kinds of materials that are very high in carbon density.
A digital twin database could also be a downstream benefit of extracting that model data. We could perhaps even see construction progress being monitored, if parameters such as the status of the prefabrication, delivery, and installation of assets are also tracked in the Revit model. There are many more applications that we will only think of in the future. And I think that it opens up just so many different ways for this data to be democratized. It is truly a game changer for Avixi.
So, participating in an Autodesk Beta program. When we had this challenge brought to us, we wanted to understand what Autodesk had in their tool bag to help us achieve our goals. We have a partnership with this client, which is at the enterprise level, and so we already had access to much of the knowledge of some of the leaders there.
And right away, we were pointed in the right direction. They listened to us to find out what our problem was. And they suggested different avenues, including this API. At the time that we tested this, this was a beta, a private beta. So we knew going in that documentation is not always there. And that's the reality of early development tools.
But we were pretty impressed with the amount of knowledge and the documentation that was already in place. We understood right away what the API capabilities were, as well as what the future roadmap capabilities would be within this API, or perhaps in a separate API that may complement it.
For the roadblocks that we encountered, we had very timely responses with suggestions and solutions to get around them. Some of them, sometimes you just know they're on the roadmap. And so you take a breather and focus on something else.
The communication with the Autodesk team was very open. We used Slack as well, so that became a great tool. And I was very impressed with how open they were to our suggestions for capabilities that would be useful for us, but also for what we thought would be useful for the industry. And a few of these suggestions actually came to be quite quickly, including the webhooks. We had developed the webhooks portion of our tool, but Autodesk then developed the Webhooks API very fast beyond that.
Participating in an Autodesk Beta program-- so the onboarding process was very efficient. The overall introduction to the API provided us an overview of the current capabilities, and also the roadmap for additional features to come.
We had a workshop to get started. And this involved understanding how to activate the different APIs in order to make our testing seamless. A lot of the back and forth was in one-on-one meetings. We had a weekly meeting to go over our assessments and our concerns. We used Slack in order to provide more information on some of the issues that we had. And that worked out really well for us.
On the testing side, during the time that we were testing, we realized that our app could possibly be using this API. And so we actually just started developing our QC tool from it. And from that, I think the Autodesk team got a lot of information about how their developer tools are used in conjunction with what we were developing. And this was valuable information for both sides.
Because the API is software agnostic-- meaning it's not just targeted toward use with Revit, which we at Avixi love to use-- its structure may not be so familiar to some of us who have used plugins that usually tie one element ID to a 3D object that provides you a set of parameters.
So it was kind of a learning curve for us to understand how the Data Model API structures data, so we could combine it and make sense of it in order to have it be part of a larger database that's being versioned. But we got through it. And it works wonderfully for us now.
So I'd like to thank everyone for listening in. And have a great rest of your AU. Thank you.