Description
Key Learnings
- Discover BIM deliverables and deliverable trends on large projects
- Learn multiple ways to create a QA/QC plan for BIM models
- Get acquainted with multiple tools to help automate review of BIM deliverables
- Learn how to develop your own BIM QA/QC plan for your organization, from designers to owners
Speakers
- Brendan Dillon: Brendan Dillon is the Manager of the Digital Facilities & Infrastructure Program for Denver International Airport. DEN is the sixth busiest airport in the United States and has developed a comprehensive BIM and Asset Management plan unsurpassed by any airport in the country. DEN's DFI program manages over 120 projects at a time with a net value in excess of $2 billion. Prior to joining DEN, he had managed over $1B in BIM projects, including as the BIM standards coordinator for the design team on Denver International Airport's South Terminal Redevelopment Project. Brendan is also the founder of the annual Airport Information Integration and Innovation (AI3) forum and is the founder of Red5ive Consulting, specializing in BIM deployment and integration for airports. Along with managing DEN's Digital Facilities & Infrastructure program, Brendan still enjoys getting into the weeds with Revit, writing scripts in Dynamo, and generally getting his hands dirty.
- Eddy Krygiel: As a Major Projects Development Executive within the AEC Solutions team at Autodesk, Eddy focuses on BIM and technology workflows for Architectural, Engineering, and Construction clients. He works with large project teams and owners to help leverage end-to-end technology solutions to optimize design, construction, and facility management outcomes. Eddy has almost twenty years of experience in architectural offices and on a range of projects, from single-family residential to office, federal, civic, and aviation clients.
- Mark Hughes: I am a licensed architect and an airport design and construction digital delivery expert, and I hold master's degrees in both architecture and civil engineering. As a subject matter expert in the Global Aviation Business Line, I am fully conversant in the planning, design, and construction of buildings and civil infrastructure and have been a leader in the practical application of the digital delivery process in the design, construction, and facilities management industries. I am knowledgeable about the use of BIM by owners, designers, and contractors and have guided clients in setting expectations and integrating digital delivery processes for comprehensive program development, resulting in integrated maintenance and management activities.
BRENDAN DILLON: All right. So, first of all, thank you all for coming to our session, Putting the QA/QC into BIM, especially at this time, this late in the conference. Myself, personally, I'm usually gone by now. So I really appreciate you all coming out, and this is really an impressive showing for this late in the week. All right, so introductions. First of all, I'm Brendan Dillon, the digital facilities and infrastructure program manager at Denver International Airport. With me I have Eddy Krygiel with Autodesk. He does something global. Chris Pittman, the Denver BIM manager at Jacobs. And Mark Hughes, BIM manager and all-around global aviation guy for AECOM.
As a quick rundown, so what we're going to do today is we're going to talk about some of the QA/QC tools that we use at the airport. It wasn't actually intended to be all at the airport, but that was kind of how it shook out. This will be model review type stuff, QA during construction. So if you thought this was going to be something else, this is the time to slip out. I promise not to mock you. And this is just more details on that stuff.
So the first thing to keep in mind is, how do you do QA/QC if you don't have standards, right? That was actually a problem we've had at the airport in the past, where our standards were a little challenging, to say the least. So performing QA/QC on that content was a little difficult. But we've got our standards in pretty good shape right now. Aside from our own standards, we look at a number of national and international standards. The BIMForum stuff is really valuable. The PAS 1192/BS 1192 standards have set a high bar and are continually attempting to raise it, which is, actually, from my perspective, pretty exciting.
So without standards, where are you? How can you check, right? One of the other documents that we use a lot is our level of development matrix. I know a couple of people in the room will recognize this particular document because they've used it here at the airport. And this is the LOD matrix that we used somewhere in the area of eight years ago. Pretty basic, right? We found that, along with our standards evolving, this document evolving has been really important for us as well.
So after using that version for a couple of years, we added some additional information as far as what we're looking for, for asset management, because that's really what our program is about: informing the asset management program for lifecycle management of the facilities. That's what BIM is supposed to be about, right? So we added some of that information. First, the ability to indicate whether something was actually in the project. Weirdly enough, we don't have a whole lot of dumbwaiters at the airport.
Then, does a particular breakdown have any actual assets in it? Some parts, or some of the sub-element types in UniFormat II, we don't have assets in, so we're looking for different levels of information on those. And then, do we need commissioning for elements in that category? And this served us well. But we started saying, this could be better. So the 2019 version, which will be coming out in 2019, is going to go another step further. And what we're going to have there is more detail on what we're looking for with model requirements.
So what sort of facilities object type do we expect with a particular category? Are we looking for structural walls? Are we looking for regular walls, or architectural walls? Are we looking for mechanical equipment? Most of this stuff is actually pretty straightforward and clear to most consultants-- to good ones. You will find some consultants that say, we're just going to make everything generic models, or we're going to make everything specialty equipment, because who knows why?
And that's really a problem for us when we start connecting those models to our maintenance systems, because the type of family it is, the type of object it is, can affect that happening. So we decided we're just not going to mess around with that anymore. We're just going to dictate it, because that's what standards do. Then on the civil side, what we expect is a civil layer for elements in that particular category.
That's really important for us because the FAA has pretty strict, specific guidelines. And as strict and specific as those guidelines are, we still have some trouble getting all of our consultants to follow them as well as we'd like, because, frankly, the FAA language is strict, specific, and not always entirely clear. So this is an effort to make that much clearer. Following after that is the civil object type. Are we looking for an [? AEC ?] surface, are we looking for a corridor, are we looking for a polyline?
And then, do we have to attach object data tables to it to meet FAA guidelines? And then further down, and we haven't started populating these yet, is what are the asset types that you are going to find within that classification? Now, you'll notice we're still working with UniFormat II. We're in the process of transitioning to UniFormat 2010. It's just a bit of a beast for us because it's baked into a number of our other processes at the airport.
So enough about standards. Mark Hughes, do you want to take over the QA/QC plan?
MARK HUGHES: Sure. First, I'd like to say, if I talk too loud and wake you up, I'm sorry. Second, how many owners do we have? Owner groups? Hi. Good. Design consultants? We've got the room. Contractors? I think the design consultants have got the room here. But we're actually going to look at this from the owner's perspective. But it's going to have some meaning for our design consultants and our contractors as we're going through. And what we're going to talk about is the fact that we're dealing with a lot of data these days.
And it's not just, how do I QC my plans? Or how do I QA that set of QC plans that comes to me? It's about data. And it's about the information that's put into the models. And you have to understand that the more and more data we generate, the more careful we have to be about what that data is conveying as far as information. In the next year, we're probably going to generate more data than we ever have before. You can see here, 90% of the data in the world was generated in the last two years. I bet this year, we're probably going to exceed that.
The problem with it is that IDC, a global market research group out of the UK, looked at all the data that they were given, and 90% of it is unstructured. So we're doing a lot of stuff. How do you quality control all that stuff if it's unorganized? How do you then do a quality assurance plan to check that quality control, to make sure that you're getting that information done in a model environment? All good questions.
You've got to create your plan. And you have to, first of all, understand there's a difference between quality control and quality assurance. Quality control is a linear path with a finite resolution. I'm going to make sure I meet this requirement. Quality assurance checks that you've gone through that process, and it does it over and over again. Make sure that you've met the requirements that you've established for quality control. A lot of people get those mixed up. Big issue. No QC, no QA. Doesn't happen without the other. They are mutually dependent.
How do you put a QA plan, or a QC plan, together on a data structure that is already recognized as 90% unstructured? It's a very tough ask. Look to the standards that are out there. If you've done a good job creating a BIM execution plan or BIM project execution plan, or, for our friends in [? EMEA, ?] the employer's information requirements, looking at those in detail, along with the site design standards and specifications, gives you a picture as to what your roadmap should be. Then you need to figure out, how do I quantify or qualify those requirements?
And you have to understand that sometimes, looking at numbers, looking at time, are those things that you can measure in a model? Quality of design is different from quality of content. A lot of designers that I've talked with and worked with are concerned that we're going to be looking at the design intent as a measure of quality, and that is absolutely not the case. There's a complete separation. We're looking at content and data. And we're setting up objective quality control and quality assurance based on that.
Knowing there are a lot of things out there, what do I do? What am I counting? What am I looking at? What am I measuring? How do I measure the value or the context of a 3D object? You need to really understand what you're looking for. Are there parameters of that 3D object that need to be measured? Do I set a standard that says I want to make sure that these five parameters are in every family member of a certain asset type that is in here? Those are things that you can measure. And then once you've measured that, you can do that in a model environment really easily.
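As a minimal sketch of what an objective check like that could look like (the asset types, parameter names, and data layout below are hypothetical examples, not DEN's actual requirements), you might validate that the required parameters exist and are filled for every element of a given asset type:

```python
# Minimal sketch: verify that required parameters are present and non-empty
# for every element of certain asset types. The asset types, parameter names,
# and exported-data layout are hypothetical, not an actual owner standard.
REQUIRED_PARAMS = {
    "Air Handling Unit": ["Asset ID", "Manufacturer", "Model", "Serial Number", "Install Date"],
    "Pump":              ["Asset ID", "Manufacturer", "Model", "Flow Rate", "Install Date"],
}

def missing_params(element):
    """Return the required parameters that are missing or empty for this element."""
    required = REQUIRED_PARAMS.get(element.get("asset_type"), [])
    params = element.get("parameters", {})
    return [p for p in required if not str(params.get(p, "")).strip()]

# Example elements, as they might be exported from a model into plain dictionaries.
elements = [
    {"id": "316482", "asset_type": "Pump",
     "parameters": {"Asset ID": "PMP-001", "Manufacturer": "Acme", "Model": "",
                    "Flow Rate": "120 GPM", "Install Date": ""}},
]

for e in elements:
    missing = missing_params(e)
    status = "PASS" if not missing else "FAIL (missing: " + ", ".join(missing) + ")"
    print(f"Element {e['id']} [{e['asset_type']}]: {status}")
```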
But then you have to ask yourself, OK, we're still in the paper environment. We're still slowly transitioning to a PDF deliverable environment. How does that paper content reflect what's in the model, and vice versa? You've got to put some QA measures, QC measures, on those things. How do you do that? That's a big question. We've gone through and looked at the old graphic standards and realized that, as much as there's value in trying to regularize those graphic standards, it really comes down to, is the message conveyed? Is the tagging appropriate? Do I understand what that piece of equipment is based on the information that's in front of me? Start looking at those measures. Line types, hatch patterns, things like that aren't as important in the digital environment as they were in a paper environment. We're transitioning from those graphic standards to digital standards right now.
From that, think about all those things in a digital environment. Think about the model QC process. Think about the model QA process. And we're going to talk about some tools that we're using at Denver, and that Autodesk is rolling out, to help us answer those questions on the plan that we're trying to develop.
BRENDAN DILLON: So the first of the QA/QC tools we're going to talk about is really not so much an individual tool as it is a platform. Dynamo, everybody loves Dynamo, right?
MARK HUGHES: Nobody?
BRENDAN DILLON: I expected that from Eddy, I didn't expect it from you. I love Dynamo. I mean, they do cool things with design-- really, really cool. I have no interest in it. Well, that's not true. I do have interest in it. I don't have a use for it. What I do have a use for is the ability to check data. So for those of you that don't know what Dynamo is, this is what Autodesk says Dynamo is. And really, what it is, is a visual programming platform that pretty much anybody can use.
The last time I did programming was some HTML back in the late '90s. Now, that's not entirely true; it was probably the early 2000s. But I was able to pick this up with minimal effort, really. So we use a number of scripts, or graphs, depending on your particular Dynamo lingo, for checking our models and for automating processes at the airport. The one we're going to talk about is this one right here, the Model Checker.
And this Model Checker is really a group of different scripts all pulled together in one place, run simultaneously, to smooth out the process. And some of these are as simple as this: extract the file name. That's about as basic as it gets, right? Others get a little bit more detailed. This particular script, what it does is it goes through and pulls out the names of all of the families in the categories that we have identified here, pulls them all out, organizes them, and puts it in a format we can consume easily and quickly. When you put all of these scripts together, what you get is this.
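For context, a graph like that usually wraps a small Python node around the Revit API. A rough sketch of the family-name extraction step is below; the categories are placeholders, and this is not the actual DEN graph, just the general shape of it:

```python
# Rough sketch of a Dynamo Python node that pulls family names per category.
# The categories listed are placeholders; this is not the actual DEN script.
import clr
clr.AddReference('RevitAPI')
clr.AddReference('RevitServices')
from Autodesk.Revit.DB import FilteredElementCollector, BuiltInCategory, FamilyInstance
from RevitServices.Persistence import DocumentManager

doc = DocumentManager.Instance.CurrentDBDocument

categories = [BuiltInCategory.OST_MechanicalEquipment,
              BuiltInCategory.OST_Doors,
              BuiltInCategory.OST_LightingFixtures]

family_names = {}
for cat in categories:
    instances = (FilteredElementCollector(doc)
                 .OfCategory(cat)
                 .WhereElementIsNotElementType()
                 .ToElements())
    names = {el.Symbol.Family.Name for el in instances if isinstance(el, FamilyInstance)}
    family_names[str(cat)] = sorted(names)

# Dynamo consumes this as the node's output, e.g. to write to Excel downstream.
OUT = family_names
```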
This checks 18 different line items out of our model review form. Our model review form has somewhere in the area of 48 line items that we check for every single model review that we do. And when I say model review, I mean every single model, not just the individual submittals. So we pulled this together, in part, because I was desperate. I hit a point where, normally, I have two facilities BIM coordinators working under me. One of them got hired away into another section. The other one, I helped hire away into another section. And aside from doing my normal job of running the group, I was then doing all of the model reviews myself.
I had no time to do the stuff that had to be done. So, out of desperation, I pulled this together over the course of about a weekend, really. Some of the stuff in here was stuff that we were already doing in different scripts, and others were ones that I pulled together myself by knitting together tips and examples that I found on the Dynamo BIM website. So, as I said, this does 18 checks. It takes two minutes to run. And I don't mean it takes two minutes for the model check itself to run. I mean, the entire process takes two minutes.
And so the version that's running here, this is running on our biggest, ugliest, nastiest model that we have. It doesn't get any worse than this. This is 760 megabytes. And really some challenging modeling; I know one or two people in this room could probably guess what project that's from. This one tool saves us three hours per model review. It checks simple things like the project name. It checks line styles, dimension styles, line types.
It looks for generic families, in-place families-- I hate in-place families. It pulls out the names of all the families. Then, once it finishes running, what you'll see is-- so the run completes, I open up this spreadsheet here. And what this spreadsheet is going to do is it's going to look at another spreadsheet, compare what's in that spreadsheet to the list of standards we have in this one, and identify the deviations. And then I can copy all of that right over to my model review form.
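The deviation check itself is simple to picture. Here is a minimal sketch of it outside of Excel, with made-up file names and a plain CSV layout (the real comparison lives in the linked spreadsheets):

```python
# Minimal sketch of the deviation check: compare family names extracted from
# the model against an approved kit-of-parts list. File names and CSV layout
# are made up for illustration.
import csv

def load_names(path, column):
    with open(path, newline="") as f:
        return {row[column].strip() for row in csv.DictReader(f) if row[column].strip()}

approved = load_names("kit_of_parts.csv", "FamilyName")     # the standard family library
in_model = load_names("model_families.csv", "FamilyName")   # exported by the Dynamo graph

deviations = sorted(in_model - approved)   # families in the model that aren't in the standard
print(f"{len(deviations)} non-standard families found:")
for name in deviations:
    print(" -", name)
```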
And I've got my 18 items checked. We have a kit of parts of over 2,400 families. This checks all of those families against what's in our model in under two minutes. This saves us a ton of time. We had 300 model reviews this year. The payback on this was like, a day.
There we go. One of the other tools that we've started using recently is BIMHelper tools. And this is a pretty cheap little tool up on Autodesk's website. You can download it, it's some independent developer that just had an idea and pulled some stuff together. And we use this to check the families that are submitted to us for use in projects that aren't from our kit of parts. Unsurprisingly, 2,400 families is not enough to cover all the projects we will ever do.
So when new stuff comes through, we use this tool to check against all of our requirements. Are there room calculation points used? What version is it saved in? We send this information back to the consultant. We say, fix it. And then they know exactly what it is they need to fix. Where's the value? This saves, on average, 36 minutes per model review. Not per family, but per batch of families for a model submittal. Again, we've got 300 of those this year.
Saved us 180 hours. The payback was less than two weeks. And you're going to see that some of these tools, a lot of these tools really, overlap a lot. But they do things in different ways. So one tool might catch something, because of how it functions, that another tool might have missed. What I like about Dynamo, specifically, is the flexibility of it.
Anything I can come up with, anything anybody on the Dynamo website is willing to coach me through, I can pretty much do. The next tool, I think, is the Autodesk-- that's the old version. I didn't save it. Sorry. The Autodesk Model Checker-- it's actually the model checker configuration. Yes.
CHRIS PITTMAN: OK, so I'm going to come at this slightly from the design professional's point of view, because I'm an architect. I'm a BIM manager, so we're working on a project at the airport now that's in four parts. Half of it is with Jacobs, half of it is with [? HNTV. ?] So we're going to be expanding the airport, I think, 35%, or something like that.
BRENDAN DILLON: So Denver International Airport has 111 gates currently. We're adding 39. As a point of reference, LaGuardia has 40.
CHRIS PITTMAN: So it's-- concourse C is about 16 gates; concourse B, all said and done, will be seven brand new gates. So what we're dealing with in terms of users right now is 150-plus users within the BIM 360 platform. Between the two projects, concourse B and concourse C, there are 20-plus models. I say plus because that might change daily. I'm not telling you that.
We're just issuing 30% on concourse C this afternoon, and we're already at a million-plus elements just in that model. Total 650,000 square feet, which is always subject to change. And then six total phases, from early enabling packages all the way up to finishes and renovation of the existing concourse. And within that-- I feel like that number is low-- we have potential for 500-plus assets. So the assets are all the stuff that Brendan was talking about earlier, in terms of the maintainable devices and equipment within the airport that they need to track over the course of its lifetime, from the time we draw it in the model to the time it's actually installed in the building and turned on.
So the project got awarded, and we were, for lack of a better word-- I was a little panicked about how we were going to track all of these over the course of multiple years. And it was kind of nice; at the time, Eddy, I think, called up and introduced me to Marcus, and they had this tool that basically plugged into Navisworks to allow us, for lack of a better word, to check the stuff automatically. To check to make sure the design team is actually following the DSM from Denver, and that we're giving them a product that we can be proud of, and that actually works for them, and we don't have to come back and constantly redo it.
So our old manual way was, there were schedules set up within each Revit model. And each discipline has their own Revit models. And we would check this schedule against all the asset types and all the parameter data that Denver wanted us to fill out. It's really manual, cumbersome, and error-prone, because we're counting on a human-- who's burnt out from working long shifts and putting all this stuff in-- to actually check this box. So if they don't check that box, it doesn't show up on the schedule, and then it potentially gets missed. And it shows up in one of his reports. And I don't want to do that, because it's extra work.
So luckily, this tool came along. And I'm going to summarize it the best way I can. What it's doing is checking all of these assets against the UniFormat classification. So if these assets show up in that UniFormat section, they, in theory, should have all of these parameters filled out. So we have an example here. It's a series of search sets and a picture of the model. And it's checking to find any of these elements within the model, and then the pass rate of those elements.
So, for example, here we have-- I think it's a panel or a transformer-- and it's checking to make sure, did we fill out the asset? Did someone check the box for yes or no, it's an asset? They didn't do it, so they failed that one. The asset function area, which is a parameter that Brendan is asking us for-- is it filled out? And it's filled out correctly, in this case, so they pass that. The asset location-- it doesn't have an asset location filled out, so they failed that.
And then the asset status-- for us, it's design up until it goes into the building, and then the contractor takes it over. So we passed that. And then the asset type, they didn't get right, or the mark. So it's giving each one of these different assets a pass/fail rate, and it allows us to quickly check the model and then go back and fix it, with the element ID that we have right there. And it's actually going to turn it bright red within the [INAUDIBLE]. So if you're more graphically inclined, like myself, you can quickly see where the problem is.
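A stripped-down sketch of that pass/fail logic is below; the check names, accepted values, and sample data are illustrative, not the actual DSM rules or the tool's internals:

```python
# Sketch of a per-asset pass/fail report of the kind described above.
# Check names, expected values, and sample data are illustrative only.
CHECKS = {
    "Asset (Y/N)":         lambda a: a.get("is_asset") in ("Yes", "No"),
    "Asset Function Area": lambda a: bool(a.get("function_area", "").strip()),
    "Asset Location":      lambda a: bool(a.get("location", "").strip()),
    "Asset Status":        lambda a: a.get("status") in ("Design", "Construction", "Installed"),
    "Asset Type":          lambda a: bool(a.get("asset_type", "").strip()),
}

assets = [  # e.g. exported from the model with their element IDs
    {"element_id": "412771", "is_asset": "", "function_area": "Concourse B",
     "location": "", "status": "Design", "asset_type": ""},
]

for a in assets:
    results = {name: ("PASS" if ok(a) else "FAIL") for name, ok in CHECKS.items()}
    rate = sum(r == "PASS" for r in results.values()) / len(results)
    print(f"Element {a['element_id']}: {rate:.0%} pass -> {results}")
```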
And then a nice bonus that Marcus pointed out, and I absolutely love this part. So we have, I don't know how many rooms in the model right now, 300-plus, and each one needs to follow a specific naming convention per the Denver airport. It can become kind of cumbersome, especially with a lot of people in the model copying rooms and placing them. And they don't go back and fix it. So once a week, I can go in here and run this report. And I quickly see which rooms and which spaces are actually named correctly.
Then I can quickly go back and fix them too. So all in all, I can't count how many hours this has probably saved us. We run it on a weekly basis, along with our clash reports. It's all using [? regex, ?] which I'm trying to teach myself. It's kind of difficult. But we can even add onto this if we need to. If we add more asset type stuff, we can add on to what it's checking. So it's very flexible and very functional for us.
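As a small example of the regex approach Chris mentions, a room-name check might look like the sketch below; the naming pattern shown is hypothetical, not DEN's actual convention:

```python
# Sketch of a regex-based room-name check. The pattern below is a hypothetical
# naming convention (e.g. "B-L2-1234 ELEC ROOM"), not an actual owner standard.
import re

NAME_PATTERN = re.compile(r"^[A-C]-L\d{1,2}-\d{4} [A-Z0-9 ]+$")

rooms = ["B-L2-1234 ELEC ROOM", "b2-Elec", "C-L1-0042 MECH ROOM", "Room 7"]

for name in rooms:
    verdict = "OK" if NAME_PATTERN.match(name) else "RENAME"
    print(f"{verdict:6}  {name}")
```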
BRENDAN DILLON: So to add on to the time saved, the time that it's saving Chris is also time it's saving me. A model that is poor takes probably four times as long to check and document the issues as a model that's actually meeting our standards. So having content come in that, even if it's not perfect, is significantly better than poor, is still saving me a ton of time.
CHRIS PITTMAN: Thank you.
EDDY KRYGIEL: All right. So at Autodesk, we also understand that not everybody wants to learn, or can learn-- I am not capable of learning Dynamo. I've tried; it doesn't happen. English is still my first language; it might also be my second. Dynamo is not making it to the third. So, on to the things I'm going to show you: we have recently released an update to our BIM interoperability tools.
So if you're familiar with biminteroperabilitytools.com, that's a website that is hosted and owned by Autodesk. And we add tools on there for Revit. Because we just released the beta, if you would like information about it, or some of the things I'm about to show you, grab that QR code. It will send me an email. I will be happy to forward you the templates and dashboards I'm about to present.
This will be posted online. So once you see this slide, after this slide is gone, that's it. And it goes away forever. So what the tool is, is a two-part tool. One part is the Model Checker for Revit. So this is an app, like I said, an add-on for Revit. It sits on top of Revit. And it allows you to check the model. So if any of you are familiar with the other tool, called Model Review, that's inside the platform right now.
Very similar feature set, except this has some additional value to it. So some of the things that we added to it are not only the ability to check against a set of standards, like client standards that we have-- in this particular tool we've baked in COBie standards, [? USAS, ?] the state of Tennessee. You can see some of the others-- the PSU standards-- down there along the bottom. Denver standards will be added to those in the upcoming future, in the next few weeks.
BRENDAN DILLON: Will be?
EDDY KRYGIEL: Happy to send them to you. They're not in it just yet. But they will be. You can also export all of this to Excel. And that's really where the value stream in this particular tool comes from. Because in a lot of the previous versions of the other model checkers, once you gather the data, it exports all of the information to HTML. So you have this nice, long, 150-page website that you can pan through and look at all the things you didn't do right, which is helpful.
But having a lot of data isn't useful if you can't read the data or consume the data. So if I bake it into an Excel format, I can consume that in a dashboard, like Tableau or PowerBI. So this will run the check. And then there's a tool that will actually do the configuration. So in this particular tool, you're looking at the dashboard. It sits on its own tab within the Revit product.
And there's a classification manager. You can basically go through, using regular expressions, and set up a series of functions or checks. We have a number of common ones that are baked in. So if you want to run a report on the 20 largest families in your model, or if you want to check to see if there's imported CAD content, you can do all of those things without having to do any programming.
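Once the check results land in Excel, the post-processing can be as simple as the sketch below; the file name, sheet name, and column names here are assumptions for illustration, not the tool's documented schema:

```python
# Sketch of post-processing an Excel export before it goes to a dashboard.
# File name, sheet name, and column names are assumptions for illustration.
import pandas as pd

families = pd.read_excel("model_checker_export.xlsx", sheet_name="Families")

# The twenty largest families, the same report mentioned above.
largest = families.sort_values("SizeMB", ascending=False).head(20)
print(largest[["FamilyName", "SizeMB"]])

# Flag anything that looks like imported CAD content.
imports = families[families["FamilyName"].str.contains(r"\.dwg|import", case=False, regex=True, na=False)]
print(f"{len(imports)} families look like imported CAD content")
```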
So the idea is to make this more consumable. One of the features that we are also adding to the tool is the ability to check against best practices. So common best practices within the Revit platform. Two examples-- anybody here import CAD as opposed to link? Thank god. You guys are the best.
BRENDAN DILLON: I have a bat so that I can hit them over the head.
EDDY KRYGIEL: I did this one time and literally everybody raised their hand and said import is better than link. It's not.
BRENDAN DILLON: These are the same people that do in-place families, right?
EDDY KRYGIEL: Likely. So we can check against things like that. The other one that we tend to get a lot is on model size. Anybody here believe that model size is a true determining factor of performance? OK, if I've shamed you into not raising your hand, I want to point out that, while that might partially be true, you can actually have a very small model that performs poorly because you've imported an exploded CAD file, as one example. So a small model can perform worse than a large model, depending on how well the model is built. And what we would like to help Revit users do is understand which are the triggers for better performance and worse performance, and help you be able to tune that model so you don't have to wait 20 minutes for the model to open.
BRENDAN DILLON: Does Autodesk have a model size recommendation?
EDDY KRYGIEL: No. So where he is leading is that, on the website in 2008, we posted a model size recommendation of 250 megs. And then promptly forgot about it. It sat there for 12 years-- sat there for 10 years. And we recently just removed that. So we do not have a model size recommendation.
But what we do recommend is using best practices. So in this configurator tool, you can configure all of the settings. And as I noted, there are a bunch of preset configurations. So how all of this works, then: it exports to an Excel file. And then we are working on creating a series of PowerBI dashboards. So PowerBI, for a single user, is a free tool.
You can download it off the Microsoft website. It installs in a Windows PC environment. This is page one of the dashboard. So we're checking for really basic component information. And in those six boxes in the center, we've added a little bit of explanation to help you understand what we're looking for as a best practice. So this 04106_WyandotteLofts model, this was the sample file that was put on the Revit 2008 install CD. Not DVD, but CD.
So we grabbed that and upgraded it to 2019 and ran it against what we consider best practices. So with all of those little 'i' icons, the six 'i' icons, you can select any one of those. It will pop up a series of notes to tell you some of the specs and requirements that we have suggested that you use as a best practice. So what are those? In simple terms, 500 errors and warnings.
That's not a hard number. It really doesn't impact performance if you have 501. But psychologically, we found that if you put more than 500 errors and warnings on a model, people don't change them. They'll go to 1,000, 2,000, 6,000, but somehow less than 500 seems to be psychologically manageable. Imported content-- please don't import, link. You sound like you all know this; that's a good thing. I thank you for that. But we do check for imported content.
A percentage of views not on sheets. So while you might be clipping more views, those do impact performance after a while. And in another one of those weird psychological moments, if that number seems to exceed 20%, people stop cleaning up views. So we have a way for you to check against that. Purge unused-- it's kind of the same thing; within a 15% range is what we do.
But we base that on element count. So the idea being that, if your file is large, or your project team is large, you will have more elements in the model. You'll have more people working in it. The project will go from SD to DD to CD. Element count will increase. So it's a percentage of the overall count, as a measure of the level of complexity of the model. In-place families, very similar. Except we're basing this one on actual file size, with the idea that there are very good reasons for some in-place families.
BRENDAN DILLON: No.
EDDY KRYGIEL: Depends on your standards and your particular owner. But we have that tool in there for a reason. And then, families over five megs. We tend to find that families over five megs are things that tend to have imported 3D [? DWG ?] content. They're overly detailed. They have too many components in them. If you are an advanced family editor and you know what you're doing, you can disregard that. You can change it to anything you want. But we really recommend you try to keep your family file size under five megs.
BRENDAN DILLON: Furniture.
EDDY KRYGIEL: Furniture is one. [INAUDIBLE] Your file size goes through the roof really fast. So with that, you can check against those things. So we looked at some of that data right here. On a 600-meg model, this takes around eight minutes to run. So it will run it, it'll export it. It's about a 30-second process to pull that file, or update that file, into this PowerBI dashboard.
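Pulling those rules of thumb together-- 500 warnings, roughly 20% of views not on sheets, unpurged content within about 15% of element count, families under five megs-- a minimal sketch of the checks might look like this (the thresholds are the ones quoted above; the input numbers and library path are illustrative):

```python
# Sketch of the rule-of-thumb model health checks described above. Thresholds
# are the ones quoted in the talk; input numbers and paths are illustrative.
from pathlib import Path

def health_report(warnings, total_views, views_not_on_sheets, unused_types, element_count):
    checks = {
        "Warnings at or under 500": warnings <= 500,
        "Views not on sheets at or under 20%": views_not_on_sheets / total_views <= 0.20,
        "Unpurged types at or under 15% of element count": unused_types / element_count <= 0.15,
    }
    for name, ok in checks.items():
        print(f"{'PASS' if ok else 'FLAG'}  {name}")

health_report(warnings=1240, total_views=640, views_not_on_sheets=180,
              unused_types=2100, element_count=1_050_000)

# Flag family files over five megabytes in a content library (placeholder path).
LIMIT_MB = 5
for rfa in sorted(Path(r"C:\BIM\Families").rglob("*.rfa")):
    size_mb = rfa.stat().st_size / (1024 * 1024)
    if size_mb > LIMIT_MB:
        print(f"{size_mb:6.1f} MB  {rfa}")
```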
And then on that first tab, we go over the general requirements. This is specifically around worksets and worksharing. So one of the things we also see quite a bit is-- obviously we've got imported content listed there on the left-- Workset1. People either don't rename it, or they put all of the assets on it. Why have a workset called Core if there are only eight objects on that workset?
I don't know. But it's another way for you to check whether or not people are using your standards, or using your best practices within your office, in the preferred way. We have a list of the design options and the number of elements on each design option at the upper right. And then generic models, which is a big issue for some people. If you have generic models, you can hunt them down by element ID and either put them in the right classification, or just get rid of them altogether.
From there, we move into families. So we've talked about families over five megs. The list on the left is the 20 largest families within the model. If they are over five megs, they will flag red. I've personally seen families as large as 40 or 45 megs. I've seen models smaller than 45 megs. But we do get an overall family count in the center, and the number of in-place families there at the bottom. And then each-- if anybody is not familiar with PowerBI, anything in PowerBI is actually selectable and filterable.
So if I select any of the colors on those wheels in the center, as an example, it will filter the content down and show me what falls into that specific category. So I can choose structural elements down there at the bottom, and it will show me that I have 81 families that are under one meg, and then list all of those families, or any of those in-place families, over there on the right-hand side. So there's a lot of value in it. And it takes that big data stream and makes it consumable by any end user. So the goal being that, if you are a BIM manager, you could require your project team to run this report on a weekly basis. It takes 10 minutes of their time to run it.
They upload it to PowerBI and then they can share that dashboard with you as a BIM manager. You don't have to be reactive, you don't have to wait for the model to break to fix something. You can have them check it on a regular basis and make sure it's running efficiently. The story I will give you about that is, we worked with a firm when this whole issue came up. They had a large health care project. They had 40 people working in a single model. The model was around 900 megs.
It was taking 23 minutes to open or save. So 40 people, times 23 minutes a day, open, save. We were basically using an entire week's worth of time every day just to open the model once, save it at lunch, and save it and close it at the end of the day. That's 40 hours' worth of work every day just in opening and saving. By following best practices, we got that model down to performing-- file size was lowered to about 600 megs. And it was opening in 110 seconds, so just under two minutes. Huge time savings, right? It took us about six weeks of cleaning to get the model there. We got it there, everybody high-fives, great job, pat you on the back. We checked back with the team three weeks later. All of the bad content that we took out, somebody had gone in over a week and put it all back in.
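A rough back-of-envelope on those open/save numbers (assuming about 23 minutes per open or save and three syncs per user per day, as described):

```python
# Back-of-envelope: daily staff time spent just opening and saving the model,
# assuming ~23 minutes per open or save and three syncs per user per day.
users = 40
minutes_per_sync = 23
syncs_per_day = 3  # open in the morning, save at lunch, save-and-close at night

hours_lost_per_day = users * minutes_per_sync * syncs_per_day / 60
print(f"About {hours_lost_per_day:.0f} staff-hours per day")  # roughly a full work week, every day
```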
BRENDAN DILLON: Was it furniture?
EDDY KRYGIEL: Some of it was. So just like your car, you wouldn't change the oil in your car one time and walk away. You don't wait for your engine to fall out to replace it. Ideally, what you're doing is regular maintenance on the thing, in small increments, to make sure that your performance stays optimal. So that is the goal of what we're trying to do. Last tab here.
Views not on sheets-- you can see the list of views that you have, the views not on sheets, the number of model groups-- that will also show detail groups, which this particular model doesn't have any of-- the number of errors and warnings you have, and then a list and count of what all of those things were. So you can decide what your priorities are, what needs to get cleaned up, and how best to address those problems. Mark?
MARK HUGHES: So we've talked a lot about quality control on Chris's side, quality assurance with Brendan on the project side, and a tool that can probably be used for both with Eddy's Autodesk stuff. And that's great for design. But assuming we've done all that, we've got our content correct, and then we've produced our documents, we have to have somebody build it.
And the contractor takes that information and consumes it. And the owner, in this case Denver airport, takes that information. And we distribute it out to quality assurance inspectors. We've got anywhere, depending on the time of year, with airfield projects running and facilities projects running, between 60 and 90 field inspectors that need to have access to the latest approved content from the design team and from contractors-- RFIs, contractor quality control plans, jurisdiction reports, things like that.
So we were finding that it was difficult to perform that QA process by hauling around plans and specifications and code books and pieces of paper to write on. So at Denver, we started about three years ago, after our hotel project. We saw how successful the contractors were in utilizing BIM 360 Field. We started with a couple of pilot projects. Three years ago, we started with one project. And a month later, some of the inspectors saw it and said, hey, I want to be on that too. So we went to three projects.
By the end of the year, we were at 60 projects. Currently, we're at 194 activated projects that we're managing and monitoring on a daily basis with BIM 360 Field. For all of our projects on the concourse expansions, Great Hall, Peña Boulevard, airfield design and construction projects, landside projects, we're running 60 to 90 inspectors, all with field access.
And they won't go back to paper, which is a good thing. But we've taken those quality-controlled documents and utilized them, pushing them out to our field inspectors so that they have a better process to work with. This is the other side for your contractors that are working out there. We're also inviting all of the designers into the process. And we have a mandate from the city and county of Denver. Their city auditors are invited to every single project. Why?
We proved a process that works. They want to audit that process and use that as a check against us. The FAA is requiring a BIM process. We're utilizing these tools for all of our airfield projects to provide reporting to them on a weekly basis. Typically, in the past, it would take us months and months and months to compile documentation to submit to the FAA for final approval of a project and payment. We're doing that on a weekly basis now. Our quality assurance inspectors are getting validation of material placed, finishes, all the work that's done. We generate the report. It's reviewed by our Senior Chief Inspector. It goes off to the FAA weekly.
Having all that information is great. Having a lot of data that doesn't mean anything is not good for us. So we really streamlined and optimized that process so that we're really focusing on information that can be captured. We really limited access for our field inspectors, because their focus is to do quality assurance on approved documentation and report deficiencies, issues, and noncompliance reports based on that. That is the requirement for those people. We are not there to make judgment calls.
It's very specific, black and white reporting. This tool allows us to do that based on the quality control documentation that we get from the design teams. Other things we're doing-- we've deployed iPads. I currently have about 120 iPads floating around the airfield and the facilities that we manage on a daily basis. Pretty straightforward once you get it.
We set them up all the same. They're totally open. If someone were to take them, we can shut them down pretty quick. But it gives access to all of our projects. If I have an inspector on 10 projects, he's got 10 projects on his iPad. Depending on his login, he may have 15 projects. Personally, I have all 194. It's really fun to synchronize all those.
What do we do with them? We capture a lot of field information that we then share back to our design teams through our model reconciliation process. We'll work with a contractor; the contractor is installing the equipment. He's got a piece of equipment. He knows what the make, model, serial number, and manufacturer are. Our QA inspectors go in and validate that they've captured that information properly. And we send that information back, and it gets linked into the models, and the data is updated.
Then our asset management team can take that information and consume it. We're doing the QA on the QC that the contractor is capturing in the field, using these tools. Another thing we're doing is, when we're out in the field, whether it's a new install or an existing asset, we're using these tools to capture that same information.
That's Stephanie and Tim from Denver airport. They go out there on the asset management team and they do ad hoc inspections of work in place as well. So we have a QA on the QA. Our inspectors check it, and they come back and they also check to see that assets aren't missed because we have certain rules for the inspectors. But then there's changes that happen.
We update our asset inventory all the time. So this team goes out and does that. And we're able to use these tools, again, efficiently. We're talking 12, 18 parameters at the most, on a totally empty asset.
BRENDAN DILLON: 18.
MARK HUGHES: 18? Pretty straightforward. What are the efficiencies that we gained on this? And I have to say, it's measured as efficiency. It's not really cost savings until you look at a bigger picture. In one year-- we only surveyed our 41 inspectors at the time-- but everybody can stop reading, there's the big number. That's around 41 inspectors in one year: $650,000. We're running, and we're anticipating running this summer, upwards of 96 inspectors.
BRENDAN DILLON: Possibly more.
MARK HUGHES: And we're probably going to push that project number up to about 220 projects. So this is real savings for us. This is just on time efficiency. Instead of having an inspector who is on six projects, at the end of the day, transfer his notes from his paper to the computer, generate the report, and submit it, he's doing it on the fly.
Yes, we have had some experience with some latency in reporting due to audio problems. The airfield guys sometimes, when they're standing next to the jet trying to dictate, it doesn't work very well. We got over that. We taught them how to use these things. And they can type. And the manpower savings is one thing. The other thing is this. We asked our inspectors, after a year of having iPads, how many of you want to go back to paper and pencil and specifications and plans?
And knowing the size of the projects that they have, we don't have vehicles big enough for them to carry these plans and specs around. We're looking at over 1,100 pounds per plan set. It's a stack about this tall. The building department doesn't even want them anymore. They want electronic copies. Imagine all the revisions. How many packages? Six packages. How many revisions? Who knows. Every set of those is another 1,100 pounds they've got to carry around, and the specifications as well.
BRENDAN DILLON: Assuming they get the right ones slip-sheeted.
MARK HUGHES: Yes, assuming they're slip-sheeted properly. Reproduction costs-- it's probably pretty cheap for that amount of paper. But you look at the fact that we're deploying iPads at 1,200 bucks a pop. And on that, he's got eight projects for every inspector, saving eight times that amount of money. And he's got a survey vest, he just throws his iPad in, he walks around, he drives his golf cart around. He doesn't have to worry about carrying all that paper around. That's a big value for the airport.
BRENDAN DILLON: All right, that's pretty much what we've got. So a couple of additional notes. One, the Dynamo script that we've got set up is available for download, along with the PowerPoint, which will be up shortly. So there's the script. There are the two spreadsheets that do the actual check. And then a copy of our model review form, so you can see how the whole process works out for yourself. And I guess with that, we're going to open it up to any questions. Yes.
MARK HUGHES: How many iPads did we lose? OK, so I can answer this. On the hotel project, and I'm going to point at Josh because he was there-- you guys deployed, what, 60? We lost two of them. Two were lost. I've deployed 120 in the last two years. I have three cracked ones that are sitting on my shelf right now.
BRENDAN DILLON: I was going to say, some of those iPads from the hotel are still functional and in use.
MARK HUGHES: Yeah, from four or five years ago.
BRENDAN DILLON: Six years ago.
MARK HUGHES: But even at that, how much does it cost to print a full set of plans? I had one guy, he had it in the back of his vest. He slipped on the ice and landed on it. Saved his butt, but broke the iPad. I don't care about the iPad. It's $1,200; we'll replace that. I can't replace a valuable field inspector for that. I mean, he came in, he was all upset. Gave him a new one. Go, get back to work.
BRENDAN DILLON: I wonder what the workman's comp savings on that was.
MARK HUGHES: Don't even want to know.
BRENDAN DILLON: I do. So the question was, are we using the models for pay app verification? I would love to. If I went that route right this moment, I would probably not get much traction. But it is on our roadmap to work with our project managers to establish that as a guideline.
MARK HUGHES: I think that the threshold there is the project managers. They're in charge of project controls and funding and we don't really have a say in that.
BRENDAN DILLON: Next question. So the question is, are we using the models for operations and maintenance? So what we use the models for-- there's really a number of things. So first, we connect them to our Maximo maintenance management system, and we populate assets to that system directly from the models. That's number one-- automatically. Yeah, there's always QA and QC going on. Always QA/QC.
So we populate Maximo with it. We also use those models to inform the next project. I was in a presentation this morning where someone was talking about that process, about the value that a model has to the next project that's going to happen in that area. And she said, oh yeah, and 50 years down the line. I was like, six months down the line. The first project to happen in the hotel and transit center after it was finished started less than six months later.
And it wasn't because anything was wrong, it was because that was when it was time to do the next project. So yeah, we are constantly working with that information to inform the next project. It also gets a lot of use from other stakeholders at the airport. It populates our GIS program. It gets used by Denver Fire and Denver Police on occasion when they have a need for an additional layer of information on top of what it is they already have. There aren't many stakeholders at the airport that don't get use out of our models.
MARK HUGHES: There is also a condition parameter built into the models for the asset. So based on the condition level, you can do a visual check. Red is bad, yellow is failing, green is good. So they can utilize that for a quick visual assessment of equipment status.
BRENDAN DILLON: So you're asking about a digital dashboard referencing the model within Maximo? I'm not the best Maximo person. I don't think so, but that doesn't necessarily mean that there isn't. So the question is, have we started taking those savings that we're seeing and using that to justify a smaller budget for printing and things like that downstream? At this point, no. We should be there. But again, it's working with the project managers and the contracts.
MARK HUGHES: So from the QA side, it sure justifies our budget for being there. My inspectors cost something every day to have them there. If we can show that we're saving money, even through a process like that, there's value added for the owner.
BRENDAN DILLON: Yes.
MARK HUGHES: The short answer is one hour, because we do onboarding every year. Our big cycle is the airfield summer season. We usually get inspectors on in March, April. And they come in, they say, where's the documentation? We give them the iPad and say, your training is in one hour. You sit down, we'll show you how to use this. We have a process set up where we go through showing the plans. We only have one guy that continued to carry a notepad around with him. And for all the rest, my rule was, if you've got one of these, you can do this. And it has literally been the fastest adoption I have ever seen.
BRENDAN DILLON: When we first started rolling it out as a pilot, we got some grumbling. And then the training happened, and less than three days later, I had inspectors banging on my door saying, I want this on my other projects. Well, hold on, we're doing a pilot, we need to get all the way through the project to assess. No, no, no, I want this on all my other projects now.
MARK HUGHES: And it was interesting. It wasn't project managers, it was the inspectors. The guys in the field doing the work.
BRENDAN DILLON: Are you saying the project managers don't do work?
MARK HUGHES: It was the guys in the field doing the work. I have to say, I have not seen many project managers out in the field with an iPad. I didn't say that.
BRENDAN DILLON: So the question is, if we're moving away from paper, are we also moving away from PDF? And as Mark said, the answer is no. There's always going to be a need for a stamped visual record document, whether that's drawings or specifications. There's always going to be a requirement for that. When our QA inspectors do their daily reports, that report gets generated into a PDF at the end of the day, automatically. And that becomes a system of record. The PDF is a system of record, period.
MARK HUGHES: For the city and county, as long as the data that's required is in there, we accept the PDF on the city side. The jurisdiction, as far as building permits and building inspection, they still require that stamped set. And we're not going to get away from that. But they would sure like to, because on the hotel project, I know that after 87 bulletins, they ran out of physical load capability in the building to receive those documents. We had to ship them to the loading dock, and they sat there on the loading dock because they could not bring them up to the upper levels. Because they were literally 3 and 1/2 feet tall. A full set, 6,800 sheets.
BRENDAN DILLON: And unsurprisingly, within a couple of years, they switched over to accepting PDFs on all projects. In fact, I think it's their requirement now.
MARK HUGHES: We're not at the point where Singapore is, where they are actually doing a full IFC submission on building jurisdiction submittals. But I don't know, maybe we'll get there.
BRENDAN DILLON: I think we have time for one more question. I did it over a weekend.
MARK HUGHES: Some of them we re-tweak.
BRENDAN DILLON: Some of them existed before, but they were really quick. I was fortunate to have really-- I had done Dynamo before that, but not a whole lot. It took a weekend. And it had to be on the weekend because I didn't have any other time at that point.
MARK HUGHES: Anybody want to go take a nap?
BRENDAN DILLON: Yes.
MARK HUGHES: Yes.
BRENDAN DILLON: Thank you all for coming.