Description
In this class, you will learn in detail about various construction workflows at JE Dunn and how the Autodesk technology stack, including Forge, is transforming the company culture by providing access to data as a ubiquitous asset for project stakeholders, business drivers, and key decision makers. (Joint AU/Forge DevCon class.)
Key Learnings
- Identify the challenges in typical construction workflows and how connected data using Forge can help overcome these challenges
- Gain an in-depth understanding of JE Dunn’s vision and how the firm is using innovative solutions and thinking to solve problems that affect the entire construction industry
- Gain overall understanding of Forge as a cloud platform, its integration capabilities, and how to use it to solve your own challenges
- Understand how the partnership between JE Dunn and the Autodesk Customer Success/Consulting Team has been critical to the continued success in working toward JE Dunn’s vision as a unified team—and turning the vision into reality
Speakers
- Mark Stocks: I am an Enterprise Architecture Director with a passion for developing and implementing strategies to improve business processes and systems. In my role, I work closely with various teams to analyze the current state of the organization and identify areas for improvement. I have a strong background in technology and have successfully led numerous projects to enhance the efficiency and effectiveness of our operations. I am constantly learning and staying up to date with the latest trends and best practices in my field. In my free time, I enjoy traveling, exercising, and spending time with my family.
- Saikat Bhattacharya: Saikat is a Senior Technical Consulting Manager with Autodesk Consulting, with a primary focus on integrations of enterprise applications with Autodesk’s Forge Platform and customization of Autodesk’s AEC products using the APIs. Prior to working with Autodesk Consulting, Saikat was a member of the AEC workgroup of the Autodesk Developer Network (ADN) team, providing evangelism, support, and training and delivering technical presentations to third-party developers. He joined Autodesk in 2004 and has prior experience as a GIS software developer and as a project architect in the construction industry. He holds a Bachelor’s degree in Architecture and a Master of Science degree from Rensselaer.
- Debraj Banerjee: Has worked for JE Dunn since 2014. His core areas: information architecture and data science, enterprise apps and data-driven cloud services, and dissection of Microsoft Office and custom add-ins. Before joining JE Dunn, he worked for Microsoft from 2007 to 2014 in various roles, including product engineering, technical consulting, enterprise apps and data architecture, and many internal CoE research initiatives. Currently works on: integration of disparate systems, data aggregation, app dev, SOA framework and data analytics, and dev/test/deploy automation of internal products and tools. Leads software development (PLM) for the Estimating and BIM/VDC systems at JE Dunn. Favorites: machine learning and semantic computing. Holds a Master of Science in Computer Applications and Information Technology. Loves to spend time with his wife, Sarmistha, and two kids, Sanandi and Sukalpa. LinkedIn: https://www.linkedin.com/in/debraj-banerjee-38b14419/
MARK STOCKS: So we're going to go ahead and get started. Apparently, there's a system issue, or the network went down out there. So they're having some technical difficulties. So we're just going to go ahead and get started.
So our talk today is going to be how JE Dunn is using Forge to connect our data in construction workflows. My name is Mark Stocks. I'm the director of information architecture and development for JE Dunn. I've been with the company for about 11 years.
DEBRAJ BANERJEE: This is Debraj. I'm the information architect at Microsoft [INAUDIBLE]. It's been like three years, or technically five years, with JE Dunn, when we all started with Autodesk, JE Dunn, and Microsoft. My core areas there are data integration, aggregation, and automation.
SAIKAT BHATTACHARYA: Hello, everyone. My name is Saikat Bhattacharya. I'm part of the Autodesk Consulting group. I work as a technical consulting manager. My primary focus is AEC integrations and customizations for enterprise customers like yourselves.
I am based out of San Rafael in California. I've been with the company for 13 years. I've worked on different roles and engagements over the last few years.
MARK STOCKS: So quick background-- JE Dunn, we are a Kansas City headquartered company. We got about 20 offices throughout the US. We were started in 1923. And we are still family and employee owned. That's our headquarters right there.
Now, we're in the ENR top 20 in our category for GCs. And this next number is actually my favorite number, because the average tenure of an IT employee at JE Dunn right now is 8.15 years. The industry standard right now is 3.2, but we have 8.15 across the board.

And it says a lot about the type of company that we work for. So really quick, I'm just going to run through a couple of projects that are actually completed right now. And this is some of the type of work that we're doing.
This right here is the University of Kansas hospital. This is the North Tower. So it was an expansion of an existing hospital. It's 377,000 square feet. And the architect was CannonDesign.
This is the University of Texas student housing. It's in Richardson, Texas, in the Dallas area. And the architect was Jacobs. And it's about 74,000 square feet. And it was completed last December.
Now, this building is one of my favorite buildings. It's the Kansas City headquarters for Cerner. And this building is solely developers. So this is phase one and phase two.
Funny thing about this building: it had a $10 million change order to add a cafeteria later. So it's pretty crazy for a change order. But this is phase one and phase two.

They're going to have two more buildings as well. And 100% of the developers for Cerner are going to live on this one campus. It's a really cool area.
We're going to break down our scenario really quick by numbers. You're going to hear us talk about a product that we like to call a Dunn Dashboard. Dunn Dashboard is a Microsoft SharePoint website that is our project management website. This is what we do all of our integration in.
And based off of our workflows, we've created about 2,900 Dunn Dashboards. Now, 500 of those Dunn Dashboards right now are actual active projects that are getting worked on. And of those 500 Dunn Dashboards, we're actively syncing 800 projects into our HQ environment. And all that happens behind the scenes.
Now, our user base-- so we've got about 60,000 external users. Our internal company only has 3,000 employees, but we have 60,000 external users. 28,000 right now are working on these active projects.
And of those 28,000, we've got 20,000 that are actually getting synced over into HQ. And they're being run on the services within Autodesk. And this, again, happens all behind the scenes.
So our relationship with Autodesk started about five years ago. We had this idea for a cost estimating platform. And in that platform, we wanted to extend it and kind of connect it to the model within Revit.
And so we reached out to Autodesk. And that was about five years ago. That project was a huge success, and it really took us to the next level of connecting our data.
SAIKAT BHATTACHARYA: So our engagement with JE Dunn has been through the enterprise business agreement. What the EBA allowed both teams to do is work towards the unified and overarching vision that JE Dunn always had. But for us to achieve that vision, we took it a step at a time. We focused on integrations that delivered measurable process improvements for the organization as well. So over that period of time, we gained the value of being a trusted JE Dunn partner through the engagement.
MARK STOCKS: So quick note though, when I go home and I'm on the cell phone, my 7-year-old daughter always asks me, are you talking to Saikat? And so that kind of shows you the type of relationship that we have. And we don't see Autodesk as a third party consulting company.
They are part of JE Dunn. They have hardhats. We give them swag. They are truly a part of our JE Dunn team.
SAIKAT BHATTACHARYA: Thank you, Mark. Thanks for that.
MARK STOCKS: OK. So this is what we're going to talk about today really quick. This is our agenda. We've got five areas of integration that we have completed.
So the first one is, obviously, Project Administration. That's part of the HQ. The second one is our Cost Estimating platform, which is what we refer to as Lens.
And then we have the third, which is Clash Coordination, how we're handling clash coordination differently. Field View is number four. And the fifth one is how we're harvesting all of the data that's actually being created within Autodesk and pulling it back into our system to enable better reporting.
So this is our landscape right now. This is kind of how we have our system architected. The gray area is all of our third party cloud environments.
So we harvest all of our Yammer data for sentiment. So we track everything that gets posted in Yammer, and we pull it back into our system. Zendesk is our third party help desk system. We do the same thing with Zendesk, pull all that data into our system. Then we trend it.
And then the big circle is, obviously, Autodesk, which is all the different services that we have with Autodesk. Now, we have two separate layers in the blue area, one what we refer to as Dunnpedia. And so Dunnpedia is our internal data warehouse.
It is the area where we take all of our third party data and some of our internal data and aggregate it into one centralized data warehouse. The other one is CMiC. CMiC is an Oracle-based construction ERP. All of our project management, all of our time tracking, all of our HR and payroll data flows through CMiC. So we have a direct connection between CMiC and our Dunnpedia.

Now, the green area is all of our custom applications. So everything that we do, we run through an API call. All of our custom applications, from our estimating, which is the dark green, which is Lens, everything uses an API that talks back to our Dunnpedia interface. And so the green area is all of our custom applications.
So first level is project administration. Now, this is kind of the scope for our project administration integration. We created an internal application called TPS. I don't know if anybody's ever seen Office Space, but my boss had not.
So in a meeting in Denver, we sat down. And we were talking about creating this business workflow that could build a project like a project startup. And it had to, you know, go to Active Directory, had to create email boxes, had to build a website, had to go out to third party services.
So there was a huge workflow process. And someone, as a joke, said, we should call it Total Project Solutions, so we can have a TPS cover report. My boss did not know it. And he said, love the idea. So--
[LAUGHTER]
--at dinner that night, we actually showed him the video. And he's like, you know what? We'll go with it. And so it's kind of an internal joke at JE Dunn.

So everything starts with TPS for us. This is where you get a job number. If you want to start charging time, if you want to start charging money, anything, you've got to start a TPS. And it's the single location for everything that starts within our company.
So once that job gets created in TPS, we push it into our ERP system through a database package. And then, again, it resides in Oracle. When everything hits Oracle, we have database triggers that start firing off API calls.
And, again, everything we do is within APIs. So once that data hits Oracle and it goes into that API, we start firing off calls to Forge behind the scenes. So our users don't know anything that's getting created behind the scenes.
So we start creating the Forge layer. We start creating the project, the users. All of that data gets created the exact same way every single time.
Once that data lives there, we have another API layer, which is the second gray box. And that's what we're using for extracting some of our data for the analytics. So how many clash items, how many field issues, checklists, all that data is then pulled back into our Dunnpedia environment, so we can do better predictive analytics and use that data to essentially make better decisions.
So this is a screenshot. It shows up kind of badly up there. But this is a screenshot of our TPS application. And what this does is give a user a very simple way to create a job number.

Now, behind the scenes, there are a lot of things happening. We're creating Active Directory accounts, Active Directory groups. We're creating stuff in Office 365.
We're creating stuff in Autodesk. But it's a single location that our users go. And they're able to create everything in one spot.
Now, the win with this, which is great, is it's a single source. But we're able to apply business rules off of this data. So I know that if a project is getting created in the Midwest or South Central, we would use a different template within Field.
So as the Field project gets created, we decide, well, if it's a South Central project, it's going to get this template. And then same thing for Glue. If the Glue project gets created and it's a Midwest template, we're going to add the Midwest VDC manager onto that project as an admin.
So all of those business rules get applied behind the scenes, and our end users don't need to know anything. All they know is that they're just going to click a blue button. And within the next year, if we add Docs, there will just be another button there. Click that button, and our end users will be off and running.
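To make the business-rules idea concrete, here is a minimal sketch, assuming a simple region-keyed rule table; the job shape, template IDs, and email addresses are illustrative, not JE Dunn's actual configuration.

```typescript
// Hypothetical sketch of the kind of rule table described above: the region on
// the TPS job record decides which BIM 360 Field template to apply and which
// regional VDC manager to add as a project admin. Names and shapes are illustrative.

interface TpsJob {
  jobNumber: string;      // unique job code created by TPS
  region: "Midwest" | "South Central" | "East" | "West";
  projectName: string;
}

interface ProvisioningPlan {
  fieldTemplateId: string;   // template to apply when the Field project is created
  glueAdminEmails: string[]; // regional VDC managers added as Glue admins
}

// Region-driven business rules (values are placeholders, not JE Dunn's real IDs).
const RULES: Record<TpsJob["region"], ProvisioningPlan> = {
  "Midwest":       { fieldTemplateId: "tmpl-midwest",      glueAdminEmails: ["midwest.vdc@example.com"] },
  "South Central": { fieldTemplateId: "tmpl-southcentral", glueAdminEmails: ["sc.vdc@example.com"] },
  "East":          { fieldTemplateId: "tmpl-east",         glueAdminEmails: ["east.vdc@example.com"] },
  "West":          { fieldTemplateId: "tmpl-west",         glueAdminEmails: ["west.vdc@example.com"] },
};

export function planProvisioning(job: TpsJob): ProvisioningPlan {
  // Every downstream call (Field, Glue, Docs) is keyed off the same job number,
  // so the end user only ever clicks one button in Dunn Dashboard.
  return RULES[job.region];
}
```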
SAIKAT BHATTACHARYA: So from the integration standpoint, this was achieved by Consulting stepping in and assisting JE Dunn in building the middle layer. This middle layer essentially uses the BIM 360 account administration APIs, which are the HQ APIs.

And these are REST-based APIs, for some of you who might be aware of this already. We have performed integrations for project initiation, like Mark talked about; for activation of services, which essentially means you add a project admin programmatically to a service like Glue or Field or Docs, and so on and so forth, maybe a unicorn in the future; and for user management, adding users programmatically into the member directory, as well as the business partner integration from JE Dunn, which, again, flows from TPS and CMiC and now seamlessly gets integrated into the BIM 360 landscape, with HQ being the umbrella for all our BIM 360 services.

One of the biggest testimonials, the quote that we heard from one of the job site engineers the other day, was, hey, every time I'm assigned to a new project, everything seems to be already set up consistently. Now it's getting to the point where I don't remember how to use Field to create a project anymore. I think that really is the power of these integrations, where the user doesn't even see the magic that happens behind Dunn Dashboard with a single click of a button.
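To show the shape of those HQ calls, here is a minimal sketch in TypeScript: a two-legged OAuth token followed by a project-creation request against the BIM 360 account administration (HQ) API. The payload is abbreviated and the endpoints reflect the public documentation of that era; treat the details as an assumption and consult the current Forge/APS docs before relying on them.

```typescript
// Minimal sketch (not production code) of the two calls at the heart of the
// project-initiation integration: get a two-legged OAuth token, then create a
// BIM 360 project through the HQ (account admin) REST API.

const BASE = "https://developer.api.autodesk.com";

async function getTwoLeggedToken(clientId: string, clientSecret: string): Promise<string> {
  const res = await fetch(`${BASE}/authentication/v1/authenticate`, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      client_id: clientId,
      client_secret: clientSecret,
      grant_type: "client_credentials",
      scope: "account:read account:write",
    }),
  });
  const json = await res.json();
  return json.access_token;
}

async function createHqProject(token: string, accountId: string, jobNumber: string, name: string) {
  // The real payload has more required fields (dates, value, currency, etc.);
  // this shows only the shape of the call.
  const res = await fetch(`${BASE}/hq/v1/accounts/${accountId}/projects`, {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}` },
    body: JSON.stringify({ name: `${jobNumber} - ${name}`, project_type: "Office" }),
  });
  return res.json(); // contains the new project id, used later to activate services and add users
}
```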
MARK STOCKS: So the next area that we're going to talk about is Lens, which is our cost estimating platform. Here's a screenshot of Lens. Now, Excel is the best mathematical calculation engine in the world, hands down. Excel is great.

Now, what we use Excel for is a little bit different. We use it as a container. So we wrote a plug-in that sits on top of Excel. We can harness Excel's mathematical calculation engine, but all of the data in our estimates actually lives in a database.

And, again, it's Dunnpedia. So when you create an estimate, you start answering these questions. And there are about 100 questions that you have to answer. Square footage, building type, location, all of these different parameters feed mathematical calculations. And those algorithms actually build a very detailed estimate based off of your building.

Now, this is about 30 years' worth of mathematical calculations that we've been working on. And though the Lens add-in itself is only about five years old, it was converted from a VBScript that one of our estimators literally started writing in Excel 1.0 on a Mac. So this is the actual export of that data.
SAIKAT BHATTACHARYA: This is the same Cerner building.
MARK STOCKS: Yeah. That is the Cerner building. We love the Cerner building. So one of the cool things we're able to do with Lens came out of our original project of connecting cost estimation quantities with the Revit model. When you create an estimate, your calculations, your conceptual estimates, are up here. But once we start connecting it to the model, we can start using actual quantities from the model. And our conceptual estimates start going down, and we start using actual quantities.
And so we start changing that classification of the model. And as that changes, we start using more of the model. Now, what you're seeing here is a different view.
So once we were able to connect those data sets together, we realized that we could publish that data out onto the web and create this immersive feel with LMV, which was another project that we worked on. And so here, you'll see the cost panel right here. And as a user is clicking on the cost over here, they get to really interact with the model.
This session was created from a desktop view. And so if I go in here and I push the view, it's going to actually push the model that the end user over here is viewing. So you notice he can interact. And as he's clicking around, he's able to look at the quantities of the model and actually dive into the details. And this is showing quantity and cost based off of the estimates that were connected from the Revit model to our Lens classification.
SAIKAT BHATTACHARYA: So from, again, the integration and development perspective, what Consulting assisted JE Dunn's team with was a Revit plug-in that was built up over a period of time. This plug-in essentially has a custom UI, which helps estimators do what Mark mentioned: classify the elements based on the structure that JE Dunn owns and maintains. These productivity tools enable JE Dunn's estimating team to go in every day and start classifying the models that they get from the design team.

But as you know, the design also evolves as time goes by. So we actually developed some change management tools, so that every time a design estimation discussion happens and a new model comes out of it, the estimator essentially gets all the deltas against the work they had done within a few minutes and can basically start from the new model. Now, once the estimation process goes through, the plug-in also enables, with a single click of a button, uploading the design model with all the metadata into Forge through the Model Derivative and Data Management APIs.
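As a rough illustration of that single-click push, the sketch below uploads a file through the Data Management (OSS) API and requests an LMV translation through the Model Derivative API. It assumes the bucket already exists and that a valid two-legged token is available; it is not the actual plug-in code, and the OSS PUT shown reflects the API of that period.

```typescript
// A hedged sketch of the "single click" push into Forge: upload the exported
// model to an OSS bucket, then ask the Model Derivative API to translate it
// for viewing in LMV. Runs in Node (Buffer is used for base64 encoding).

const FORGE = "https://developer.api.autodesk.com";

async function uploadAndTranslate(token: string, bucketKey: string, objectName: string, file: ArrayBuffer) {
  // 1. Upload the file to the Object Storage Service.
  const upload = await fetch(`${FORGE}/oss/v2/buckets/${bucketKey}/objects/${encodeURIComponent(objectName)}`, {
    method: "PUT",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/octet-stream" },
    body: file,
  });
  const { objectId } = await upload.json();

  // 2. The Model Derivative API addresses the object by its base64-encoded URN.
  const urn = Buffer.from(objectId).toString("base64").replace(/=+$/, "");

  // 3. Kick off translation to the viewer (SVF) format, 2D and 3D views.
  await fetch(`${FORGE}/modelderivative/v2/designdata/job`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      input: { urn },
      output: { formats: [{ type: "svf", views: ["2d", "3d"] }] },
    }),
  });
  return urn; // this URN is what the SharePoint-embedded viewer loads
}
```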
And from a consumption standpoint, the design model and the metadata, from Forge and from the JE Dunn on-premise data store, get embedded in SharePoint. We achieved this through a custom [INAUDIBLE]. And that's the video that Mark showed, where we were able to, again, use the LMV JavaScript APIs to perform the unique mapping between the cost estimates and the design model elements to get the behavior that you saw in the previous video.

And on top of that, we built a collaboration session capability, which enables the designer essentially sitting in Kansas City to remotely fire up a session where somebody who might be sitting in California can log into the same session, scan a QR code, or maybe click on a hyperlink, load up the model on their smartphone in a virtual reality stereographic view, and follow the discussion.

There was less ambiguity about which element you were looking at, because everybody was looking at the same element even if they were remote. And the conversation was much more productive. With the estimator sitting in Kansas City, the conversation definitely becomes more productive in that sort of framework.
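A hedged sketch of what that viewer-side mapping can look like: a table from cost classification to viewer dbIds, used to isolate and frame the quantified elements when an estimator clicks a cost line. The table contents and helper names are hypothetical; only the LMV calls themselves are standard viewer APIs.

```typescript
// Sketch of the viewer-side mapping between a Lens cost line and the model
// elements it was quantified from. In practice the table would come from
// Dunnpedia, keyed by the classification the Revit plug-in wrote onto each element.

declare const viewer: any; // an initialized Forge Viewer (LMV) GuiViewer3D instance

// costCode -> viewer dbIds, built once the model's property data has loaded.
const costCodeToDbIds = new Map<string, number[]>([
  ["03 30 00 Cast-in-Place Concrete", [101, 102, 103]],
  ["09 21 16 Gypsum Board Assemblies", [240, 241]],
]);

// Called when the user clicks a row in the cost panel.
export function showCostItem(costCode: string): void {
  const dbIds = costCodeToDbIds.get(costCode) ?? [];
  viewer.isolate(dbIds);   // hide everything else
  viewer.select(dbIds);    // highlight the quantified elements
  viewer.fitToView(dbIds); // frame them so the estimator sees what the number represents
}
```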
MARK STOCKS: So the next one we're going to talk about is clash. Now, everybody handles clash their own unique way. We feel like we kind of do it a little bit different.
What we do is we blend our clash data with our ERP data. So we can do clash assignment directly from SharePoint. But what we've also done with Dunnpedia is we've actually put business rules into our clash data.
So when we get a new item into our database, it gets a certain status. And based off of that item, we can change statuses behind the scenes and close items programmatically based off of the business rules that our VDC managers have set. So here's a quick example of how we do clash management.
So, again, you would come to the screen. And this is two data sets that we're blending. That was the ERP data with clash data. And so you click on a clash item. And because the model is in LMV and we have our clash data, we know the camera settings and the visibility settings of that clash item.
And so we're able to click on an item. And because the model is preloaded already, we're able to navigate over there very easily to jump from clash item to clash item in a very easy manner. And, again, we have that immersive feel. And all of our data sets are like this.
So we're able to click on that Start Virtual View, send out that link. And then we're very quickly able to create an immersive view for the end users. So everyone, even if they're in Kansas City or in Boston, or the architects are in Philadelphia, they can still get the exact same view. And we can push them to a point in the model.
And so we use PubNub behind the scenes. So when you're in that view on the other side on the desktop and you want to navigate to a specific clash and they've clicked on the link, you just hit a button called Push, Push Camera. And all of the users that have logged into that session, it will send them to that exact area in the model.
Now, the great thing about this is it's just through a browser. It's HTML5. There's no software, nothing you need to install. If you want to throw it into Google Cardboard, you can easily throw it into Google Cardboard.
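Here is a minimal sketch of that Push Camera behavior, assuming PubNub (which the presenters name) and the LMV getState/restoreState calls; the keys, channel name, and message shape are placeholders.

```typescript
// Serialize the presenter's viewer state and broadcast it over a PubNub channel
// so every participant's viewer jumps to the same spot.

import PubNub from "pubnub";

declare const viewer: any; // initialized Forge Viewer (LMV) instance

const pubnub = new PubNub({
  publishKey: "pub-c-xxxx",   // placeholder
  subscribeKey: "sub-c-xxxx", // placeholder
  userId: "session-user-1",   // "uuid" in older SDK versions
});

const channel = "dunn-virtual-view-JOB123"; // one channel per virtual view session

// Presenter side: capture camera/viewport state and publish it.
export function pushCamera(): void {
  const state = viewer.getState({ viewport: true }); // camera, cutplanes, etc.
  pubnub.publish({ channel, message: { type: "camera", state } });
}

// Participant side: apply any camera state that arrives on the channel.
pubnub.subscribe({ channels: [channel] });
pubnub.addListener({
  message: (event: any) => {
    if (event.message?.type === "camera") {
      viewer.restoreState(event.message.state); // jump to the presenter's view
    }
  },
});
```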
SAIKAT BHATTACHARYA: So, again, from the integration perspective on the clash coordination system, this was a two-phased effort. The first phase essentially started about five years back, where we took the design models that a VDC manager or a BIM manager would be receiving and then ran the clashes. We also built some productivity tools on top of AutoCAD, which would essentially consume some of the Navisworks data and help create these clash spheres, which enabled the VDC manager, again, in a conversation or a triage, to get everybody focused on the clash item.

So this was phase one, where we essentially did a lot of the data mash-up in Navisworks, with the triage performed by the VDC manager using Navisworks out of the box plus some custom integrations with AutoCAD. But, eventually, there was, again, a single push button that would enable the VDC manager to put the design models into Glue. And once it's in Glue, we developed a worker process, which runs nightly, harvests the Glue data into the JE Dunn data store along with attributes like who it was assigned to and when it was closed. And this kind of feeds into the data analytics conversation that Mark talked about.

So last year, what we did was take the clash coordination workflow to the next step: from the worker process that runs every midnight, we started pushing the models directly into Forge. That's essentially, again, using the same Model Derivative APIs and the Data Management APIs on Forge with two-legged authentication. And along with pushing these models, we also started using the Glue APIs to get the specific view information from the models in Glue, mash it up with the views that we get in LMV through the Forge Viewer, and, again, embed all of this in SharePoint, giving you the experience that Mark showed in the previous video.

And because all of this was done using the same framework that we developed for Lens, it enabled us to leverage the same clash coordination workflow in a virtual reality environment. And again, to Mark's point, this is all web browser based. On LMV, it's essentially a VR extension that was developed for this integration. And we've used all of that through the browser to enable viewing in VR.
MARK STOCKS: So the next area that we're talking about is our Field View, which is a 2D to 3D integration. And this project really started when we began looking at our data and realized that we have all this data for a certain group of our project team, but we were really leaving out the actual field guys.

Now, typical field guys do not want to get into a 3D view. They want to be in 2D all the time. And so we thought this was the best way that we could start getting these guys comfortable going from 2D to 3D.

And what this is, again, is the same model. But we've created a second plug-in that actually builds a 2D version of the model. And the interaction is that you can start in the 2D view. And as you navigate the 2D drawing and click on an object, it's going to select the matching 3D object in the 3D view.
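Here is a sketch of one way that 2D-to-3D click-through can be wired in LMV, assuming two viewer instances translated from the same Revit model so that external IDs match; it is illustrative rather than the plug-in the presenters built.

```typescript
// When the user clicks an object on the 2D sheet, look up its Revit external ID
// and drive the 3D viewer to the matching element.

declare const viewer2D: any; // LMV viewer showing the 2D sheet
declare const viewer3D: any; // LMV viewer showing the 3D model
declare const Autodesk: any; // global provided by the Forge Viewer script

let externalIdTo3dDbId: Record<string, number> = {};

// Build the externalId -> dbId table for the 3D model once it has loaded.
viewer3D.model.getExternalIdMapping((mapping: Record<string, number>) => {
  externalIdTo3dDbId = mapping;
});

// When a 2D element is picked, find its twin in 3D and fly to it.
viewer2D.addEventListener(Autodesk.Viewing.SELECTION_CHANGED_EVENT, (event: any) => {
  const dbId = event.dbIdArray?.[0];
  if (dbId === undefined) return;

  viewer2D.model.getProperties(dbId, (props: any) => {
    const target = externalIdTo3dDbId[props.externalId];
    if (target !== undefined) {
      viewer3D.select([target]);
      viewer3D.fitToView([target]);
    }
  });
});
```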
So if any of you guys have ever played a first person player video game, you can navigate this model really, really fast using your keyboard. And it's actually pretty fun. But this interaction is the same thing, you know?
We can create data points, saved points within the 3D view. So you can start that conversation early. I can go in. And I can say, I want to talk about these three design elements or design changes, go in there, save those points.
I can start my virtual view. So on here, I saved the sitting area and the doctor's office. So I went ahead and I pushed the camera. And as you flip back to the other browser, it's pushed that user into that doctor's office.
And, again, the user over here, if he was in a VR mode or he's on a desktop, he has control of what he wants to look at. But when we go back over here and I say, you know, I want to look at the sitting area again-- and I can push that camera-- it jumps his view and the other screen back to the sitting area. All these transaction calls are handled through PubNub.
And, again, this is all through SharePoint. So as long as you have access to our system and to that particular project, you can have access to this. And anyone can start a virtual view through this application.
SAIKAT BHATTACHARYA: So from the integration aspect of this, again, the 2D/3D workflow was essentially enabled by the Revit concept of rooms. We started getting a lot of these design models that, we found, didn't even have many of the rooms defined. So what we did was build, again, some custom Revit plug-ins which would kind of scrub the design models.

They make the model lighter weight: if there is redundant data, they do a purge process to make it lightweight, create rooms if they don't exist, and also add a bit of metadata, which is important for the 2D/3D mapping.

So, for example, which floor is this room on, and a few other pieces of metadata. All of this scrubbing was done with the Revit plug-in. The user, the VDC manager essentially, would be able to click one button in Revit and push the model back into Forge, again through the same services.

Because, again, rolling back to where the TPS conversation started, the entire framework, every workflow, is tied to the unique job number. So whether we are looking at Lens, which was the cost estimation, or the clash coordination workflow, or the 2D/3D workflow, the unique identifier across all of this is the job code that is consistently and uniquely created from TPS upstream. So that common thread goes through every workflow and ties everything together.
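As a small illustration of that common thread, the sketch below shows workflow records that all carry the TPS job number; the record shapes are invented for the example, not JE Dunn's schema.

```typescript
// Every workflow record carries the TPS job number, so Lens estimates, clash
// items, and Field issues can all be joined back together in Dunnpedia.

interface JobScoped {
  jobNumber: string; // created once in TPS, reused everywhere downstream
}

interface LensEstimateLine extends JobScoped {
  costCode: string;
  quantity: number;
  unitCost: number;
}

interface ClashItem extends JobScoped {
  clashId: string;
  status: "New" | "Assigned" | "Resolved" | "Closed";
  assignedTo?: string;
  closedOn?: Date;
}

interface FieldIssue extends JobScoped {
  issueId: string;
  checklist?: string;
  openedOn: Date;
}

// Joining any of these back to the project is just a filter on jobNumber.
export function forJob<T extends JobScoped>(records: T[], jobNumber: string): T[] {
  return records.filter((r) => r.jobNumber === jobNumber);
}
```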
MARK STOCKS: That's a great point, because when we have that consistent job number, we're able to roll this out onto every single job that needs to use this application. So, you know, those 800 active projects that we have right now, if we had a new extension that we wanted to add to this, we would deploy that code out into our farm. And every project that has an active job number would be able to use any of these features that we've built here.
So the next one that we're going to talk about is our field data, which is, essentially, data harvesting through BIM 360. And, again, it's a pretty simple addition to our current infrastructure. You've got the gray cloud, which is the Autodesk layer. And then we've got our enterprise data lake, which is really Dunnpedia behind the scenes.
So we have these console applications that go out to all of the Autodesk APIs. And we extract every bit of data that we can get from Field, from Glue, from any of the services that we use. Now, specifically for Field, we ran into an issue with reporting.
So our QA/QC guy and the safety director, they really wanted to see all of our reports in a very specific manner. And so we have a very specific layer of how we do reporting across everything. So if it's from financial forecasting to trending RFIs to looking at safety data, we start at the region. We go to the office. And in the office, we go to the project exec, and then we go to the job.
Now, that data traditionally is not going to live in our Field dataset. And so we had to get that data into an on-prem environment, so we could run all of our analytics and all of our reporting through Tableau in a consistent reporting style. So the QA/QC director would spend, you know, a couple hours every Monday morning trying to get all this data, put it into a spreadsheet, and publish it out to a SharePoint environment.
So when we wrote the data harvest, it's automatic. There is nothing he has to do ever again. And he's able to actually, within Tableau, dive in and start finding outliers of subcontractors, projects that are having issues. He's able to figure out different predictive analytics based off of this Field data.
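A hedged sketch of that nightly harvester: loop over active jobs, pull Field issues, and upsert them into the data store keyed by job number. The fetchFieldIssues and saveToDunnpedia helpers are placeholders standing in for the Field API calls and the Dunnpedia write, not the actual JE Dunn implementation.

```typescript
// Nightly harvest sketch: pull issues per active job from BIM 360 Field and
// write them into the on-prem data store for Tableau reporting.

interface HarvestedIssue {
  jobNumber: string;
  issueId: string;
  status: string;
  company?: string;
  raw: unknown; // keep the full payload for later analytics
}

// Placeholder for the classic Field REST call; the real integration authenticates
// and pages through results per the Field API documentation of the time.
declare function fetchFieldIssues(fieldProjectId: string): Promise<any[]>;

// Placeholder for the write into Dunnpedia (e.g., a stored procedure or web service).
declare function saveToDunnpedia(table: string, rows: HarvestedIssue[]): Promise<void>;

export async function nightlyHarvest(activeJobs: { jobNumber: string; fieldProjectId: string }[]) {
  for (const job of activeJobs) {
    const issues = await fetchFieldIssues(job.fieldProjectId);
    const rows: HarvestedIssue[] = issues.map((i) => ({
      jobNumber: job.jobNumber, // the TPS job code ties Field data to region/office/PX rollups
      issueId: String(i.id),
      status: String(i.status),
      company: i.company_name,
      raw: i,
    }));
    await saveToDunnpedia("field_issues", rows);
  }
}
```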
SAIKAT BHATTACHARYA: And, again, from the integration standpoint, this is probably one of the simplest things, yet it has so much value addition and return on investment. All we did was create, going back to what Mark had mentioned, a console app, a document integrator that we helped JE Dunn develop. It's hosted on-prem at JE Dunn.

And it runs every midnight. It uses the basic Field API to harvest data like issues, checklists, equipment, and so on and so forth, and puts them into the JE Dunn data store. One of the other things that we have often done in these custom integrations, whether it's the Lens cost estimation, or clash coordination, or the 2D/3D work, or the Field harvest, is that we, as Autodesk Consulting, stay abstracted away from having to know how they operate on their side of the fence.

And what I'm trying to get at is, if we need something that requires their data store, JE Dunn builds a custom web service for us to use. We don't really have to go under the hood and understand how to get that data out. They abstract the layers between how [INAUDIBLE] pushes the model or looks on the development side, so that we don't have to worry about their side, and they have to worry less about our side. It's all interaction through the REST APIs.
MARK STOCKS: So those are the five modules that we've actually completed right now. And those are running on almost every project, except for the data harvest, which is not project specific, because it covers all of our data for Field.
So one of the things that we're actually looking at next, I mean, obviously visualizations are a big thing for us. But what we're looking at right now is looking at schedules, how to take schedule and cost and the supply chain and all of those data points. Because right now, we have a great data set for when the project is started and the project is active in our system.
But what we want to start looking at is how does the schedule impact that. And how can we take that schedule and visualize that in an actual HoloLens or an HTC or even the Google Cardboard? But really identify, when we have schedule impacts, what's going to happen to the model? What's going to happen to cost? What's going to happen to units, time tracking? How does that data flow down to any part of our system?
So that's kind of a huge project that we're looking at right now. But that's kind of where we're thinking about going. So, again, that's our presentation. Thank you guys for coming. I know you guys probably want to go to the keynote pretty quick. But if you've got any questions, let us know.
AUDIENCE: So what has been the cost to implement all this so far?
[LAUGHTER]
MARK STOCKS: Yeah. I don't--
AUDIENCE: [INAUDIBLE]
MARK STOCKS: I'm not going to answer that one.
[LAUGHTER]
AUDIENCE: How do you manage the return? Or how do you guys know that-- OK, I've implemented a few tracks, how do you know what that has returned to the company as far as how--
MARK STOCKS: So we have a great person in our accounting department who actually measures the ROI of any application or any service based off of what the existing process was. So we're able to define that pretty quickly. And we're able to actually create an ROI for all of our applications.

Now, the one, Field View, there was no ROI for that, because it never really existed before. But for clash management, we had to analyze the way they were handling clash management at that time, and then how we implemented it, what changed, and what efficiencies they gained. So right now, we have a six-day turnaround for clash items. Subcontractors manage all their clash objects. And they push them up into Glue.

And so our turnaround and our iteration for clash has been a huge lifesaver for us. And with predictive analytics, we've been able to identify design changes and changes that are going to impact the schedule. So our VDC manager actually found a spike in one of our data sets, called the project, and realized that there was an issue with the bathrooms in a hotel. And so we were able to adjust the schedule and make the design changes better for it. I don't know.
AUDIENCE: For the models that you get from your consultants, how much processing are you doing? Or, I mean, I'm sure that you're getting a vast swath of quality. And you had mentioned always, you know, like [INAUDIBLE] the room. Like, why would we put a room in? But having to do that, do you do that all the time? I mean, how much legwork do you--
MARK STOCKS: How much work ahead of time do we have to work on classifying that model?
AUDIENCE: Yeah, after you do the model.
MARK STOCKS: Yeah. We don't dictate any design requirements or model designs for the architects. That's not something that we care about. Whatever model you give us, we're going to add our layer of classification on top of that model. And so we wrote a plug-in that actually handles this for us.
AUDIENCE: And so the delta, are you just taking the model and then rerunning it through your API?
MARK STOCKS: Yes.
AUDIENCE: And it just--
SAIKAT BHATTACHARYA: Yes. So what we do is, going back, certain elements are actually rebuilt, for example. The rooms are one example in the 2D/3D workflow. But if there are design changes where a wall has moved, we kind of track geometry through the plug-in and say, this is probably the same thing.

And the change management essentially offers three or four options for the end user: which are the elements that are new in the new model? Which are the ones that have been deleted? And which are the ones that have been modified, whether it's a geometry modification or a metadata modification? And the user can essentially just click a few buttons and either re-run the import process or do whatever they want to do with it.
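A hypothetical sketch of that added/deleted/modified classification, diffing two element snapshots keyed by GUID; the snapshot shape is illustrative.

```typescript
// Compare the previous and current element snapshots and bucket them into
// added / deleted / modified, the three options the estimator is shown.

interface ElementSnapshot {
  guid: string;
  geometryHash: string;              // e.g., a hash over geometry extents
  metadata: Record<string, string>;
}

interface ModelDelta {
  added: ElementSnapshot[];
  deleted: ElementSnapshot[];
  modified: ElementSnapshot[];
}

export function diffModels(previous: ElementSnapshot[], current: ElementSnapshot[]): ModelDelta {
  const prevByGuid = new Map(previous.map((e) => [e.guid, e]));
  const currByGuid = new Map(current.map((e) => [e.guid, e]));

  const added = current.filter((e) => !prevByGuid.has(e.guid));
  const deleted = previous.filter((e) => !currByGuid.has(e.guid));
  const modified = current.filter((e) => {
    const old = prevByGuid.get(e.guid);
    return (
      old !== undefined &&
      (old.geometryHash !== e.geometryHash ||
        JSON.stringify(old.metadata) !== JSON.stringify(e.metadata))
    );
  });

  return { added, deleted, modified }; // the estimator reviews each bucket and re-imports as needed
}
```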
AUDIENCE: So are you tracking GUIDs in the elements?
SAIKAT BHATTACHARYA: It's a combination of things: GUIDs, like I said, geometry, and certain extents. Sometimes we've actually gone to creating some partition walls in certain cases if it felt like, hey, this was a larger room, and it's now a smaller room. So there is a little bit of-- so the plug-in development has evolved.
So the first, I think, instance of Debraj and our team working together on this was from 2013. We have definitely learned our lessons. And we are constantly building it.
What we have with JE Dunn is an active engagement on the plug-in, which essentially says, hey, if we have to build enhancements, we manage it pretty much like you'd manage a product within Autodesk: you fix the issues that you've found. And again, Revit is also a very vast data source, where we have seen that a plug-in installed by one of the design teams can actually corrupt the data. And by the time it comes to the VDC team, it's like, why isn't this working?

So we do have some effort that we, as Consulting, have to put in with JE Dunn to assist on these critical ones. But over a period of time, we can make an informed decision: do we really want to build additional capability into the plug-in to address these issues? Or are these edge conditions? Maybe it's a user training issue. So it's kind of an evolution that's still going on. And we do foresee it going on into the future.
AUDIENCE: Are you feeding stuff back into the Revit development team? Like, oh, this is really horrible coming out of the data set.

SAIKAT BHATTACHARYA: Yes. So I think that has really come down. It was significantly high in terms of spikes. Revit definitely has matured a lot more. But, again, with closer team interactions, we are actively building up the workload for the Revit team and other teams.
AUDIENCE: [INAUDIBLE] support on this project, like this whole thing?
SAIKAT BHATTACHARYA: OK. So there are five solutions.
AUDIENCE: [INAUDIBLE]
SAIKAT BHATTACHARYA: Right. So there are five. Give or take, the size of the consulting team has varied from two people to five. That's as much as we have gone through.
DEBRAJ BANERJEE: For five years.
MARK STOCKS: Yeah, the cost estimation platform is all him. All the Dunn Dashboard project work, even though, you know, technically I'm a director, it was all me. I wrote all that.
AUDIENCE: [INAUDIBLE]
MARK STOCKS: Thank you.
AUDIENCE: How do you handle the discrepancies between the 2D [INAUDIBLE]?
MARK STOCKS: It might be a better question for you. How do we handle the discrepancies from the 2D to the models?
SAIKAT BHATTACHARYA: Can you elaborate a little bit more?
AUDIENCE: Like the 2D contract drawings model, like [INAUDIBLE]--
MARK STOCKS: Well, those 2D drawings are actually outputted from the model.
[INTERPOSING VOICES]
AUDIENCE: [INAUDIBLE]
MARK STOCKS: Correct, correct. So that Field View right there, essentially, it's a new product where we're trying to get the field guys used to going from 2D to 3D. We're not expecting them to truly build from that model.

We have a second plans library that's our true contract documents. Really, it's an application that we built almost for fun, because we wanted to see if we could do it, and then to get the field guys excited about using 3D objects. So I don't know if that really answers your question or not, or comes close. Somewhat?
AUDIENCE: I'm assuming that the [INAUDIBLE] office is [INAUDIBLE] classic field--
MARK STOCKS: Yes.
AUDIENCE: --[INAUDIBLE] schedules [INAUDIBLE].
SAIKAT BHATTACHARYA: No. So I think the bottleneck there would be, because Field 2.0 is completely rewritten to be on top of Forge, even from a project perspective there aren't any planned projects. I think the way it will probably work is JE Dunn, like other construction companies, would tend to use BIM 360 Field 1.0, the classic Field, for the lifetime of those projects, which is probably, I don't know, whatever the timeline for it is. But probably at some point, the newer projects will be initiated in the new BIM 360.
So once we have that, we'll probably have to do the harvest from both the data sources into JE Dunn data store. There isn't any project migration from classic to 2.0 that would be--
AUDIENCE: [INAUDIBLE] start testing 2.0 already--
MARK STOCKS: Yup. We've been looking at it already. We have an initiative next year to actually do a really deep dive into it and see what it would take. But he's right. From a data harvest standpoint, we probably won't migrate the content.
AUDIENCE: How do you envision HFDM modifying your technology [INAUDIBLE]?
MARK STOCKS: What?
SAIKAT BHATTACHARYA: So HFDM is the Forge developer experience. I think it's a little premature. I've been going back to our texts over the evening saying, hey, look at this, another shiny object. This is coming. Let's start thinking.

But we really haven't thought through a lot of it, primarily because, in my mind, it's still yet to come up to a level of maturity where we can actually say, OK, this is the timeline. This is what we are looking at. But it does take out a lot of the-- so, for example, a lot of the conversation that we had today was about hosting, right?

We talked about it hosted on-prem, or Azure, or somewhere else. I think we'll probably have to have very focused interactions next year about what it impacts, not only in terms of configurability. For example, the entire Revit plug-in that I talked about, in my mind, we might just be able to write all of that through HFDM. But are we there yet?
Often, we run into conversations where the estimator is actually at the airport. And they really want to get something out. They probably don't have internet connectivity.
So we definitely have to validate all these questions and see where HFDM might have prospects. But, yes, the seed has already been sown. We are thinking that the Revit plug-in will probably go away in the short term or long term with the HFDM capabilities.
AUDIENCE: How do you get value out of the clash data?
MARK STOCKS: The clash data, the value is the time to resolve. So I mean, the VDC manager manages all of the clash items. The data comes in nightly. On Tuesday morning, they have a meeting. And then the iteration process of them resolving the items to make design decisions faster is where we see the ROI on it.
SAIKAT BHATTACHARYA: And like Mark was saying, previously the vice president had to spend two to four hours every week prepping for this meeting. Now it's zero time. So I think that significantly--

MARK STOCKS: I mean, the big win is our predictive analytics on that, too. So we want to see the resolution of items and new items coming in. We don't want those two lines to deviate from each other. We want a pretty narrow gap.
And so all of our analytics go off of that gap. And based off of that gap is how healthy are we from a clash standpoint. And so if we see a divergence, anything that goes up, we've got an issue. And we can quickly dive into that issue.
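Here is a small sketch of that gap metric: cumulative new versus resolved clash items rolled into an open-backlog trend that can be charted and alerted on. Field names are illustrative.

```typescript
// Track cumulative new clash items versus cumulative resolved items over time;
// a growing open backlog is the divergence Mark describes as an issue.

interface ClashWeek {
  weekStart: string;   // e.g., "2017-10-02"
  newItems: number;
  resolvedItems: number;
}

export function openBacklogTrend(weeks: ClashWeek[]): { weekStart: string; openBacklog: number }[] {
  let open = 0;
  return weeks.map((w) => {
    open += w.newItems - w.resolvedItems; // divergence here means clash health is slipping
    return { weekStart: w.weekStart, openBacklog: open };
  });
}

// A sudden jump in openBacklog is the kind of spike the VDC manager spotted
// before calling the project about the hotel bathrooms.
```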
SAIKAT BHATTACHARYA: And, again, going back to the issue, the other thing is, I mean, this is where it's a mutual win-win situation. They have access to the data transparently across different levels and stakeholders. If something is wrong, it could be a project issue. It could be the product issue.
It could be something that's hindering-- probably, it's a training requirement. But because you have access to the data and the analytics, it could be somewhere, another engagement, where consulting might be able to focus on a specific team and saying you're probably not using Field the right way. We probably need to tell you about setting up templates and doing X, Y, and Z from an implementation standpoint. I think that's where really the partnership becomes a win-win from both sides of the fence.
AUDIENCE: What were some of the key aspects during the communication phase that got this project off the ground? And how did you sell it to JE Dunn to spend all this money to make this platform before you had the internal accountant, you know, crunching ROI numbers?
MARK STOCKS: You apparently haven't met my CIO. John Jacobs is pretty-- he has a vision. And I share his vision.
And it was pretty easy for us to sell all the technology to our CEO and to the board. So we had an idea. I mean, we had an idea seven years ago about creating a Dunn Dashboard that was going to be a centralized location for all of our data.
When we created it, as opposed to what we have today, it's night and day. But the company is essentially behind us 100%. And they are willing for us to spend money to do R&D to play around with these tools to, essentially, find a product that's really going to have an investment or an impact on the company.
SAIKAT BHATTACHARYA: And building on that as well, I mean, going back to the whole cost estimation story, even JE Dunn didn't really understand all the challenges that came up over that period of time, understandably so. Because once you build something together and push it into production, you start looking at the project teams and figuring out, oh, there's another challenge. So that's where, again, the partnership steps in.

But this is the kind of relationship where you offer value by solving the problem at every step. And then, of course, there's never an end, because there's always a new set of challenges that you'll have to address. But, again, going back to what Mark said, it's easy for us, because John is the one who is the visionary here. And we are driving towards that same vision.
MARK STOCKS: Awesome. Any other questions? Yeah?
AUDIENCE: [INAUDIBLE] module [INAUDIBLE] you actually pull in that information for tracking purposes?
SAIKAT BHATTACHARYA: Not the daily reports in Field, if that's what you're getting at. No, the Field API doesn't have that ability. It does have the ability to work with the daily reports, but there are permissions involved. Or maybe that's something that was enhanced recently. But I think from the JE Dunn integration, we are looking at equipment, issues, checklists.
DEBRAJ BANERJEE: Yeah. For Field and BIM 360 Glue, there are a lot more things happening behind the scenes. So we have traceability for every line item, for every project, for every clash item, every issue [INAUDIBLE]. What's happening? Is it a network error? Is it a project error? Is it a data error?

So if there is something going wrong, I just send him an email with the error message that comes up. And, you know, oh, OK, there's bad data, or something went wrong.
MARK STOCKS: So CMiC has a daily reports module as well. And we've actually integrated that into Dunn Dashboard with GPS for local weather. We have a mobile app for it already.
So we use that already for our daily reports. And we have other integrations that push data automatically into the daily reports that are behind the scenes. Awesome, any other questions? Cool. Thanks, guys. Appreciate it.
DEBRAJ BANERJEE: Thank you so much.
SAIKAT BHATTACHARYA: Thank you.
[APPLAUSE]