Description
Key Learnings
- Learn how we used a web app to engage MEP engineers in computational tools early in the design process.
- Learn how the Autodesk Platform Services platform was used to connect to a web app and produce intelligent models.
- Discover how we achieve more-sustainable designs by understanding the carbon impact of MEP design at concept stage.
Speaker
- William Hamilton: Hi everyone, I'm Will Hamilton, a Computational Designer/Developer from Sydney, currently working at Mott MacDonald. I have extensive experience working on major infrastructure projects across various sectors, including metro & rail, tunnels, stadia, aviation, commercial and healthcare, in both architectural and engineering practices. Within these projects, I thrive on finding ways to automate processes and help our engineers design more efficiently. I have a particular interest in Dynamo, C#, and web app development. I'm excited to share what my team and I have learned in developing the MEP configurator, especially the power of APS and the flexibility a web app provides to users!
WILLIAM HAMILTON: Hi, everyone. Welcome to our session today, where I'll be presenting our case study, Transforming MEP Design: Bringing Plant Room Configuration to the Web with APS.
Let me start by introducing myself. My name is William Hamilton. I work for a company called Mott MacDonald, where I'm a digital developer. And I'm based in Sydney, Australia.
I have previous experience working in both architectural and engineering consultancies, and I'd often consider myself a computational designer, working on lots of different automation, optimisation, and similar tasks to help engineers and architects work and design better. If you want to reach out to me, you've got my LinkedIn up on the screen there, and I'm happy to have a chat if you'd like to call.
Let me start this presentation by saying a quick thanks. So thank you, everybody, who's watching this. I appreciate your time, and I hope you get something out of it. Also, big thanks to Autodesk for hosting me and for having this conference. I'm very excited to be a part of it. And I hope this is a good contribution to it.
And, finally, I'd also like to thank all the team that helped put this together. So there are lots of different people working within Mott MacDonald to help us achieve this solution. And we wouldn't have been able to do it without the team effort.
So who is this presentation for? I think there's a little bit in this presentation for everyone. Firstly, if you are a software engineer or a computational designer, I think in this session, you'll be able to learn about integrating APS into a web app and its associated infrastructure. So we're going to talk about the technical details of how we built up a solution.
If you are an engineer or a designer, hopefully, this session helps you learn how computational tools can empower your workflow, so you can save time and design better with these kinds of solutions. And, also, if you consider yourself a leader or an innovator, hopefully, you can learn about how we have implemented a solution in response to a challenge within our engineering business and how you might be able to do something similar.
I'll quickly go through the learning objectives, which hopefully you can read online when you look at my session. Firstly, in the session, we're going to learn how we used a web app to engage MEP engineers in computational tools early in the design process. Secondly, we're going to learn how Autodesk Platform Services was used to connect to a web app and produce intelligent models.
And, thirdly, we're going to look at how we can achieve more sustainable designs by understanding the carbon impact of MEP design at concept stage. The way I'm going to structure this presentation is, first of all, we're going to talk about the challenges, so the things that brought up this development and why we even got here in the first place. Then we're going to take a deep dive into the solution, what we've built and the technical details around it. And then, finally, we'll touch on the outcome, what this has meant for us. So let's get stuck into it.
Let's start by talking about the challenge. So I think it's important to set the scene for why we've done this development. So this is in the context of MEP engineers and their workflow for designing plant rooms.
So, currently, a traditional or conventional workflow for an engineer might look something like this. They'll receive the project specifications for how they need to design the plant rooms in a building, the things that are required to go into it. Once they have this, they're going to run calculations to size various different pieces of equipment that need to go into the various rooms inside the building.
After this, they then need to go and check these sizing requirements against product catalogs of things that we can actually buy and procure to put into the building because just what we've done in the spreadsheet is not enough. It actually needs to match up to a real thing that a supplier is going to be able to produce.
Once that's done, an engineer typically is going to create a 2D mock-up of what the design will look like. There are various different ways of doing this, but it's going to be sketched up in some form. And, simultaneously, but probably not connected to it, there's going to be a schedule. So there's going to be a list of the different pieces of equipment and kit that need to go into the plant room.
So this is typically how a flow might go. But if a design change comes up, tough luck, back to the beginning. And this process is going to repeat. And that's the reality of concept design. Especially when an architect is in that phase of rapidly making changes and improvements to the design, the engineers need to respond quickly. And this process iterates over on itself quite frequently.
So why is this a problem? There's a few reasons why I think this is a big issue. Firstly, in this workflow, there's a lack of 3D spatial understanding. So at no point in this early concept design are the engineers getting into 3D and understanding how that impacts buildings and other clearances and things like that.
Secondly, it's very time consuming and repetitive. As I was just saying, it's a process that potentially has to be done over and over again. And there's nothing you can do about it. If something changes, you go back to the start and re-adapt your design.
Next is the issue that there's no metadata being passed along. And there's no linking between files. So if one file changes, like I was saying, maybe you have got drawings in one side, scheduling on another side, in this early concept phase, it's very hard to get these linked up to each other.
And, finally, there's no consideration for embodied carbon upfront, which is something that's pretty pressing and important. And I'm going to touch a little bit more on this one later in the presentation. An interesting thing to note and talk about here is that a Revit-based solution would overcome a lot of these things. You've got a 3D environment. You've got metadata. You've got things linking between the schedules.
But a pretty big barrier to that is that it firstly requires you to even have Revit installed on your computer, which a lot of our engineers might not have. Then they need to be proficient at using it so they can actually model and get things that are useful. I've also noticed a bit of a hesitancy to enter Revit too early in the design phase, because our engineers don't want to get bogged down in BIM modeling when they're still rapidly making changes.
So this leads us to our next step. It feels like there's a bit of a gap in the market. How do we overcome this problem? This conventional design process obviously has some flaws in it, but there don't seem to be any tools on the market or any existing workflows that allow us to overcome them. So this problem calls for a solution. Let me talk you through how we got to developing our MEP configurator.
So over the past year or so, we've tested a couple of different iterations of configurator tools, and we've tried building them as web apps as a place to have them hosted. But a big challenge we had with this approach is that it's very hard to get from the web to a useful 3D model or a BIM model. Within our tests, we could often export simple objects like an OBJ file, but then you don't have the right metadata, and it can't pass into the workflows of other users who are in Revit and so on.
Next in the timeline of things that have happened, at the end of 2023, we had a session run at Mott MacDonald by the Autodesk team. And they ran a training with us to help us upskill and learn about using APS, Autodesk Platform Services.
And this was actually a very useful session. It went over a few days, actually a week, and I'd like to give a quick shout-out to the team who ran it: Riley Peterson, Keith White, and Todd Smith. The three of them provided some really good insights into how we could leverage APS, and also hands-on experience of how to get stuck in and start using it.
So following this, we sort of started putting the pieces together. And with our newfound knowledge of how to use APS, we had this idea, OK, we can connect these two things together. We've been testing using a web app for designing these mechanical plant rooms, and we've got this APS tool, which allows us to remotely connect to Revit. What if we put these together? And that takes us to our MEP plant room configurator.
So what is it? This tool that we've built is a web-based tool for engineers to design and configure plant rooms according to specifications and standards. Then we have leveraged Autodesk Platform Services to rapidly push data-rich models to Revit at a concept design stage, which seriously elevates the engineer's ability to design quickly and produce significantly higher quality outputs. So those are the main two things to keep in your mind about what's happening in this flow. We've got a web app. And, eventually, we end up in a Revit model.
Let me just share this video with you now. I think the best way we can handle this is if I give a demo. So I've recorded a video of what this tool looks like. I'm just going to talk through it as we go. So you can understand what's happening in this process.
So on the screen here, we have what the app looks like. The engineer or the user comes in. They're in the browser. And they've got basically a home page where they can configure their list of plant rooms that belong to a project.
The user can then say they'd like to add a plant room. They click on it. They decide what discipline it is and what room type they're going to make. In the example right now, we're making an air handling unit room. You assign some basic information to it. And as soon as you start saying how many pieces of equipment you want in it, you get an instant response, a 3D visualization.
Then in your fields, you've got the ability to control things, your input parameters like the capacity, and the units instantly respond and size themselves accordingly. They also place themselves understanding the clearances and tolerances they need to respect with the other units in the room. Let's look at a second example, the generator room, which is a little bit more complex and a bit more exciting to look at.
So as you can see, as soon as you ask for the number of units you want, it starts sizing itself. And this room's got additional pieces that go in it. So it's got the intake and exhaust attenuators and the day tanks and all the other pieces of kit that go with it. And you can do things like add plenums so that the walls size accordingly.
What we've also added is that, after we've done that, we can come to the second half of the app, where we can also configure risers. Risers are the vertical elements of the building, which allow the services to run up and down. So, similarly, the user can come in. They can input some basic information. Here they're going to define the shaft, which holds potentially multiple service risers. And then they're going to assign the particular service risers that go inside, so they can put in things like cable trays, ducts, and pipes.
So right now, I'm just going through and putting in some information that tells the risers where they need to go, which floors they're on, and where their takeoffs are. Then on the next page, the user fills in some more information. This page has got a bit more on it, because we need to supply information about what's going on at each takeoff level.
Once that's done, again, you've got an instant response in 2D and in 3D. So the user can get immediate design feedback based on the input values they've got and if they're going to fit in clearance zones and so on.
Then once you've done that, the user comes back to their home page. And they hit Export Your Project. And this is where the APS part happens. You're going to hit Export to Revit. The app runs for a little bit, normally about one to two minutes, but I've cut it short.
And after running for a few seconds in this video, we'll get a little pop up that says your Revit file is here, have a download. There we go. And then from the browser, we've been able to size, design, create our plant rooms and then get a Revit model output. So I'll hop on to the next slide. Let's have a look at what that Revit model looks like.
So here we are. We're in Revit. I've just clicked open up that link that we saw. And you can see in Revit we have those plant rooms.
Their models have come in. They've got their walls, their families. They've got their levels; they're placed on the correct levels.
Here's that generator plant room. You can see all the clearances are in there. And then, finally, we also have the riser. It's been pulled in. Those are some cable tray and duct families.
So, yeah, there is an overview of what the tool does, a little bit of a demonstration. And now what I'm going to do is step through in a bit more detail how we actually built that up, so hopefully you can learn some more of the technical details.
Let me show our tech stack. So this is a diagram that shows all the different components that come together to build up this app. And I realize it's probably a little bit overwhelming, as these kinds of diagrams always are.
There's a lot of information to take in, but don't worry, I'm going to bring this up many more times in the future. And we're going to use this to take little deep dives into individual components of the tool. So this diagram will be coming back, don't worry.
Let's break it down. So to start with, let's focus on the input, or the user experience, of the tool. Our front end is built with React. For those colorful 3D visualizations you saw, we're using three.js, which was essential for giving us an instant response in the browser, visualizing the changes happening from our inputs.
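To give a rough sense of how that instant visualization works, here's a minimal three.js sketch, not our production code: a piece of equipment is just a box whose dimensions come straight from the user's inputs, redrawn whenever those inputs change. The interface and values here are illustrative.

```typescript
import * as THREE from "three";

// Hypothetical equipment record coming from the input form (dimensions in metres).
interface EquipmentSize {
  width: number;
  length: number;
  height: number;
}

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(8, 8, 8);
camera.lookAt(0, 0, 0);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Draw (or redraw) a unit whenever the user edits its inputs.
function drawUnit(size: EquipmentSize): THREE.Mesh {
  const geometry = new THREE.BoxGeometry(size.width, size.height, size.length);
  const material = new THREE.MeshStandardMaterial({ color: 0x4caf50 });
  const mesh = new THREE.Mesh(geometry, material);
  mesh.position.y = size.height / 2; // sit the unit on the floor plane
  scene.add(mesh);
  return mesh;
}

scene.add(new THREE.AmbientLight(0xffffff, 0.8));
drawUnit({ width: 2.0, length: 3.5, height: 2.2 });
renderer.render(scene, camera);
```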
Another part of our user experience is that there's a clear flow. So you define your project. You calibrate your plant rooms. You calibrate your risers, and then you export.
But it's very simple to also reload, come back into a session, delete some things, edit some things. Any time you want, you hit Export. You can get another model. Export again, export again, and off you go.
Another nice benefit of using React as our front end is that it allows us to dynamically create fields, inputs, and information; you can see that in this screenshot, in the part where we're inputting our fields. So when particular units come in with requirements that are specific to them, for example the air handling unit and its coil velocity property, we can dynamically render different fields for the user to interact with.
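As a simplified sketch of that dynamic rendering (the component, field names, and units below are hypothetical, not our actual schema), the form can be driven by a per-unit-type field definition, so an air handling unit exposes a coil velocity input while a generator doesn't:

```tsx
import React from "react";

// Hypothetical field definition for each equipment type.
interface FieldDef {
  key: string;
  label: string;
  unit: string;
}

const fieldsByUnitType: Record<string, FieldDef[]> = {
  airHandlingUnit: [
    { key: "capacity", label: "Capacity", unit: "L/s" },
    { key: "coilVelocity", label: "Coil velocity", unit: "m/s" },
  ],
  generator: [
    { key: "capacity", label: "Capacity", unit: "kVA" },
  ],
};

// Renders only the inputs that apply to the selected unit type.
export function UnitInputs(props: {
  unitType: string;
  values: Record<string, number>;
  onChange: (key: string, value: number) => void;
}) {
  const fields = fieldsByUnitType[props.unitType] ?? [];
  return (
    <div>
      {fields.map((f) => (
        <label key={f.key}>
          {f.label} ({f.unit})
          <input
            type="number"
            value={props.values[f.key] ?? 0}
            onChange={(e) => props.onChange(f.key, Number(e.target.value))}
          />
        </label>
      ))}
    </div>
  );
}
```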
Now, let's have a look at our back end and the main logic that's happening inside the app. So to start with, let's talk about the business logic or the stacking logic. So this was done with TypeScript. And this sort of belongs under React.
And what this involves is quite a lot of components. Within a given plant room type, there will be rules about what type of equipment goes in it and how many pieces are allowed to go in it.
After that's solved, there are also specific calculations for sizing the equipment pieces based on their input requirements like that example I gave before, so the coil velocity. There's other things like the capacity, which side the connectors are going to be on, if it's a horizontal or vertical unit, and so on. So a lot of different factors change how these things get sized.
Once we've sized it based on the input that the engineer has provided, we also get the associated clearances that go with it because different units, based on their size and their specification, have got different clearance spaces on different sides of them and different sizes. And there's even different types of clearances because there are some clearances that are allowed to overlap with other ones, and there are some that are just big no-go zones. You can't put anything else in them.
So then after we've sized these, the configurator works out how to assemble the different pieces you've asked for. In the example on the screen, the air handling unit is very simple: its logic is just that each unit should alternate, flipping which way it's facing, with an alignment to the center of the largest one. But obviously, with other ones like the generator room we looked at, there's more complex logic.
There's also logic for additional things. For example, that yellow box you can see at the top is a service clearance zone, and that is also formula-driven, based on the size of the biggest unit.
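To make that stacking idea concrete, here's a simplified sketch of the air handling unit rule described above: units alternate which way they face, align to the centre of the largest unit, and keep a clearance gap between each other. The field names and numbers are illustrative rather than our actual rules.

```typescript
// Hypothetical sized unit produced by the sizing calculations (metres).
interface SizedUnit {
  name: string;
  width: number;     // across the room
  length: number;    // along the stacking direction
  clearance: number; // required gap to the next unit
}

interface PlacedUnit extends SizedUnit {
  x: number;        // position along the stacking direction
  y: number;        // offset across the room, aligned to the largest unit's centre
  flipped: boolean; // alternate which way the unit faces
}

// Lay units out in a row: alternate facing, centre-align against the widest unit.
function stackAirHandlingUnits(units: SizedUnit[]): PlacedUnit[] {
  const maxWidth = Math.max(...units.map((u) => u.width));
  const placed: PlacedUnit[] = [];
  let cursor = 0;

  units.forEach((unit, i) => {
    placed.push({
      ...unit,
      x: cursor,
      y: (maxWidth - unit.width) / 2, // centre-align against the largest unit
      flipped: i % 2 === 1,           // every second unit faces the other way
    });
    cursor += unit.length + unit.clearance;
  });

  return placed;
}

// Example: two AHUs of different sizes.
console.log(
  stackAirHandlingUnits([
    { name: "AHU-01", width: 2.2, length: 4.0, clearance: 1.0 },
    { name: "AHU-02", width: 1.8, length: 3.2, clearance: 1.0 },
  ])
);
```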
In addition to our stacking logic, there's also the logic that goes into the risers, which is driven by the floor-to-floor heights of your building, the nominated structural zone, the nominated clearance zone, and the capacities you ask your different ducts and pipes to have. They then get sized accordingly, and we check that they're going to fit into the services ceiling zone.
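And as a very rough illustration of that riser check (this formula is a simplification for the sake of example, not our actual sizing logic), the available services zone can be derived from the floor-to-floor height minus the nominated structural and clearance zones, and compared against the sized duct depth at a takeoff:

```typescript
// Illustrative inputs, all in millimetres.
interface TakeoffCheckInput {
  floorToFloorHeight: number; // e.g. 4000
  structuralZone: number;     // nominated structural depth, e.g. 800
  clearanceZone: number;      // nominated clearance depth, e.g. 300
  ductDepth: number;          // depth of the duct sized from the requested capacity
}

// Returns true if the duct fits within the remaining services ceiling zone.
function takeoffFits(input: TakeoffCheckInput): boolean {
  const servicesZone =
    input.floorToFloorHeight - input.structuralZone - input.clearanceZone;
  return input.ductDepth <= servicesZone;
}

console.log(
  takeoffFits({ floorToFloorHeight: 4000, structuralZone: 800, clearanceZone: 300, ductDepth: 600 })
); // true: 600 <= 2900
```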
As a part of getting this stacking logic and business logic ready, a big part of the work was actually codifying what was in our engineers' heads and turning it into machine-readable code. Obviously, engineers know the different plant room types inside out, how they should be arranged, the things that go in them, but it's actually quite a difficult exercise to turn that into code that the software can understand to configure the plant rooms. Despite that complexity, there's actually a lot of commonality in plant rooms, and this is why we found this to be a very fruitful exercise and a useful tool. Even across regions and across different disciplines, there's a lot of commonality in how you configure a plant room, because in the end, even if the different pieces come from different suppliers in different parts of the world, the rules about how they should be arranged have a lot of similarities.
But in this exercise of communicating with our engineers, we needed to come up with a common language for capturing how things should be configured. So we ended up with lots of diagrams and drawings like this, where we set up standard schemas and drawings with different variables. From a coding perspective, we've got our variables, and these align to different parameters from a supplier's catalog, the rules for which way things need to face, how big their clearances are, and so on. And so we worked back and forth with the engineers to understand how they need their plant rooms to be configured and what the rules are that are baked into those.
And then as a final part of this little back end step, there's the actual back end of the code, which is .NET and written in C#. And this is where we process the data and make our specific APS API calls, which we'll talk about in a little bit. And we can also make other API calls from the back end.
As an example, one other thing that we do in the tool is a metrics tracker that understands who's using it, where they're using it, and how they're using it, and that gets sent off as a call from the back end every time this happens. The back end has also got a series of HTTP triggers that the front end can call to request a particular action. So when the front end wants something to happen, it sends a call to the back end, which then goes and executes the appropriate code.
The back end is also responsible for transforming the equipment model data. Because our data can go into lots of different formats, we had to devise a standard schema for our object data, and the back end transforms it into different formats depending on where it's going. For example, one transformation takes it into three.js, the visualizer, but we don't want to recreate that same information; instead, we pass it on to what will eventually become the Revit data, and the back end handles that transformation too.
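Here's a rough sketch of that idea. The schema and parameter names are invented for illustration, and the real transformation lives in our C# back end, but it shows how one standard equipment record can feed both the three.js visualizer and the data that will eventually become the Revit parameters:

```typescript
// Hypothetical standard schema for a configured piece of equipment.
interface EquipmentRecord {
  id: string;
  familyName: string;
  width: number;
  length: number;
  height: number;
  position: { x: number; y: number; z: number };
  rotationDegrees: number;
  level: string;
}

// Shape consumed by the three.js visualizer.
interface ViewerBox {
  id: string;
  size: [number, number, number];
  position: [number, number, number];
  rotationDegrees: number;
}

// Shape that will eventually become Revit family parameters.
interface RevitPlacement {
  familyName: string;
  level: string;
  parameters: Record<string, number | string>;
}

function toViewerBox(e: EquipmentRecord): ViewerBox {
  return {
    id: e.id,
    size: [e.width, e.height, e.length],
    position: [e.position.x, e.position.y, e.position.z],
    rotationDegrees: e.rotationDegrees,
  };
}

function toRevitPlacement(e: EquipmentRecord): RevitPlacement {
  return {
    familyName: e.familyName,
    level: e.level,
    parameters: {
      Width: e.width,
      Length: e.length,
      Height: e.height,
      X: e.position.x,
      Y: e.position.y,
      Z: e.position.z,
      Rotation: e.rotationDegrees,
    },
  };
}
```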
All right, let's go on to the next step, which is a thing called Moata Workspaces, an internal solution for hosting our web app. I'll just touch briefly on this one. The main thing that I would like to address here is the fact that this tool is a web app, and this is, in my opinion, probably the coolest and most useful part about it, because it means that the design is in the browser and you don't need any modeling software. The engineers can come in, log into the app, and start using it.
A little bit more of a deep dive into the technical part of that: we've got a Dockerfile that builds the web app. The code's stored on GitHub, and we have a GitHub Action that triggers each time code gets pushed. So as developers, if we need to implement some new features, we push the code to GitHub, the action triggers, and the new build goes live once we've approved it. And because we've used that internal Workspaces solution, we've also got OAuth, which means you have to be an internal Mott MacDonald user, so your data is protected and it's secure: only our users can log into it.
All right, next step, we have the Azure storage. In this section, I'm going to touch briefly on the database. We used Azure to store our information, and there are two parts to this. The first is the equipment catalogs. In Azure, we have Azure tables, which store the information about our equipment, with standard schemas for the information each product has to have. So, for example, at a basic level, it has to have width, length, and height, and it has to have information about its clearances on its various different sides.
But it does allow us to have products that come from different suppliers, that belong to different regions, and so on. And so we can build up those tables and populate them with the information in its appropriate location. The other thing about having these tables is that they sit separately from the code, and that means that engineers can come in and edit and access them.
And this is great because, if a supplier updates their product-- maybe something is no longer available, maybe some specs have changed, dimensions have changed-- the engineer can come along and make an update in the table, and we don't have to rebuild and push the code and potentially break something. So that sits in a separate environment.
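For illustration, here's roughly what querying one of those catalog tables looks like. Our actual back end is .NET/C#, so this TypeScript sketch with the @azure/data-tables client is just an equivalent example, and the table and column names are hypothetical:

```typescript
import { TableClient, AzureNamedKeyCredential, odata } from "@azure/data-tables";

// Hypothetical equipment catalog entity; width/length/height and clearances
// follow the standard schema every product row must provide.
interface CatalogEntity {
  partitionKey: string; // e.g. region
  rowKey: string;       // e.g. product code
  width: number;
  length: number;
  height: number;
  clearanceFront: number;
  clearanceRear: number;
  clearanceSides: number;
}

async function loadCatalog(region: string): Promise<CatalogEntity[]> {
  const credential = new AzureNamedKeyCredential(
    process.env.STORAGE_ACCOUNT!,
    process.env.STORAGE_KEY!
  );
  const client = new TableClient(
    `https://${process.env.STORAGE_ACCOUNT}.table.core.windows.net`,
    "AirHandlingUnits", // hypothetical table name
    credential
  );

  const results: CatalogEntity[] = [];
  // Pull only the products that belong to the requested region.
  for await (const entity of client.listEntities<CatalogEntity>({
    queryOptions: { filter: odata`PartitionKey eq ${region}` },
  })) {
    results.push(entity);
  }
  return results;
}
```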
The second part of the database component of this project is that we use an Azure SQL database to save and reload sessions. So a designer works on the project and configures their plant rooms; when they're done, they hit Save, and it writes to the database. When they come back to the home page, they can reload an old session and keep configuring it. In terms of actually getting the data in, this is, again, something that the back end does. When you first boot up the app, the back end goes and pulls the information about the saved sessions, along with the appropriate information from the tables, so that it's all there for the app to access.
All right, next, and probably one that most people are excited about. Let's talk about Autodesk Platform Services. This is a big component of our app.
So let's talk about what's happening here. The main component of our Autodesk Platform Services integration in this app is the Design Automation API. And before I get stuck into too much detail about this, I would suggest to anyone who's viewing this: if you want to do something similar and use the Design Automation API, go use the tutorials and the docs that Autodesk have online. They're super useful, and a lot of the boilerplate on there is the foundation for the code that we're using. It's a great place to get started.
So for those unfamiliar with the Design Automation API, what it does is allow you to run Revit plugins completely in the cloud. Imagine any function you'd have in your Revit model-- you know, when you've got the bar up the top, and you click a button where someone's developed a special little function that does something-- we can do that without even opening Revit. We don't have Revit installed on our computer. We don't have to click any buttons. Design Automation does the magic behind the scenes.
So the process for making that happen is that we create that plugin, similar to how you'd create a button at the top in a tab, but it gets uploaded to Design Automation as an app bundle with an activity. And then, if you think back to the video of how the user interacts with the tool, whenever you click I'd like to download my Revit file please, the front end sends a trigger to the back end and says, we need to go and call the Design Automation API. Go make me a Revit file.
When that happens, because the front end currently holds all the information, it needs to pass the information about the plant rooms and the equipment in them to the back end. We do that by making a JSON payload, which has got the information about the different parameters in each object. That payload gets sent over, and it begins the design automation script.
So I think I already touched on this: the back end takes in that JSON payload and parses it into a format that the Revit families can read. Anything that comes in from that JSON-- the different lines in our JSON object-- will correspond to parameters in our Revit families. So, again, we've got those basic ones like width, length, and height, plus coordinates and sizing of different things.
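To illustrate the shape of that payload (a sketch only, with hypothetical property names rather than our actual schema), each configured element carries the family it maps to plus the values that will be written into that family's parameters:

```typescript
// Hypothetical JSON payload sent from the front end when an export is requested.
interface ExportPayload {
  projectName: string;
  plantRooms: {
    roomType: string;          // e.g. "GeneratorRoom"
    level: string;             // which level to place the room on
    walls: { start: [number, number]; end: [number, number]; height: number }[];
    equipment: {
      familyName: string;      // Revit family to instantiate
      position: [number, number, number];
      rotationDegrees: number;
      parameters: Record<string, number | string>; // e.g. Width, Length, Height
    }[];
  }[];
  risers: {
    shaftName: string;
    services: { type: "duct" | "pipe" | "cableTray"; size: number; levels: string[] }[];
  }[];
}
```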
Once that data's there in our back end and it's been configured, we post a work item, which gets sent off into the Design Automation. And then what it does is, in the background, it does the magic. So in terms of what actually happens in our plugin, it boots up a blank Revit file using one of our internal templates for our Revit standard, and then it just goes through, using the information it has, and places the objects in. Puts the walls in the right spot. Puts the families in the right spot. Spins them around, stretches their parameters, and so on.
Then we need to get our file back. For this part, we use the Data Management API, and we create a thing called a bucket, which is somewhere we can store a file. While the automation is still running in the background, building our Revit model, it's also going to report the status of where it's up to.
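For anyone curious what those calls look like, here's a hedged sketch of the two APS requests involved: creating an OSS bucket to receive the output, and posting a Design Automation work item. Our real implementation sits in the C#/.NET back end; this TypeScript version just hits the public REST endpoints, and the activity ID and argument names are placeholders that depend on how the activity is defined.

```typescript
const APS_BASE = "https://developer.api.autodesk.com";

// Create a bucket to hold the generated Revit file (bucket keys must be globally unique).
async function createBucket(token: string, bucketKey: string): Promise<void> {
  const res = await fetch(`${APS_BASE}/oss/v2/buckets`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ bucketKey, policyKey: "transient" }),
  });
  // 409 means the bucket already exists, which is fine for repeated exports.
  if (!res.ok && res.status !== 409) throw new Error(`Bucket creation failed: ${res.status}`);
}

// Post a work item that runs our Revit plugin in the cloud.
// "MyNickname.PlantRoomActivity+prod" and the argument names are hypothetical.
async function postWorkItem(token: string, inputJsonUrl: string, outputUrl: string): Promise<string> {
  const res = await fetch(`${APS_BASE}/da/us-east/v3/workitems`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      activityId: "MyNickname.PlantRoomActivity+prod",
      arguments: {
        inputJson: { url: inputJsonUrl },        // the configuration payload
        result: { verb: "put", url: outputUrl }, // where the generated .rvt gets uploaded
      },
    }),
  });
  const body = await res.json();
  return body.id; // work item id, used later to poll its status
}
```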
And this is actually quite important, because we ran into an issue when we first developed our connection from the web app to APS, where our front end was timing out. Like I said, it's about one to two minutes for this automation to finish for a decently sized Revit file, but our app was timing out after 30 seconds when it just waited directly for a response.
So instead, we utilize the fact that the automation also reports a status of whether it's completed or not. After you've clicked I'd like to download, our front end just waits. And then every ten seconds, asynchronously, it pings the back end and says, Hello, are you ready to download yet? No.
Tries again in ten seconds. Hello, are you ready yet? And it will keep doing that until it gets an all clear from the back end.
And then once that's happened, once it's got all clear, the automation pushes the Revit file into the bucket. So it's ready for download. And then the user on the front end gets that little pop-up tab at the top with a link ready to download.
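Here's a minimal sketch of that polling loop from the front end's point of view. The /api/export-status route is a hypothetical name for one of our back-end HTTP triggers, which in turn checks the work item status with Design Automation:

```typescript
// Poll the back end every ten seconds until the Revit file is ready,
// then hand back the download URL for the pop-up link.
async function waitForRevitFile(workItemId: string): Promise<string> {
  while (true) {
    const res = await fetch(`/api/export-status?workItemId=${workItemId}`); // hypothetical back-end route
    const status = await res.json(); // e.g. { state: "pending" | "success" | "failed", downloadUrl?: string }

    if (status.state === "success") return status.downloadUrl;
    if (status.state === "failed") throw new Error("Design Automation work item failed");

    await new Promise((resolve) => setTimeout(resolve, 10_000)); // "Hello, are you ready yet?" ... not yet
  }
}
```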
Finally, I want to talk about one other thing that we've tested utilizing APS for, which was quite a fun and interesting experiment: using the Design Automation API to programmatically generate drawings. One of the things is that we often have to issue drawings for the design early on, and we figured, OK, why don't we use Revit's built-in capabilities-- sheets and drawings-- to create them from our app?
And so this automation works very similarly to how our main one works: it boots up a Revit file in the cloud and puts all the pieces in, but additionally it creates views and sheets using our templates, and then it prints them to PDF. It zips them up in a file and, very similarly to how our main Revit file worked, puts them in a bucket and serves them to the user. So if they need a PDF download, again, you don't even have to have Revit on your computer for this to work. You can ask it to go open up Revit in the cloud, make PDFs, and bring them back, which is pretty cool.
All right, that concludes the little APS section. We've only got a couple more things to look at now. Let's briefly touch on another thing we use in this, which is Autodesk Construction Cloud.
So this is where our families live. As part of that automation, we need to actually know which families to put in. So we've got a dedicated ACC hub, which is where those families live, along with the template file for Revit and for those drawings. All of those things live in our ACC hub, and we use the Data Management API again-- the same one we were using for the buckets-- to access those files.
So when you run the automation, the script goes and gets the files from ACC, brings them into the Revit model, and instantiates them. And the great thing about this solution is that it means we can easily deploy updates, similarly to what I was saying before about the tables being one step removed from the main code. If we want to implement improvements to these families, we just upload new ones into ACC. Or if a different region's got some specific requirements, they upload their own.
Or a big use for it is that as the LOD progresses, as the detail in our model gets more developed and the design goes along, we can simply substitute these families for better ones. As long as they're up on ACC, our app can go and find them; they keep the same parameters, the app brings them in, and we'll have a richer, more impressive, more detailed model.
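As a hedged sketch of the Data Management call that locates those families (the project and folder IDs are placeholders, and actually downloading a file takes a couple more calls that I won't show here), listing a folder's contents in the ACC hub looks roughly like this:

```typescript
const APS_BASE = "https://developer.api.autodesk.com";

// List the Revit family files sitting in a given ACC folder.
// projectId and folderId are placeholders for our dedicated hub's IDs.
async function listFamilyItems(token: string, projectId: string, folderId: string): Promise<string[]> {
  const res = await fetch(
    `${APS_BASE}/data/v1/projects/${projectId}/folders/${folderId}/contents`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  const body = await res.json();

  // Each item carries a displayName attribute; keep just the .rfa families.
  return body.data
    .filter((item: any) => item.type === "items")
    .map((item: any) => item.attributes.displayName)
    .filter((name: string) => name.toLowerCase().endsWith(".rfa"));
}
```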
All right, and then our final part of the tech stack that we're going to look at, and this was something I said we're going to focus on at the start, was looking at embodied carbon as part of our designs. So embodied carbon. I think at this point everybody probably knows how important it is as part of the AEC industry to be considering embodied carbon as part of their designs, so I'm not going to explain that to you again. You probably heard it in other presentations. You probably had speakers come to you at work. You probably have initiatives at work to consider this.
But let's talk about how using a solution like this can actually meaningfully contribute to the embodied carbon impact of our designs. So I would say conventional workflows tend to either not account for carbon at all or are simply doing a carbon accounting exercise very late in the process. And the reason for this is you need data-rich models to be able to actually do some sort of carbon assessment.
And so typically, if we've got that conventional workflow of sketches and not much metadata at the start, that's not happening. We can't do any embodied carbon assessment. But with our tool, by giving the engineers the ability to instantly get spatial data, metadata, the elements in a design, we can embed embodied carbon information into it so you can start making carbon-driven designs.
Part of the way we do this is with one of our internal solutions that's also part of the Moata ecosystem, Moata Carbon Portal, which has a big catalog of different carbon assets from different regions and different stages of the carbon lifecycle, and we use that to get our carbon calculation values. It has a bunch of API endpoints, and when the tool runs, it goes and requests from this service and says, I've got these elements in my design. I've got some generators. I've got these materials. Please tell me how much embodied carbon is in them.
The results get stored in the Carbon Portal, but they also get returned to our app as values that can be embedded into those elements. And so we can have those colorful diagrams, we can understand where the hotspots of our carbon are, and we can report it, so we can make informed design decisions about it.
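Moata Carbon Portal is an internal Mott MacDonald service, so the endpoint and payload below are entirely hypothetical; this sketch just shows the shape of the request and response pattern described above:

```typescript
// Hypothetical request to an internal carbon service: send the designed elements,
// get back an embodied carbon value per element (kgCO2e) to embed in the model.
interface CarbonRequestItem {
  elementId: string;
  assetType: string;  // e.g. "generator"
  material: string;
  quantity: number;
  region: string;
}

interface CarbonResult {
  elementId: string;
  embodiedCarbonKgCO2e: number;
}

async function getEmbodiedCarbon(token: string, items: CarbonRequestItem[]): Promise<CarbonResult[]> {
  const res = await fetch("https://carbon.example.com/api/assessments", { // placeholder URL
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify({ items }),
  });
  if (!res.ok) throw new Error(`Carbon request failed: ${res.status}`);
  return (await res.json()).results;
}
```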
All right, let's briefly talk about impact. So now that we've seen the tool, let's talk about what's great about it. So this changes how engineers work. So they can get quality outcomes faster.
They can optioneer. They can design. They can be in their browser, with no software installed. Like I said, remember that-- that's the really cool part. You're in the browser, you're doing engineering design, and you get a 3D model out at the end.
And also, briefly, let's talk about how we would like to scale this up. At the time of this presentation, this is still very much a work in progress, something we're actively developing at the moment, and there's still heaps to do on it. We want to keep expanding our compatibility with different regions and the different requirements they might have. We want to add in way more rooms. We want to improve the quality of our families and our elements.
But the positive is that we've done all the heavy lifting now. We've got that infrastructure set up, and we've been robust in how we've created our APS automations and our front end. So now comes the easy part: writing lots more logic and populating the tool with many more types of rooms and risers.
Let's do a very quick recap of what we've talked about today, and then we'll be coming up to the end. So to start with, we had an opportunity. There was a situation where I would say the current status quo for designing plant rooms is inefficient. It's missing a lot of things and tools aren't being properly leveraged, and we don't have enough information in our data.
As a response to that, we've designed our MEP Plant Room Configurator. As part of that, we're leveraging APS to get from the web to a Revit model with metadata in it. And as a result, our engineers are designing better. They're empowered to model from their browser, and we're considering embodied carbon really early on in the design phase.
We've got a Q&A here. Obviously, this is a recording, so we don't have any live Q&A. But if you'd like to ask me any questions, feel free to reach out. Like I said, you can find me on LinkedIn-- my name's William Hamilton-- or you can reach me on my work email, william.hamilton@mottmac.com. I'd be more than happy to talk about the project, and also about APS and how you can integrate and use it.
That brings me to the end of the presentation. Thank you very much for listening. I hope you learned something and enjoy the rest of the conference.