AU Class

LEEDing with Innovation: BIM + AI Formula for Enhanced Compliance


Description

This session will explore the innovative integration of building information modeling (BIM) and artificial intelligence (AI) in advancing sustainable design, focusing on achieving LEED certification through an AI-powered BIM Process. The merging of AI with BIM marks a significant leap in design precision, efficiency, and sustainability, setting new industry benchmarks. This integration transforms the approach to LEED certification, using AI-driven analytics within BIM tools to make sustainable design accessible and effective. Through practical examples and case studies, attendees will gain insights into the practical application of different technologies—such as chatbots and dynamic dashboards—in real-world projects. This session aims to equip professionals with the knowledge and tools to use the synergy between BIM and AI, pushing the boundaries of what's possible in sustainable design.

Key Learnings

  • Learn about the complexities of adopting AI and digital twins in BIM workflows.
  • Learn how to use AI for automated LEED-compliance tracking and AI-powered assistants for calculations and bulk document analysis.
  • Learn how the combination of AI and a digital twin was used to benchmark an algorithm-based lighting control to achieve cost savings.

Speaker

  • Leo Salcé
    Leo Salcé is a seasoned Technology Strategist with over two decades of experience in design, technology consulting, and research and development. As the CEO and principal of Avant Leap, he spearheads the company's strategy for delivering innovative building life-cycle technology services and solutions. Leo has successfully partnered with numerous organizations across diverse sectors and industries worldwide. His expertise lies in implementing cutting-edge technologies that bridge Building Information Modeling (BIM), IoT, and Artificial Intelligence (AI). Leo's collaborative efforts with building owners focus on optimizing operational and maintenance practices through initiatives like predictive maintenance, and data-driven decision-making using digital twins and real-time building insights. Leo holds degrees in Architecture and Computer Science from Universidad Católica Madre y Maestra and is a licensed architect in the Dominican Republic. He has consulted for hundreds of AECO organizations globally, providing strategic guidance and innovative solutions. He is an educator, author, and lecturer specializing in BIM, Sustainable Design, Digital Twins, and the integration of IoT and Artificial Intelligence within the AECO industry. As a frequent public speaker, Leo presents at international conferences such as Autodesk University, BILT, AEC Next, Digital Built Week, Ashrae, Ecobuild, Techbuild, Constructech, and more, delivering talks designed to inspire and educate.
      Transcript

      LEO SALCÉ: Hello, everybody. My name is Leo Salcé, and I'll do a little bit of a presentation about myself in a second. But my session is on LEEDing with Innovation: the BIM plus AI formula for enhanced compliance.

      To give you a little bit of background before we start: we'll talk about LEED and other green building rating systems' compliance; how BIM model processes can be used for tracking, monitoring, and helping with that compliance; and beyond that, how to go further than model-based use cases into digital twins for existing applications and existing conditions, as well as IoT and integrating AI into the process.

      And that's when we'll talk about automation tools and AI-powered tools that assist with not just LEED but any other potential green building rating system in terms of documentation, compliance, and getting certified. And then of course, address the challenges and, not included on the agenda, talk about the future of some testing scenarios and exploratory examples that I'll share shortly.

      To start, my name is Leo Salcé. I'm an architect by trade and a software developer, and I've spent 20 years in consulting, advisory, and working with companies around the world. Over the last years, I've been fortunate to work on innovative initiatives around digital twins for different intents and purposes, across the whole life cycle: operations and maintenance, but also into construction, and starting early in design. I've talked about this topic and sustainability for a while around the AECO industry, at different conferences and events throughout the world.

      And whenever I have time, then I get the chance to write and publish something, which I haven't done in a while, but definitely passionate about the topic, definitely passionate about how technology can actually bring value to processes and to companies. And in this particular area, it's really very, very relevant to what's going on in terms of climate change and the things that are happening around anything sustainability wise.

      A little bit of background about the company: I run Avant Leap, which is a technology consulting company for AECO, so we provide services and solutions. Basically, as I just said about digital twins, we are responsible for overseeing the strategy and execution of various digital twin initiatives: combining different data sources, developing solutions that may not exist, automating or integrating with different systems, and putting practical AI into processes. Overall, it's about how to optimize project execution and project delivery.

      We've been around for about six years, based in California but also in the Middle East, and have projects all over the world. Here's just a snapshot of some companies that range in size and the type of stakeholders, from A&E, design, construction, and into operations and life cycle. So the topic at hand is LEEDing with Innovation.

      Now, LEED is only one out of many different systems that are used worldwide. In a nutshell, and just to highlight a few things: a green building rating or certification system broadens the focus beyond the product to consider the project as a whole. Basically, it's a certification system that rates or rewards relative levels of compliance or performance.

      Again, a lot of it depends on the location: which one is worth pursuing, which one is mandated based on whether it's new construction, or a certain type of building or square footage. And there are other questions, like which one is more affordable. Overall, it can be overwhelming.

      It's definitely clear that a lot of these initially targeted architectural firms, but they have a broader part in design and construction and beyond, on the life cycle side. Especially with the new version of LEED, version 5, coming out, which is going to have a heavy focus on operations and maintenance.

      Specific to LEED, which is the most utilized rating system in the US, the areas of performance in which buildings earn LEED points have been basically the same for the past few years: sustainable sites, and anything related within that area; water efficiency, indoors or outdoors; energy and atmosphere, which covers a number of things regarding building performance and energy; materials and resources, including material reuse, which has a number of expanded credits; indoor environmental quality; innovation; regional priority credits; location and transportation; and so on.

      But this highlights in a summary what the focus is. And it's all a point-based system, where based on meeting a particular achievable goal, then you qualify for points. And each one has their own structure in terms of the categories that I mentioned.

      For LEED, right now, there are different certification levels, starting from 40 points, which, by the way, when I get to the model-based approach for monitoring and tracking, you can actually achieve LEED Certified by default: you have enough credits and points achievable just by using a model for monitoring and producing the documentation needed for submittal. And it goes all the way to Platinum, which requires 80 points or more.

      But overall, there's been a progression and evolution of the system. I'm highlighting here the current one, which is version 4.1. LEED v5 is going to introduce several significant updates compared to the current version, focusing on improving sustainability, resilience, and social equity.

      Some of the major changes that you're going to see are around innovation and emerging technologies, where today's topic actually applies heavily, especially when you talk about smart buildings and digital twins for making decisions that are more informed and allowing for better tracking, not just on the design side but also on the operational side.

      So I always say, start with the end in mind. There's a whole life-cycle approach here that is very relevant, which I'll share in a little bit. Building performance monitoring and verification has been the case, with a carbon and energy focus, but there's definitely going to be more emphasis on advanced energy modeling, data transparency, and ongoing performance based on outcomes, where buildings are continuously monitored for energy, water, indoor environmental quality like comfort levels, and a number of other things, but also carbon performance: operational carbon, and not just the embodied carbon that has been the main focus.

      So there's going to be a lot of changes. Here's just a snapshot of the area that is going to be very valuable for leveraging digital twins, which is assisting on that end side of the project life cycle after handover and when operation starts.

      So there's a blend of technologies that are going to be very valuable that we're going to showcase today: BIM, IoT, and AI combined, and how to integrate those and turn them into meaningful insights that provide better tools to make better-informed decisions and get to a level of prediction, where it's not only what you can do, but what the system recommends you should do.

      So here's just a snapshot highlight of some of the areas that are key or can be-- that are going to be very valuable or relevant to digital twins. I highlighted the ones that-- again, this is not finalized, but to start preparing, it's something that definitely is worth considering, exploring, and getting prepared for today. So in the end, it's almost like the concept of leveraging building information modeling when, at the end, you produce PDF drawings.

      It kind of defeats the purpose of using this more advanced process if, in the end, you submit a 2D drawing. It's the same idea here with LEED. There are a lot of very innovative things in terms of simulations that can help validate the approach to achieving a particular sustainability goal. But then, in the end, you have to put all this together in a form and manually submit all these things.

      There's some level of automation that I'll explain later. But you need the LEED forms: for each credit or prerequisite, you complete the associated online form provided by LEED Online, which is the portal where you submit this for any particular project to get certified at any level. And then, of course, you have to provide supporting documentation and upload documents, like drawings, calculations, and data sheets, for each of the credits and prerequisites.

      Most people watching this class are probably familiar with the LEED process. But if not: it's a very repetitive and, in a way, very manual process that may require somebody overseeing the entire submittal throughout the design phase, starting from the beginning.

      So when we talk about the model-based approach, this applies to Revit, and it applies to any authoring tool like ArchiCAD, Vectorworks, et cetera. As long as you have a BIM model and it has a parametric database, you have the potential of doing what I'm showing you here. I'll focus on one authoring tool, which is Revit.

      But the concept applies to other tools, including the IFC neutral format for exporting. The model contains a lot of data, so it's about the data. One of the most important things you have to do when it comes to LEEDing with innovation is prepare the data within the BIM model so it can be easily found and treated once you want to start using it: for producing whatever is required for the LEED submittal, for automating processes with computational design, or for tying it together with other tools, emerging or recently on the market, that use some level of AI or are very automated in terms of enhancing existing processes.

      Once the strategy is set out, you can start looking into using a separate commercial tool, or something internal like scripting and automation: things that will enable you to achieve the particular goals you identified. But it all starts with a plan. To better understand: many of the LEED credits rely on calculations to determine if the LEED credit threshold has been met.

      As the project is modified and revised, the credit threshold calculations need to be recalculated, which is why it's very helpful to have all these things in a database repository like Revit, where you can use the model to monitor and track information that is already predefined as you're progressing the design, if you start early on. But again, the calculations happen within the Revit model.

      You can effectively track and monitor a lot of credits, like I mentioned. By default, you can at least achieve certification level, which is 40 points, just based on the things that you can do out of the box, configuring, of course, a Revit template, and, of course, setting up a few things that I'm going to share next. But you can effectively track and monitor a lot of credits that I'll explain, those related to materials, energy, water efficiency, indoor environmental quality, and others that apply within.

      What you need to start, again for Revit, if you want to leverage the model more efficiently for this kind of process, is to set up a template for LEED. That template is going to have a lot of preconfigured schedules and custom family parameters.

      Family parameters are going to cover all the content that is going to be used: for water tracking, plumbing fixtures, or on the outdoor side, maybe the type of plants and landscape; for materials, the kinds of walls and doors, or even structural elements like framing, where you're going to have the parameters that are needed. That's where shared parameters, which can be applied to multiple projects for better control and scheduling, come into place.

      So you can automate a lot of the tracking of relevant data for LEED certification. And like I said, a lot of it is based on formulas, calculations, and meeting thresholds: the schedules can summarize data for specific credits and provide insight into whether a project meets LEED thresholds for energy, water use, material sourcing, and more. In the end, it allows for more efficient management of LEED documentation and more real-time tracking of sustainability goals, especially when you have design options, a feature within Revit that allows you to compare scenarios.

      If you leverage that feature within the design environment and you have all these presets, including guidelines and nomenclature (and this is just focusing on the Revit side), you can still use that model further for other kinds of simulations and analysis, and I'll mention some tools that are pretty popular or common for that process. I always advise having that BIM guide, the BIM plus LEED guidelines and standard operating procedures, especially because they're always going to be evolving and optimized, and things are going to change in terms of technology.

      That is part 1. Part 2 will be the analysis and simulation software, which I'll explain in a little bit. As I mentioned, you need the schedules; this class is not about how to set up schedules, and there are other AU classes on those specific topics, like design options, schedules, working with parameters, and so on.

      But this is a combined formula for you to have a template with all the required configuration so you can start leveraging the model data as you're placing it. There are QA/QC tools, because one thing is setting it up; the other is enforcing it.

      Once you start using this process, the challenge is how you enforce it and prevent other people from doing it differently: they download content from a manufacturer's website and it doesn't have the parameters or the information, or somebody starts using a different template or copy-pastes from one project to another.

      So there's view filters. There are view settings. There's shared parameters. There's all these different components that need to be set up and applied on any given project.

      My advice is that there are different tools already on the market for QA/QC: Solibri, Indicara, and a number of others that allow QA/QC checks against standards. Revit actually has a tool for model checks that audits the model against standards.

      So you can have one standard, and it does the job to a certain extent, and it's a free tool you can use; I think it's called the Revit Model Checker. That said, I'm going to talk about a few examples and use cases.

      Like I said, there are a lot of different credits that you can track and monitor. One of them is outdoor water use reduction, and there's the indoor one that I'll talk about. In the end, like I said, start by preparing the data within the BIM model. This particular credit requires a 30% reduction in outdoor water use between a calculated baseline and the proposed design.

      In the end, you can earn up to 2 points for further reduction in potable water use for irrigation, if you can offset that: a 50% reduction achieves 1 point, and a 100% reduction, meaning no potable water use or no irrigation system installed, achieves the maximum.
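
      To make the math concrete, here's a minimal sketch of the percent-reduction formula that such a schedule (or a Dynamo script) would compute for this credit, with the point thresholds as just described. The function name and the gallon figures are illustrative, not part of any Revit or LEED API:

```python
def outdoor_water_points(baseline_gal: float, proposed_gal: float):
    """Percent reduction in outdoor water use vs. LEED points.

    Thresholds per the talk: a 50% reduction earns 1 point, and a
    100% reduction (no potable irrigation water) earns the maximum 2.
    """
    if baseline_gal <= 0:
        raise ValueError("baseline must be positive")
    reduction = (baseline_gal - proposed_gal) / baseline_gal * 100
    if reduction >= 100:
        points = 2
    elif reduction >= 50:
        points = 1
    else:
        points = 0
    return reduction, points

# A proposed design using 400 gal against a 1,000 gal baseline
# is a 60% reduction, which clears the 50% tier for 1 point.
print(outdoor_water_points(1000, 400))
```

In a Revit schedule, the same comparison lives in a calculated-value formula; the point of scripting it is only that the check then runs on every design option automatically.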

      But it's all about takeoffs and the data in the schedules extracted from the model. It's garbage in, garbage out. If you have preconfigured the right information and parameters in the data, then you can track those compliance formulas: you use a certain landscape type that has all the information about how much water it uses, and then you change the design option and propose something that is regional and doesn't require water. For instance, I'm based in California, where a lot of the time there's a drought period.

      So using local vegetation definitely helps achieve this particular credit, especially when you start looking at landscape design options. It's all a database. In the end, you can do comparisons between design option A and design option B, and you can have as many design options within the Revit model as you're producing, and then have those what-if scenarios compared side by side and printed on a sheet.

      Basically, you can see: what does it look like if I go with this particular plantscape or design layout? Going back a second here, on the top right, you'll see that there are topo surfaces.

      Basically, you can extract areas. If you don't want to be specific with counting individual plantings, you can have square footage specifically determining areas that qualify as a type of shrub or vegetation that doesn't require water or requires less water. It's a very simple formula: you have the schedule, and you have the parameters within the families for these components.

      That's already populated in the model, and that information is shown in the schedule. You may have some key schedules that are prepopulated with parameters based on a certain plant type, for example. Then you can have that formula built into the schedule giving you the compliance: does it meet the threshold, is it achieving 50% or not? If it does, you qualify for the point or go beyond.

      Same thing for outdoor water use: this particular credit for outdoor water use reduction can be achieved through different strategies. Besides the planting options with native and adaptive plants, and of course a better irrigation system, one is greywater or rainwater harvesting. For rainwater harvesting, basically all you need is a roof schedule, and that roof schedule is going to do the math.

      It automatically calculates from the rainfall data that is already entered, giving you information to size the collection system, or what the offset is going to be for things like flushing toilets. So you can use that preset calculation to give you quick, faster insights.
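
      As a rough sketch of what such a roof schedule computes: collectable rainwater is roof area times rainfall times a runoff coefficient, converted to gallons (one inch of rain on one square foot yields about 0.623 gallons). The numbers below are illustrative assumptions, not project data:

```python
def rainwater_harvest_gal(roof_area_sqft: float, rainfall_in: float,
                          runoff_coeff: float = 0.9) -> float:
    """Estimated collectable rainwater in gallons, for sizing a
    collection system or offsetting non-potable uses like flushing."""
    # 1 inch of rain over 1 sq ft of catchment yields ~0.623 gallons
    return roof_area_sqft * rainfall_in * runoff_coeff * 0.623

# 1,000 sq ft roof, 10 inches of rainfall, 0.9 runoff coefficient:
# roughly 5,607 gallons collectable.
print(rainwater_harvest_gal(1000, 10))
```

In the Revit version, the roof area comes straight from the roof schedule and the rainfall figure is the preset value entered for the site.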

      And then looking at the indoor side, so basically the most recent version of LEED 4.1 has the category of offering points for indoor water reduction, which is basically reducing potable water consumption inside the building by using more efficient fixtures, low-flow faucets, toilets, and other water-saving type of strategies.

      In the end, the credit is achieved if you, for example, reduce building sewage conveyance by up to 80%, or if you hit a threshold by going with one particular type of fixture versus another, doing exactly what I just showed you for the outdoor water reduction side. You set up a model with all the fixtures that already have the information, and you have the plumbing schedule for the sinks or whichever fixtures are going to be used.

      Then you have information like gallons per flush for the toilets, flow rate, whether they use greywater, whether the water is non-potable, and the daily use. All of these different things are going to be preconfigured, and based on the type of fixtures you have, you can use that information to simply produce this particular outline, which is what I just said.

      You can achieve up to 12 points depending on the reduction: 25% is 1 point, and it goes up as you increase to 30%, 40%, 50%, 60%, up to 80%. If you hit an 80% reduction, you get the maximum of 12 points. And just between these two credits, we're already getting close to half the points you need to get certified, if you think about it.
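
      The point lookup itself is just a threshold table. The ladder below matches the shape described here (25% for the first point, 80% for the full 12), but the intermediate point values are illustrative assumptions; check the current LEED reference guide for the exact table:

```python
# (minimum % reduction, points) -- intermediate tiers are assumptions
INDOOR_WATER_LADDER = [(25, 1), (30, 2), (40, 4), (50, 6), (60, 8), (80, 12)]

def indoor_water_points(reduction_pct: float) -> int:
    """Return the points for the highest tier the reduction clears."""
    points = 0
    for threshold, pts in INDOOR_WATER_LADDER:
        if reduction_pct >= threshold:
            points = pts
    return points

print(indoor_water_points(55))   # clears the 50% tier
print(indoor_water_points(80))   # maximum tier
```

The same lookup pattern works for any point-laddered credit; only the table changes.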

      Again, all of this is water-conserving fixtures: you have a default baseline, which is already noted in the LEED certification guidelines, and then what you are proposing as a more efficient design in this particular area. Now, one thing I'll talk about in a few minutes is leveraging a little bit of automation: you can get similar results by using visual programming, visual scripting, with tools like Dynamo.

      The results are very similar; you need the same data. What happens is you need that LEED table input and the complete formula so it can be implemented. In the end, you can get the result, basically compliance or not, by having Dynamo prepopulate and give you the data that you need to visualize.
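
      As a plain-Python stand-in for that Dynamo workflow (this runs outside Revit, so every fixture value below is a made-up illustration of what the script would pull from the model), the aggregation and compliance check look like this:

```python
# Each tuple: (fixture name, gallons per use, uses per day, count).
# These rows stand in for data a Dynamo script would pull from the model.
fixtures = [
    ("low-flow toilet", 1.28, 3, 10),
    ("low-flow faucet", 0.50, 4, 12),
]

# Baseline daily use from the LEED baseline fixture table (assumed figure).
baseline_daily_gal = 120.0

proposed_daily_gal = sum(gal * uses * count
                         for _name, gal, uses, count in fixtures)
reduction = (baseline_daily_gal - proposed_daily_gal) / baseline_daily_gal * 100
compliant = reduction >= 25  # minimum credit threshold

print(f"{proposed_daily_gal:.1f} gal/day, {reduction:.0f}% reduction, "
      f"compliant: {compliant}")
```

The Dynamo graph does the same thing with nodes instead of code: collect fixture elements, read their parameters, sum, compare, and write the result back to a schedule or sheet.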

      Again, the schedule is the more traditional way; it doesn't require scripting. But you can also develop tools that help somebody who is not that savvy enter data in a more dynamic environment. Or even go further, and turn this into a plugin.

      So you can turn a Dynamo script into an add-in with a user interface, and that could provide an even more streamlined process of entering data, and then looking at reports, like Power BI; I'll talk about data visualization in a minute. But not to focus only on water: material reuse can give a very good amount of credit points.

      The one that is probably the simplest to track is material reuse. There's a feature within Revit called phasing; you can kind of fake that and devise a workflow for visualizing existing conditions versus new construction. The concept, in the recent guideline for version 4.1, is that material reuse is option 3, under building material reuse in Materials and Resources.

      Basically, you generate different schedules, apply phasing filters to assess how many walls, doors, or other elements can be reused, including structural framing, and quantify what's existing versus what's proposed as new. For example, if you reuse existing structural wood framing, which is very common in multifamily and a lot of residential properties, you can contribute points under this credit: the structural wood framing would be part of the surface area being reused.

      For example, if you're reusing framing from an existing building and it contributes to the required percentage of reused surface, 25%, 50%, or 75%, then it qualifies for points: if you hit 50%, it's 3 points, and at 75%, it's 5 points. It's the same concept.
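
      The reuse check is the same percentage-plus-ladder pattern. Here's a sketch using the tiers just mentioned; the point value for the 25% tier isn't stated in the talk, so it's marked as an assumption in the code:

```python
def material_reuse_points(reused_sqft: float, total_sqft: float) -> int:
    """Points for the reused share of surface area, per the tiers above."""
    if total_sqft <= 0:
        raise ValueError("total surface area must be positive")
    pct = reused_sqft / total_sqft * 100
    if pct >= 75:
        return 5
    if pct >= 50:
        return 3
    if pct >= 25:
        return 1  # assumed value for the lowest tier; verify in the guide
    return 0

print(material_reuse_points(500, 1000))  # 50% reused -> 3 points
```

In practice, the two area inputs would come from phase-filtered schedules: one totaling existing elements kept, one totaling all surface area.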

      I could go on for another hour just talking about use cases where you can use the Revit model specifically for tracking and monitoring credits. Here's a list of others for which we've actually developed a template with specific preset formulas. For example, for building product disclosure and optimization, sourcing of raw materials, you have parameters for materials that are responsibly sourced.

      And sometimes all of this information is a simple check, yes or no. Is this local? Or is this within 500 miles? Yes or no.

      So then it helps you filter and organize the information fast and then be able to produce the data that you want to submit online. And this is without talking about data visualization yet, simply just visualizing the data right within the Revit environment. Again, there's others like heat island reduction.

      Some of them are pretty creative, but they're all similar in terms of the context of using the data that is populated within the content like families. And then those can be later on classified and filtered. And you have dozens of schedules that are specific to LEED credits.

      The thing I'm going to talk about now is visualizing the data faster and more efficiently. There are different ways you can extract the data: there are Excel exporters, a number of them commercially available, and then you can manipulate the information and visualize it with Power BI; there's SQL export; there are different methods.

      But one approach that has worked very well for us, providing a lot of control and flexibility but requiring programming, is an SQL database with a data extractor that can manage multiple models. Especially for larger companies that have more than 100 models throughout different locations, this would be pretty effective.

      I've seen a lot of companies take a Frankenstein-solution approach, where they hire an intern who develops the scripts; that script connects to the database, the database is connected to a Power BI template, and it provides information that is useful for the manager or whoever is using it, let's say, for LEED purposes.

      The challenge is when the person who built it leaves. Then you're in a really bad position, and you hire somebody like us to come in and fix it or redo it, which is what we've done in the past. But the idea is that you have a number of options, some of them low cost, some of them free.

      The learning curve varies. Power BI is one of the most popular, but there are others like Tableau, and you can create other custom visualizers. It's very helpful for extracting the data and seeing it alongside the model. Out of the box, you don't see the model within Power BI, but you can create a Power BI dashboard with the APS Viewer, formerly known as Forge.
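
      The simplest version of that pipeline is just flattening schedule rows into a CSV that Power BI (or Tableau) can ingest as a data source. The field names and rows below are hypothetical, standing in for whatever your extractor pulls from the model:

```python
import csv
import io

# Hypothetical rows, as flattened from a Revit schedule export.
rows = [
    {"element": "Toilet A", "credit": "Indoor Water Use", "value": 1.28},
    {"element": "Faucet B", "credit": "Indoor Water Use", "value": 0.50},
]

# Write the rows out as CSV (an in-memory buffer here; a file in practice).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["element", "credit", "value"])
writer.writeheader()
writer.writerows(rows)

csv_text = buf.getvalue()  # hand this to Power BI as a data source
print(csv_text)
```

The SQL-database variant replaces the CSV with insert statements, which is what makes multi-model, multi-office aggregation practical.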

      So you can have information dynamically shown with charts and graphs, and basically showcase the schedule I was telling you about. At the same time, you can compare side by side with dynamic data, like a graph or chart and the model, and be able to select, filter, and see the information as you need it. It's a more effective way to make decisions.

      But it's not the only one, so you can go further. We've done projects where we've created custom viewers and dashboards that allow for very thorough comparison of models for cost estimating, material takeoff, and quantity takeoff, where you can see the difference between model A and model B and know exactly how much concrete, steel, or wood you have on this project, on that floor, on that level, in that area. It definitely helps accelerate getting to the data, or at least the information, that you need to see.

      And we haven't gotten to AI yet. For now, we're progressing from the default, out-of-the-box Revit model database into using other tools like Power BI to visualize the data. Then, for optimization for LEED, this is where you can apply visual scripting, like Dynamo or Grasshopper. If you start early on, Rhino is a very common tool used by architects for various reasons, but mostly because it's very dynamic and makes it easy to model shapes that are more organic or less constrained.

      And then it has Grasshopper, which is equivalent to Dynamo in terms of programming and visual scripting. What happens with a Dynamo script is that you can pull data from a model. In this case, what you're seeing here is a Dynamo script that pulls data from the Revit model to update a procurement sheet, and that procurement sheet is tied to materials.

      Basically, the idea is that you have a way to dynamically update the sheet based on a simulated project timeline. You can change the material specifications and automatically update a sheet with the material takeoff, the types of materials in a certain area or part of the building, and also the sequence of where things are going to be commissioned or built, or even just that sequencing of events.

      The idea is to reduce the risk of over-ordering or under-ordering materials, improve budget accuracy, and ensure that timely material delivery is matching the progress.

      Now, how complex can you get? I've seen very crazy workflows that use a combination of Grasshopper, Dynamo, and of course some custom programming with C# and other tools with the Revit API to create a plugin. The idea is to use the model information, leveraging the data to actually check the design scenarios, like I said.

      So without even doing the manual work I've shown, it simply happens in the background, so that in the end all you get is a yes or no, compliant or not compliant, or how far you are from hitting compliance.

      A lot of it is proprietary, so there's only so much I can show. But there are definitely interesting cases where people have developed their own, more streamlined process to visualize compliance, yes or no. And again, it's applicable to LEED but also to other green building rating systems.

      Now, some things that were really cool were unfortunately only available for a time. But first of all, LEED Online is the platform portal where you submit the information. It has an API, which is beautiful. What that means is that any developer, or anybody with the skill set, can integrate with it and create tools that streamline a lot of these things.

      So it allows integration with third-party tools and platforms to streamline the LEED certification process. The API provides access to project data, credit information, submission status, et cetera, which is very useful for developers or project managers looking to automate a lot of the documentation submittals and workflows.

      To use the LEED Online API, you have to request permission and an API key, which is usually not a problem; you can request it via the USGBC. Depending on the usage, there may be different endpoints available for project data, document submissions, and review status.

      So usually the APIs that are available cover project management, credit and prerequisite management, documentation, and certification status. Technically, you could rebuild the kind of tool Autodesk used to have in the past, called the Credit Manager for LEED.
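To make the integration idea concrete, here is a heavily hedged sketch of what a client for such an API could look like. The base URL, endpoint path, and response field names are assumptions invented for illustration; the real endpoints and payload shapes come from the documentation USGBC provides along with the API key.

```python
# Hedged sketch of a LEED Online API client. The URL, path, and payload
# shape below are ASSUMPTIONS for illustration only -- consult the actual
# USGBC API documentation after requesting an API key.
import json
import urllib.request

BASE_URL = "https://api.example-usgbc.org/leed"  # hypothetical base URL

def get_credit_status(project_id, api_key):
    """Fetch credit/prerequisite data for a project (hypothetical endpoint)."""
    url = f"{BASE_URL}/projects/{project_id}/credits"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def summarize(credits):
    """Tally attempted points by category, given a hypothetical payload shape."""
    totals = {}
    for c in credits:
        totals[c["category"]] = totals.get(c["category"], 0) + c["points_attempted"]
    return totals

# With a payload like the one assumed above, a project manager could get a
# quick per-category view of where points are being pursued:
sample = [
    {"category": "Energy and Atmosphere", "points_attempted": 10},
    {"category": "Water Efficiency", "points_attempted": 4},
    {"category": "Energy and Atmosphere", "points_attempted": 6},
]
print(summarize(sample))
```

The same pattern (authenticated request, parse, summarize) applies whichever endpoints the certification body actually exposes.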

      Unfortunately, it's discontinued. It was initially designed to streamline the process of managing and documenting LEED credits, to an extent, within the Revit model. But Autodesk eventually discontinued it, and it's no longer supported or available as a standalone tool.

      What we've heard is that the reasons were limited adoption and a shift from Autodesk toward more comprehensive sustainability platforms, like Insight, Forma, and so on. But it opens the door: if it's something your firm consistently does, whether it's regional, like LEED certification, or global with other rating systems, then you can potentially streamline and automate a lot of the processes.

      And technically, because we've tested it, you could leverage the Construction Cloud or BIM 360 through their APIs. Rather than relying on just a local Revit file, you could have information driven, downloaded, or extracted from models hosted in the cloud, in a repository or document management system like the Construction Cloud.

      How do we know? There's a range of APIs available. In theory, and in practice on some things that I cannot show, you can have an ecosystem of services, like a microservice per credit, that is specific to LEED, for example.

      It could be a separate entity where you just leverage the document management side of Construction Cloud, or the field side, or the project controls side. It depends on the information you need. But in the end, this is just a highlight of a possible execution of a project.
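The "ecosystem of services per credit" idea can be sketched as a registry that maps each LEED credit to an independent check function, so each check could later live behind its own microservice endpoint. The credit names, thresholds, and data shapes below are invented for illustration; they are not the actual LEED rules engine.

```python
# Minimal sketch of a per-credit check registry. Each check is independent,
# which is what lets it become its own microservice later. The thresholds
# and model-data shape are invented for illustration.

CREDIT_CHECKS = {}

def credit_check(name):
    """Decorator that registers a check function under a credit name."""
    def register(fn):
        CREDIT_CHECKS[name] = fn
        return fn
    return register

@credit_check("Water Efficiency: Indoor Water Use Reduction")
def check_water(model_data):
    # Hypothetical rule: all flush fixtures at or under 1.28 gallons per flush.
    return all(f["gpf"] <= 1.28 for f in model_data["flush_fixtures"])

@credit_check("Energy and Atmosphere: Optimize Energy Performance")
def check_energy(model_data):
    # Hypothetical rule: modeled EUI at least 10% below the baseline.
    return model_data["eui"] <= 0.9 * model_data["baseline_eui"]

def run_all(model_data):
    """Run every registered check against data exported from the model."""
    return {name: fn(model_data) for name, fn in CREDIT_CHECKS.items()}

model_data = {
    "flush_fixtures": [{"gpf": 1.28}, {"gpf": 1.1}],
    "eui": 45.0,
    "baseline_eui": 60.0,
}
print(run_all(model_data))
```

Adding a credit means registering one more function, without touching the dispatch, which mirrors how a service-per-credit architecture stays decoupled.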

      So to wrap up that particular topic: fortunately, at least in the US, LEED Online has a public API available for project and credit management. So that's the positive news.

      WELL, which is another widely adopted rating system, doesn't have a public API, but it allows for possible custom integrations. So again, some of them support this to different extents.

      So if you're managing large projects or integrating multiple certifications, it's always recommended to engage directly with the certification body about potential custom integrations, and then see if a third-party solution can be developed internally, or even just validate the value proposition of creating one through a third-party software development company or consultant.

      That's the kind of thing that we do, for example. But again, you have to weigh the pros and cons in terms of volume of work and value. One great area that a lot of people don't tap into is existing conditions, existing buildings. That's another segment of LEED certification.

      This is where not just the existing-conditions certification side but also the operations and maintenance side is going to become more relevant as LEED evolves, because the design and the building have to be able to adapt to future problems with water, and how to manage that, or to future weather conditions that aren't yet considered: more hurricanes, more earthquakes, anything related to climate change, really.

      So with digital twins, the first thing you start with is digitizing the assets. The beauty of it is that existing-condition geometry capture, which over the past few years had sort of hit a stagnant point with always the same players, is now evolving fast. Luckily, with the advent of ChatGPT and all this AI investment, there's a lot of new technology and processes in the mix, to the point that scanners now understand, or will understand, what they're scanning, and can identify and classify the elements.

      NVIDIA, for example, has AI models that are going to disrupt photogrammetry, where basically you go from a single or limited set of 2D images to a 3D point cloud in minutes. And then there's the conversion: anything that you scan has to be developed into a model that can be used. A point cloud takes hours to clean and then convert into a model that is usable downstream for design or other purposes.

      So it's about leveraging the emerging technologies that help reduce the time and labor, or automate the process, to produce a model that can then be used to leverage data and make better decisions for the LEED certification process. And it starts with the objectives: you have the point cloud; what do you need to track and monitor?

      This is where people treat a gray area as black and white. They have requirements, like [INAUDIBLE], where everything has to be LOD 500 with all this asset management information when, in reality, they may not really even need it.

      So it's about determining the right strategy per project or per initiative and, in the end, having guidelines to implement, enforce, and check. You can't just copy and paste from project to project, because you're going to end up spending a lot of money. And one of the biggest roadblocks to digitizing assets is the cost of scanning and developing the model to an extent that is usable for design, and then for construction and operations.

      Speaking of operations, then you have IoT, which provides a layer of visibility into what's happening in real time: a dynamic network showing what's happening on the electrical side, the mechanical side, all the systems, with the context to see where things are.

      This is going to be impactful for operational carbon, where there's a stronger focus on more energy-efficient buildings and operations: continuously monitoring energy and water, having better data transparency, and, of course, using emerging tools that qualify for the innovation credits, which encourage the building to leverage smart technology to control, monitor, and predict.

      So what we recommend is to take a Fitbit approach. Most people know what Fitbit is: a smartwatch that monitors and tracks health data, or at least your heart rate and so on.

      In a similar fashion, there are emerging technologies. There are always the incumbents that own market share, like Johnson Controls and Siemens; there's a plethora of them.

      But then there are the startups coming up with solutions that are plug and play. They're easy to install, they don't require a specialized skill set, and you're operational very quickly. Say I want to monitor a particular piece of equipment, like an air conditioner; it monitors vibration, frequency, and a bunch of other physical properties.

      Then they onboard that with their own machine learning algorithms, which learn the threshold of what is normal; anything outside that threshold is an anomaly. That's basically where the prediction comes in: if anything starts acting outside of that threshold, you start getting alerts or notifications, and it learns over time.
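The "learn a normal band, then alert on anomalies" behavior can be sketched very simply. A real product would use a proper machine learning model that adapts over time; here a mean plus or minus three standard deviations over a baseline window stands in for the learned threshold, and the vibration readings are invented.

```python
# Sketch of threshold learning for sensor anomaly detection. The baseline
# window and 3-sigma band are a stand-in for a vendor's ML model; all
# readings are invented vibration values.
import statistics

def learn_threshold(baseline_readings, k=3.0):
    """Learn a 'normal' band from baseline readings: mean +/- k std devs."""
    mean = statistics.fmean(baseline_readings)
    std = statistics.stdev(baseline_readings)
    return mean - k * std, mean + k * std

def detect_anomalies(readings, low, high):
    """Return readings that fall outside the learned band."""
    return [r for r in readings if r < low or r > high]

baseline = [0.50, 0.52, 0.48, 0.51, 0.49, 0.50, 0.53, 0.47]
low, high = learn_threshold(baseline)
new_readings = [0.51, 0.49, 0.92, 0.50]  # 0.92 simulates a failing bearing
print(detect_anomalies(new_readings, low, high))
```

In a deployed system the baseline window would keep sliding forward, which is the "learns over time" part described above.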

      There are others for air quality, for anything you can think of in indoor environmental quality: lighting levels, temperature, humidity, air quality, which can get as deep as detecting pathogens like COVID in the air. So what happens is there's a million options [INAUDIBLE] a million options in the market.

      So what we always recommend is to break it down. For a digital twin data viewer, for instance, where you want to see and combine information from IoT, figure out what's important to you or the company, what's going to be beneficial. Break down that functionality, and do a little research. And then you can hire an intern to do this for you.

      I'm just kidding. You don't have to hire the intern. But I would recommend having somebody that has enough visibility and knowledge over what to look for. Otherwise, they're going to go down the rabbit hole and just spend hours and hours on this.

      The idea is to see what's out in the market that fits the criteria of what you're trying to monitor, whether it's water efficiency, meter information for energy, lighting levels, air quality, and so on. Once you identify the one that does it all, which is very rare to find, then you can make the argument: does it make sense to buy licenses and then worry about maintenance plans and so on? Or is there a case to be made for developing something custom?

      If you go the commercial route, it's about determining tier-level approaches, meaning what information is going to be needed. On the BIM side, you may just need somebody who can access that information anytime, anywhere, so they can make better decisions and address issues faster. That way, when it comes to operational efficiency, which is an important component for existing buildings, it helps with the ongoing monitoring.

      Then, if you need visibility into what's happening on the equipment side, or any other area based on data collection from devices, whether it's air, lighting levels, et cetera, or even space usage patterns, you can determine which tool fits the profile. And the problem you're trying to solve should really be the driver.

      What would be the credits, in this case, that you're trying to monitor that would be valuable for the project, for whatever purpose? And then there's the cost of the digital twin platform method, which is the same approach for anything custom made: if there's nothing commercially available, explore the cost, the investment, and the maintenance plan for developing a tool that would be beneficial for a client, especially if your client has a lot of properties. It definitely makes sense to evaluate that proposition.

      And then one of the important things is building performance monitoring, which is going to get more and more intense, as buildings produce the most carbon in the world: roughly 40% of carbon emissions come from buildings.

      So anything that can be done at whatever scale, especially when you have an owner with not just a few properties but hundreds or thousands, like an Amazon, or a tech company developing data centers like crazy, is going to be very, very impactful, because anywhere you can save 5%, multiplied by 300 buildings, is a significant offset.
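The portfolio math above can be made concrete with a two-line calculation. The per-building energy cost is an invented figure for illustration; only the 5% saving and 300-building count come from the talk.

```python
# Worked example of the portfolio-scale savings argument. The $200,000
# annual energy cost per building is an invented illustrative figure.

def portfolio_savings(annual_energy_cost, saving_rate, num_buildings):
    """Total annual savings from a uniform saving rate across a portfolio."""
    return annual_energy_cost * saving_rate * num_buildings

# 5% saving on a $200,000 annual energy bill, across 300 buildings:
total = portfolio_savings(200_000, 0.05, 300)
print(f"${total:,.0f} per year")  # prints $3,000,000 per year
```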

      In terms of building performance monitoring, that ties into LEED innovation credits and operations and maintenance, where you can pursue credits for advanced operational strategies, like [INAUDIBLE] energy or water monitoring, or air quality. Funny story: during COVID, we evaluated a lot of technologies that claimed capabilities to detect COVID in the air. They exist, but they're expensive.

      So in the air, what you care about is the air quality or if there's issues with ventilation. Really, that's what it comes down to because, if you have ventilation working properly, then the air is filtered and flowing, et cetera.

      The problem is when you have stagnant air. And I don't know if you know, but if you buy a CO2 sensor at Home Depot for like 40 bucks, you can actually detect problems in the air, or ventilation problems. That can, in turn, be visualized with a custom digital twin viewer.

      And I'll show you some examples here, where digital twins customized for specific use cases and intents were developed for purposes related to LEED, particularly LEED credits. One example is electrical panel management.

      This is an airport where there's a pilot to monitor and, in this case, even out the electrical loads. The digital twin is set up around asset and equipment maintenance data: all the applicable electrical systems, monitoring panel boxes and providing alerts and notifications on failures to improve maintenance scheduling. A big goal was to be able to see what's tied and connected and visualize that in a very interactive way.

      This was built on Unreal Engine, which is one of the game engines. It could be done in Unity, another popular engine, and they're both free. And then there's the Forge Viewer, which is another way we've seen it done for people who want to use Autodesk Platform Services.

      But one case noted in the learning objectives is an interesting use case for measuring lighting levels, which is a LEED credit option for indoor environmental quality. What was done here was the airport created a lighting control driven by dynamic sunlight.

      Basically, they did a daylighting simulation, performing a year-long projection with the specific building envelope and location in mind, and set up sensors for different lighting-control zones to adjust the dimming based on daylight levels, in 36 different ranges, while maintaining the same lux level inside. So what's really cool is that the weather changes outside, it gets cloudy, and the lighting adjusts, but you don't notice anything from the inside.

      That simulation projected 15% to 20% savings in annual costs. And what's happening now is that actual sensors are capturing real-world data, which is currently being measured and benchmarked against the digital twin. Beyond that, there's the predictive case for enhanced indoor quality, for things like HVAC systems that monitor weather conditions.
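The core control idea can be sketched as a small function: measure the daylight contribution, then set the electric lights to whatever fraction tops the space up to the target lux, quantized into 36 discrete steps as in the airport pilot. The target lux value and fixture output below are invented; a real controller would also smooth transitions and handle sensor noise.

```python
# Sketch of daylight-responsive dimming: keep indoor illuminance near a
# constant target by dimming fixtures as daylight rises, quantized into
# 36 ranges as described in the airport case. Lux figures are invented.

TARGET_LUX = 500       # assumed indoor illuminance target
FIXTURE_MAX_LUX = 500  # assumed electric-light contribution at 100% output
NUM_RANGES = 36        # discrete dimming steps, per the pilot

def dimming_level(daylight_lux):
    """Return fixture output as a fraction 0..1, quantized to 36 steps."""
    needed = max(0.0, TARGET_LUX - daylight_lux) / FIXTURE_MAX_LUX
    needed = min(1.0, needed)
    step = round(needed * NUM_RANGES)
    return step / NUM_RANGES

for daylight in [0, 250, 450, 600]:
    level = dimming_level(daylight)
    indoor = daylight + level * FIXTURE_MAX_LUX
    print(f"daylight={daylight:>3} lux -> fixtures at {level:.0%}, indoor ~{indoor:.0f} lux")
```

Because the fixtures always fill the gap up to the target, the occupant sees a steady light level while the energy use tracks the weather, which is where the projected savings come from.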

      This is another example: basically a case to monitor performance, but also other conditions specific to the client. The typical visualization for IoT looks like this: a dashboard with a graph of what's happening over a certain period. And if the company is more sophisticated, they may have a floor plan like what you're seeing here.

      The idea is, without the model, IoT is just data. So it's very-- and some of the data may be very hard to interpret. So using a digital twin can enhance the way that you can visualize and process that data.

      So this is one example of using a combination of technologies, the Unity web viewer, Forge, and the Construction Cloud, to end up with an application that was accessible on mobile, based on the client's KPIs: they wanted to see if the building was safe, healthy, and performing well. And they had their own specific metrics, like I said, that were really important for them, like rentable space, CO2 levels, lighting levels, and so on.

      Beyond simulations of what-if scenarios, this particular process lets you use the models for real-time performance, or metrics over time, like workplace optimization and motion-pattern tracking. Post-occupancy evaluation for architects is very manual today; it's all surveys that are manually filled out.

      Imagine if you have sensors in a room or space, where you can monitor whether the design met its intent and how the space is being utilized, and then go beyond that and track temperature patterns, CO2, and lighting levels to see if they're actually meeting the proposed air quality and comfort levels that were noted for the particular LEED credits in the mix. And I didn't touch much on sustainable sites, which, again, is more of an urban planning component.

      In that regard, for sustainable sites, the same digital twin concept can expand to a larger scale for city planning, management, city code consolidation, and so on. The idea is the same: leveraging data from different sources to check transportation locations or accessibility, or any data related to transportation, municipal energy, underground space, and a number of other datasets that can be extracted and manipulated.

      There's a number of tools like S3, Beam Engine. There's ArcGIS as its own. There's a lot of different tools that exist, but you can also, in a similar fashion, create a custom game engine-based digital twin platform that is for larger-scale projects.

      Now, regarding analysis tools, which is what we'll talk about next: first of all, a disclaimer, I don't endorse any of the tools that I'm going to show. This is kind of jumping a step and just using tools that, out of the box, should automate processes applicable to LEED-related information submittals.

      Here's an example. On the left is basically the do-it-yourself approach, where you have visual scripting with commercial and open-source tools and workflows that provide visibility into energy performance. There are Grasshopper plug-ins, like Honeybee, and a number of others that provide specific things related to sustainability.

      And then there's the ones that are commercial that are usually license based, and some of them have an AI component. Some of them have been around for a while. Some of them are emerging tools.

      The challenge is there's a million tools, and there's been a gold rush for AI in the past year since the ChatGPT commotion. So it's very hard to separate smoke and mirrors from reality and see whether anything is applicable. It's a mix, and it's something we can advise on and share additional information about. I'll leave my contact info if you have any questions.

      But let's run through some of these tools that actually have relevance or can help with validating LEED points, starting with Autodesk Insight. Insight by itself cannot be used to validate LEED. But the analysis results from Insight, the images, charts, totals, et cetera, can be used to support your LEED submittal. Some architects actually use the images and charts from Insight as part of their submissions.

      Forma is another tool that is well suited for aiding the LEED certification process, with credits for site selection, environmental performance, and energy efficiency. Most people associate it with urban planning, or with maximizing a lot's layout and generative design scenarios; that's what it's associated with most.

      But it has a lot of sustainability features, like building performance in terms of energy, daylighting, wind, and a number of other things related to carbon data, like providing directional views of embodied carbon choices for major building systems, such as building construction type, facade systems, and so on.

      And it can actually target specific credits: sustainable sites, like heat island reduction; energy and atmosphere, like optimized performance and renewable energy production; indoor environmental quality, like daylight and thermal comfort; even water efficiency.

      So it's got a lot of features. It's just about understanding what is really necessary, what would bring the most value. There are also tools for material embodied carbon. One is called One Click LCA, specialized in life-cycle assessments, and it actually produces the report that is needed for LEED certification.

      So it looks like this. It basically specializes in that, and it's applicable to multiple rating systems, not just LEED. It's very valuable for several credits, like the materials and resources and energy and atmosphere categories, and other performance-based credits.

      There are other tools that are newer to the market, like cove.tool. It's a platform that supports LEED certification and is very robust in its sustainability optimization capabilities. It targets similar credits to the one I just mentioned, Forma, but the question remains: what is the problem you're trying to solve?

      Like the matrix I showed: does the tool do what you need in terms of the type of information you're going to require for submittal and compliance? In the end, what all of these tools do is automate the process, especially on the design side, where you're now making scenario-based decisions with different potential outputs.

      So then they all can provide some level of quick validation of, if I go this route versus another route, what would be the impact? Which, again, that's what you want to do early on, not wait until the end to apply things that don't really matter at that point or are more costly to implement.

      Another tool, just to wrap up this segment, is Digital Blue Foam, which, similar to cove.tool, is good for the early design stage and credits that relate to site analysis and building performance. It allows architects to test scenarios and provides tools for analyzing certain LEED credits, like sustainable sites, protect or restore habitat, open space, even daylighting, and a number of others.

      But like I said, they each have their limitations, so I would go in with a test-and-validate mindset and see whether there's value in automating some of these tasks. Now I'll jump into AI and spend a little time discussing the potential of further streamlining the process of validating and finding information, which is so time consuming.

      There are, though very limited, AI GPTs for certification purposes, for LEED, or for very specific sustainability areas. They're pretrained transformers, basically a domain-specific ChatGPT, trained on data related to LEED or to enhancing the process of obtaining LEED certification. These tools leverage AI to process, analyze, and generate information related to sustainability, building practices, energy efficiency, and so on.

      Now, we've been exploring and developing an experimental tool called Andiamo. It's a chatbot that started off as an Amplify [INAUDIBLE] trained on [INAUDIBLE] practices. It can read documentation in different formats, multiple Word documents, PDFs, et cetera, interpret the information in a more specific way, leverage information from the model, and interact with the user to democratize tasks.

      As for the current process: anybody can create this; it's just very time consuming, and it requires specialized resources and effort. The one I just mentioned, Andiamo, actually creates a local JSON file from which the GPT interacts and learns, but also provides input. Combining the model data with the documentation data is very powerful.

      Once you then bring in machine learning models to identify patterns in the data, that can automatically help with predictions. This is just an example of the simple interface; again, it's an outdated one. But the idea is that you can upload documentation, ask about anything, and then refer to and infer information from the model and from the documents.
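The pattern behind an assistant like that can be sketched in a few lines: export model data to a local JSON file, then fold it into a prompt alongside the user's question. This is not Andiamo's actual implementation; the element schema, file name, and prompt wording are invented, and the actual call to a language model API is left out.

```python
# Sketch of the model-data-to-GPT pattern: export element data as JSON,
# then build a grounded prompt for an LLM. Schema and wording are invented;
# the LLM call itself is omitted.
import json
import os
import tempfile

def export_model_data(elements, path):
    """Write element data (as would be pulled from the BIM model) to JSON."""
    with open(path, "w") as f:
        json.dump({"elements": elements}, f, indent=2)

def build_prompt(path, question):
    """Load the JSON export and embed it as context for the assistant."""
    with open(path) as f:
        data = json.load(f)
    context = "\n".join(
        f"- {e['category']}: {e['material']}, {e['volume_cuft']} cuft"
        for e in data["elements"]
    )
    return (
        "You are a LEED documentation assistant. Using only the model data "
        f"below, answer the question.\n\nModel data:\n{context}\n\n"
        f"Question: {question}"
    )

elements = [
    {"category": "Wall", "material": "Recycled Steel Stud", "volume_cuft": 320},
    {"category": "Floor", "material": "Concrete (30% fly ash)", "volume_cuft": 940},
]
path = os.path.join(tempfile.gettempdir(), "model_data.json")
export_model_data(elements, path)
prompt = build_prompt(path, "How much recycled material is in the walls?")
print(prompt)
```

Grounding the prompt in the JSON export is what lets the assistant "refer to and infer" from the model rather than hallucinate, and the same file can be extended with documentation excerpts.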

      Some examples: it supports a lot of different use cases. These are some of the things we've been exploring, like embodied carbon, the ability to ask it to calculate the current carbon footprint of the project based on A, B, or C, where it provides the information by reading the model and potentially leveraging the documents as well, or being able to quickly ask material-related inquiries.

      It makes sense of tons of information that you would otherwise have to browse and search manually. One definite example was asking the assistant about stucco: what kind of stucco type is this? And it identified the right inference, the abbreviation for it, and so on.

      So there's a number of things that can be performed right now: object quantification, material quantification, being able to dynamically ask questions and look at those what-if scenarios. Right now, the way it's set up is customizable, which means development hours can be invested to achieve the particular things I'm showing here on the screen.

      But the goal here is anybody can build their own GPTs. Like I said, it's just very time consuming and expensive, especially if you don't have the resources. So that's why going with a commercial route might be a better decision.

      Or it just depends on the intent, like how much recycled material do I have in the building? How much reusable material do I have for structural framing on level 1? So all of this is dynamic.

      Instead of having tables, having a GPT that reads model information is powerful, especially for making decisions that you can then compare with documentation. That would be the next level. Now, challenges in the adoption of AI: you've probably heard a lot, like privacy, control of information, ensuring that you own the data for IP, and so on.

      The digital twin has its own challenges as well, so combined, there are a number of things to consider in order to overcome them. If you use a digital twin platform, what is the licensing cost? If you build a custom one, what are the streaming costs? Can you go with desktop versions versus mobile? There are infrastructure needs for IoT sensors, data management, data mapping, even cultural implications.

      Are people in South America going to use this iPad to monitor information that is going to help them make better decisions or enhance the maintenance or operations process? Or is it a gradual adoption? You have to have a strategy.

      And then, what is the AI solution approach, in-house versus commercial, and so on? A lot of it comes down to the data: how you collect it, the what, the how, the when, the who, and the where. Who is responsible, and where is it going to be?

      How is it going to be collected? Where is it going to reside? And so on. And then there's commitment: if you don't have commitment from the executive team, any initiative you have is definitely not going to work.

      Then you look at the piecemeal strategy of people doing a combination of in-house and outsourced the wrong way, which means they don't have a holistic plan for how they're going to leverage the digital twin, for what intents and to what extent, the benchmarking processes, or who is going to maintain the model.

      And if you do have data issues, what are the consequences of not having the right tools, the lack of ownership, missing requirements, which happens a lot, and missing processes?

      In the end, there's a lot that can go wrong if you try to do it yourself without the experience. In the cases we've seen, firms either hire and build a team in-house, or they partner with a consulting company that can help them achieve certain levels of implementation, based on what I just showed you here.

      I'll wrap up with the following, which is some of the experimentation that we've been doing. And some of them is driven by clients. Some of them is driven by internal efforts.

      An example I just mentioned is the ability to read documentation for compliance checks. One way, which I showed, is the JSON file. There are different ways you can have model data [INAUDIBLE] with the user, but then tie that in: we built a little prototype for an inspector, which can be set up so that it reads the documentation and breaks it down into categories and chapters. Then you can use it for model compliance checks.

      Right now, again, this is an internal tool that just proves the concept that you can certainly use the model data and do compliance checks against information exported from the documentation. It can check against ADA compliance, or, in this case for LEED, does it have information about gallons per flush for a water efficiency credit? Then it can extract that, export it to Excel, and from there, you can use the GPT to ask the question.
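The documentation-extraction step for a water efficiency check can be sketched with a regular expression: pull "gallons per flush" values out of spec text, then compare the average against the 1.6 gpf baseline for water closets (the EPAct 1992 baseline that LEED's indoor water use calculations reference). The spec excerpts are invented, and a real tool would parse Word or PDF files rather than a string.

```python
# Sketch of extracting gallons-per-flush values from spec text and scoring
# them against the 1.6 gpf water-closet baseline. Spec text is invented;
# a production tool would read Word/PDF documents.
import re

BASELINE_GPF = 1.6  # EPAct 1992 baseline for water closets

def extract_gpf(spec_text):
    """Find 'X gallons per flush' / 'X gpf' mentions in a spec."""
    pattern = r"(\d+(?:\.\d+)?)\s*(?:gallons per flush|gpf)"
    return [float(m) for m in re.findall(pattern, spec_text, flags=re.IGNORECASE)]

def percent_reduction(gpf_values):
    """Average reduction versus the baseline, as a percentage."""
    avg = sum(gpf_values) / len(gpf_values)
    return round((1 - avg / BASELINE_GPF) * 100, 1)

spec = """
Water closet WC-1: dual flush, 1.28 gallons per flush.
Water closet WC-2: 1.1 GPF high-efficiency fixture.
"""
values = extract_gpf(spec)
print(values, f"{percent_reduction(values)}% below baseline")
```

The extracted values are the rows that would be exported to Excel, and the question-answering step over them is what the GPT layer adds.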

      There are different scenarios. One we're excited about: there's a lot of focus on generative AI for design renderings, meaning you're able to generate pretty pictures and then modify and present design scenarios graphically. But beyond the graphics, can you actually extract information from the image about what's in the picture? Can you get data about quantities, about the types of elements that are there, so it can tell you, from a photo or a video, what the condition is like?

      Does it identify, for example-- I'll just show you one example here-- the segmentation of the image? What that means is you can find patterns; first, you can identify what something is in the picture.

      Today, there are AI models, which we've already implemented and have working, and anybody can actually try themselves, that can segment images and detect what's what in the picture. Beyond that, besides knowing what something is, can you tie it into a database so it knows, how much wall do I see here, just based on the scale it's interpreting?

      That dimensional context is going to get better and better. Eventually, you'll take a picture, and it will say there's x amount of information, x amount of material, or 50 elements of a certain type.

      And then, in this case, one thing we were just simulating was: once you identify the objects, can you tie them to, for example, a cost database--the way Pinterest identifies a product and its cost? Then it tells me, based on what you're looking at, that you're looking at a $500 kitchen or a $5,000 kitchen, or gives a particular type of information related to sustainability.

      Based on what I'm identifying in the picture, I'm seeing x level of LEED compliance, for example. So just to convey the example with images: you can identify elements today, right now. It can tell you that there's carpet on the floor, that that's a bed, and that that's a wood floor.

      But then can you control or provide insights from that--maybe connected to a website, which you can, or maybe to an actual database, like an IKEA catalog, that knows the kind of wood that was used on a table? So the idea is, in theory, you can have controllers that regulate the generation of those images and then relate it to LEED certification, or at least to sustainability, and provide the insight that is going to help you make better decisions.
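That lookup step can be sketched as a simple join between detected labels and a product catalog carrying sustainability attributes. The catalog entries, the `fsc_certified` flag, and the detected labels below are invented for illustration; they are not a real IKEA dataset or the speaker's actual implementation.

```python
# Hypothetical sketch: relating detected elements to a product catalog
# that carries sustainability attributes. All entries are made up.

catalog = {
    "wood table": {"material": "oak",    "fsc_certified": True,  "cost": 450},
    "carpet":     {"material": "nylon",  "fsc_certified": False, "cost": 300},
    "wood floor": {"material": "bamboo", "fsc_certified": True,  "cost": 1200},
}

def sustainability_insights(detected_labels, catalog):
    """Look up each detected element and report what the catalog knows."""
    insights = []
    for label in detected_labels:
        entry = catalog.get(label)
        if entry is None:
            insights.append(f"{label}: not in catalog")
        elif entry["fsc_certified"]:
            insights.append(f"{label}: FSC-certified {entry['material']}")
        else:
            insights.append(f"{label}: {entry['material']}")
    return insights

# Labels as a segmentation model might emit them for the room photo
for line in sustainability_insights(["carpet", "bed", "wood floor"], catalog):
    print(line)
```

A real pipeline would also need confidence scores and fuzzy matching between model labels and catalog names, but the principle--detection feeding a sustainability database--is the same.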

      So we're now going past the Revit model, the BIM model, then automation tool scripting and some of the commercial tools that help with simulation, then AI--and within AI, not just chatbot information extracted from the model or from documents, but now images. Once you have these so-called multimodal models, you can combine images with models, with data, with anything.

      So in the end, the future ahead is really exciting in terms of the potential. I've seen some really interesting pilots and initiatives being performed in different [INAUDIBLE] firms. But hopefully this gives you good insight into all the different methods--what's possible and practical, and what tools could potentially help in achieving LEEDing with innovation, which is really what the topic of today's conversation was.

      So I want to thank everybody for taking the time. I appreciate you listening to the session, and hopefully you liked it. And if you do like it, recommend it, and you can add a comment on the class page. Thank you so much, and I'll see you in San Diego.