Description
Key Learnings
- recognize differences between BIM deliverables, information standards, asset data obligations and open BIM
- implement a planning approach to adopt the right tools and processes for a given project
- understand how a platform approach can offer long term flexibility for changing demands
- challenge poorly crafted contractual requirements and offer recommendations on remedies
Speaker
MAREK SUCHOCKI: Welcome to my class today, entitled Demystifying Technology and Standard Selection for Infrastructure Projects. My name is Marek Suchocki. I'm a sales development executive at Autodesk.
I am actually a chartered civil engineer involved in numerous working groups, specifically in the UK, such as the Institution of Civil Engineering Surveyors. I'm also involved with the UK BIM Alliance. I've been a member of the British Standards committee that authored the UK government BIM guidance, and now the ISO 19650 standards. I also represent British Standards on the European standards committee, for work on common data environments and BIM for infrastructure. And finally, I'm a member of the buildingSMART International InfraRoom Steering Committee.
I'm going to kick off by asking a question: what are clients in the infrastructure space demanding today? And I'm going to start by really considering the asset data lifecycle. So here's an illustration of how data specification and delivery build up over time.
So from the outset, we need to define the data we require. And that includes capturing what we already have-- so existing asset records-- but perhaps also capturing information about the legacy assets. And also defining the data that we want moving forward. And then we can go on to actually create that data based on those specifications. And that will obviously involve creating digital models-- so BIM as we know it today, controlling the data process, ensuring that we've got the right data coming through, and hopefully collaborating, using a platform such as a common data environment.
So now we've prepared that data. We need to link it. We need to connect it, perhaps, to other data sources, such as Internet of Things, perhaps other asset data records that we know exist. And we're going to build what might become in total, like a digital twin. So we've specced our data. We've prepared our data. We've linked our data. And now we can go off and start using that data.
And that's stuff that we can use in the operation, perhaps, of assets, to maybe understand more about our assets, and ultimately, to also start defining what type of data could we have in the future. Because now we've built up a core that we can work from. Now that work isn't done by one single entity or one single organization. So different stakeholders get involved through those different phases of the lifecycle of the data.
So when it comes to defining that data, clearly the investor, the owner-operator, needs to be involved. But they might seek support and advice from specialist consultants, organizations such as Autodesk or our partners. And then during the data creation, they'll appoint design consultants, contractors, subject matter experts. And obviously project managers will be engaged to ensure the quality of the data that's coming through.
And then as we connect that data again, organizations such as Autodesk might get engaged, or our partners. But then there will be others who have got specialisms, perhaps in that Internet of Things area, or maybe in artificial intelligence. But we're starting to interrogate the data to get more value out of it.
And of course during the usage phases, then we've got different actors again. We've got helpdesk, perhaps. The public, they might also want to link into our data. People doing maintenance work and so forth. So we've got a different audience getting engaged during the different phases of the lifecycle of the data.
So let's consider what type of data specifications have some of these infrastructure owners already put out there. Well, here's an extract from the asset data requirements from Network Rail in the United Kingdom. They've got tens of thousands of asset types, with different requirements per asset type, that need to be delivered against when providing new project data or updating maintenance records.
We did some work a few years back with High Speed Two, a major infrastructure project in the United Kingdom. And the project there wanted to specify future-proof data formats, so that as that project matured and ultimately delivered the new high-speed railway, that information that would have been captured was still going to be valuable, even if it was captured 10 years previously. And also, High Speed Two had a real strong commitment to using BIM. So they wanted to find formats that were BIM-friendly from the outset.
We did this study. We consulted with many software vendors, including ourselves, to understand what types of BIM formats they are able to provide. And that led to this shortlist being identified, which actually went into the contract, so that the supply chain needs to deliver to these formats, by these processes, in order to provide good quality data. Some of that is model information. Some of that will include document data. But a lot of it will also be pure data-- whether in an XML format, perhaps IFC, or JSON.
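To make the "pure data" deliverable concrete, here is a minimal sketch of what an asset record exchanged as JSON might look like, with a round-trip serialize/deserialize helper. All field names and the classification code are invented for illustration; they are not from the High Speed Two specification.

```python
import json

# Hypothetical asset record illustrating a "pure data" deliverable
# alongside model and document data. All field names are invented.
asset_record = {
    "asset_id": "BRG-0042",
    "asset_type": "Bridge",
    "classification": "EXAMPLE-CODE-01",  # illustrative classification only
    "attributes": {
        "span_m": 24.5,
        "material": "reinforced concrete",
        "year_built": 2021,
    },
}

def serialize(record):
    """Serialize an asset record to a JSON deliverable string."""
    return json.dumps(record, indent=2, sort_keys=True)

def deserialize(text):
    """Load a JSON deliverable back into a dictionary."""
    return json.loads(text)
```

The point of a neutral format like this is exactly the round trip: any party in the supply chain can produce or consume the record without a proprietary tool.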
National Highways-- the operator of the UK's strategic road network. They've got an extensive Asset Data Management Manual which sets out the data requirements that must come from their supply chain. This includes the way you break down the different elements of the network, the asset record types, and even the asset data handover workflow that's illustrated here.
Environment Agency-- they manage coastal defenses and their maintenance for the United Kingdom. They've put a huge amount of effort over the past nearly 10 years into looking at how they can capture better asset data, but also provide it back to their supply chain. They've got a very comprehensive asset data requirements library online, which any organization working for them can plug into and receive information, perhaps in a JSON format or CSV. But they also provide information using COBie-- the Construction Operations Building information exchange format that the UK started specifying under our BIM program.
It's not just the UK that's doing this. Here's an example from Vayla, the Finnish transport infrastructure agency. They've been putting a large amount of effort into improving the data they receive from their projects, in order to better maintain their networks. And they came up with an extension to LandXML called InfraModel, which from 2014 was known as InfraModel 3 and became a specification that their supply chain needed to deliver to. They have since moved on to InfraModel 4, but both InfraModel 3 and InfraModel 4 are obligations on suppliers within Finland when providing data back to Vayla.
And then my final example here is, the United States is also getting on stream here. A report came out just a couple of months ago in June '21 from the US Federal Highway Administration, talking about an investment in BIM for Infrastructure. Looking to move away from a CAD and paper-based process that was predominant within the departments of transport to this BIM-based approach, to leverage common data environments for managing information, implementing class libraries for asset data. And also recommending the use of IFC for infrastructure during data handover. And I'll speak more about that later.
So, how might you go about addressing these very complex requirements that I've illustrated? First and foremost, you need to consider planning. And I mentioned that I was part of the British Standards committee that authored the UK government BIM guidance, which led to the ISO 19650 series. These were first published as parts 1 and 2-- covering the concepts and principles, and information management during the delivery of projects-- late in 2018. Last summer, in July and August 2020, parts 3 and 5 were published, one looking at BIM in operations and the other at data security considerations. And there are two other standards in flight at the moment-- BIM for data exchange and BIM for health and safety-- which will probably get published sometime in 2022.
What does ISO 19650 look at? Well, it's actually there to address those standard industry challenges, which I'm sure many of you are very familiar with. Basically looking at productivity improvement, eliminating risk as far as possible during the delivery of projects, and having better project outcomes. Does the client get what they wanted? A lot of this was learned through the UK BIM mandate program, and the link I've shared at the bottom there. It was really amazing that on the first project we did, 20% savings were identified by Her Majesty's Treasury on a prison project. So BIM is a catalyst for change, and has demonstrated measured benefits and return on investment.
ISO 19650 also picked up some other concepts that had been developed during the UK BIM mandate process, such as the common data environment. This is a way of managing information production and usage during the project lifecycle, and obviously also into operations. And it talks about moving information from a state called work in progress, where a task team might be working, to where you can share that information amongst other task teams. And then at a point when it's deemed usable, perhaps by a following party such as a contractor, it can then get published. And these movements of data from the work in progress state through to shared and into published, are all controlled through the workflow. And that workflow involves approvals and reviews in order to ensure you have only passed stuff on that is fit for purpose.
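The work-in-progress, shared, and published states and their approval gates can be sketched as a small state machine. This is a plain-Python illustration of the workflow just described, not any real common data environment API; the class and method names are invented.

```python
# Sketch of the CDE state transitions described above:
# work-in-progress -> shared -> published, each move gated by a review.
# Names are illustrative, not from any real CDE product.

TRANSITIONS = {
    "work_in_progress": "shared",
    "shared": "published",
}

class InformationContainer:
    def __init__(self, name):
        self.name = name
        self.state = "work_in_progress"

    def promote(self, review_approved):
        """Advance to the next state only if the review/approval passed."""
        if not review_approved:
            raise ValueError(f"{self.name}: review failed, stays in {self.state}")
        if self.state not in TRANSITIONS:
            raise ValueError(f"{self.name}: already {self.state}, no further state")
        self.state = TRANSITIONS[self.state]
        return self.state
```

The key design point mirrors the standard: there is no path from work in progress straight to published, so nothing reaches a following party without passing through the shared state and its review.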
ISO 19650 also identifies key phases and steps in the information management lifecycle. And everything starts with the owner operator. They create what's called an exchange information requirement. And this EIR details what information the client will want during the different phases of the project lifecycle, and hand back at the end of the project. The supply chain will respond with a BIM execution plan, detailing how they might deliver that exchange information requirement.
Based on those responses, a preferred supplier is appointed. And they then fully develop that BIM execution plan into a post-contract BEP, which itself is enhanced through a master information delivery plan that leans heavily on task information delivery plans from each of the task teams. As you can see, all of this involves a lot of planning, because the word "plan" runs all the way through these different steps. And that planning requires a whole-team focus, to understand what we can do in order to deliver that information at the right time, and obviously develop the models during the lifecycle of the project.
One of the clauses of ISO 19650 part 2 really emphasizes the need to test the mobilization plan. Can we actually do what we're promising? The clauses here, which I've highlighted, talk about testing and documenting the proposed production methods and procedures. What software are you going to use? Can you use it? Can you share information between the different task teams, in all those formats? Can I hand back data to the appointing party, the client, that they specced in the exchange information requirement? And are we able to configure our common data environment and ensure it is working correctly?
As you might appreciate, this is work done before you go out and start preparing information in anger. Test it. Can you do it? Be confident that you're not setting yourself up for a failure. Now a lot of that planning requires some basic understanding of what is possible, in order to address those challenges that were specced there in these clauses I just shared.
And a lot of that also focuses around interoperability. Can I share data between the different actors and the different software tools that I'm using? Now here are a few infrastructure interoperability considerations for you. So first and foremost, if you can do a direct integration, leveraging existing interoperability between software tools, and using standardized workflows, that is the least risk and fastest way to go about preparing model data, and providing the information that is being specified by the owners or the appointing parties.
If the standard methods don't exist, you can also script them. Using tools such as Autodesk Dynamo, it's possible to create customized workflows using computational methods. It's also possible to leverage cloud-based platforms, using open application programming interfaces-- APIs. And this allows you to bring in third-party tools that perform specific tasks, or bring in other data sets that might be needed in order to satisfy the EIR.
And finally, it's also very important to consider open standards, particularly when the clients themselves say, look, I don't want to specify proprietary formats. I want neutral formats that I can put into my contract, and be confident that I'm not excluding any players within the market, while allowing me to use a standard tool for analyzing the data I receive and verifying that it meets my needs. So they might want to say, let's have open standards as part of the process.
And by the way, none of these approaches has to be used alone. These are all to be considered together-- they're not mutually exclusive. Actually, you might well use more than one of these on any single project. Just some final thoughts around this approach. I've grouped these in a particular way, because it's important to understand that the complexity of adopting these approaches is much lower on the left-hand side of this diagram.
Direct integration is out of the box. Any modeling-tool user should be able to adopt and implement it. As we move towards the use of open standards, you may need specialized skills and capabilities, and perhaps also specialized software, in order to fulfill obligations or use this approach. However, the plus side is that you get greater flexibility and universal application as we move towards the right-hand side of these options. Because direct integration will satisfy a specific workflow, whereas open standards can actually cover a whole range.
So let's consider each of these in turn, starting with direct integration. Autodesk is really focused on ensuring that our products work closely together. And we've developed a number of workflows for different applications. And here's a subset from a bridge workflow. And I'm just going to illustrate using our InfraWorks toolkit, and Autodesk Revit.
Actually we start here, with Civil 3D, where a bridge alignment and corridor is designed, and passed across to an InfraWorks user, who has built a rich context model of the new railway line, and where the bridge will go. So they've now modeled the bridge. And we're going to be able to move over and have a look at that bridge in context, and show, yes, it is a railway line traveling over a bridge deck.
We might be able to tweak some of the parameters that came through from Civil 3D, because we now have a clearer understanding of the context. So we might want to lift the deck or just tweak the profile. Hopefully, things are OK.
So if things are satisfactory, we can actually now send that bridge deck that's been modeled over into Revit. Why might we do that? Well, because Revit is fantastic at doing complex detailed modeling, and a process, such as here, placing reinforcing bars within a column header. So we're using the right tool for the right job. And we've passed data from Civil 3D, into InfraWorks, and now into Revit. And we're leveraging the power of the appropriate tool at the appropriate time.
Here's another example using InfraWorks. It's a proposed bridge in Lyon, France, where the engineer has realized that there's some missing data she needs to grab about underground utilities, because she has a concern that a pipeline could intersect with the bridge piers that have been modeled for this new bridge.
So she's now brought that data in, linking directly to an Esri GIS system. So this is a direct integration between an Autodesk technology and a third-party platform. It looks like there's no pipeline intersecting. So that's great news for us, because there isn't a clash.
However, we've got a site operative who's gone out, using a mobile Esri application here, to digitize some missing records, because not everything got picked up during the original survey. So the operative's going out using this Esri mobile tool, digitizing the position of missing pipelines and manholes, and then sending that back to the ArcGIS backend system.
Now in almost a real time situation, our engineer, she's now able to refresh the data. Because she's been advised, actually there was stuff missing. And now during that refresh she's pulling the data forward. And lo and behold, yes. We do have a clash. That pipeline is going right through the foundations of our bridge pier.
You can imagine that this would have been catastrophic if it was something identified during the construction phase. But because we're linking the available legacy data records that were identified during that planning phase, we know something's there. It's been updated, refreshed. We're able to mitigate the potential clash by moving the bridge pier and reorienting it, avoiding a clash that could have been very costly if it was found during the construction phase.
Now let's look at scripted integration. Here we're looking at a Civil 3D model of a highway. And in this case, the model is linking to a Dynamo application. So the Civil 3D user is saying, let's go and place some assets-- in this case, lighting columns-- at a regular interval-- a regular set offset from the highway alignment. We can actually change that spacing purely by adjusting a small parameter.
We might also want to look into a catalog-- in this case, a CSV file-- showing where we're going to place some street signs. Literally by saying at this offset-- at this coordinate, perhaps, offset a road sign. Add some more signs in. Using a catalog for lookup, and placing them automatically. And here we're now going to also place some crash barriers along the side of the highway. Everything based on the alignment of the road.
Now why was this illustration really important? Because quite often, the road alignment might move. In a traditional process, that would involve a huge amount of rework-- so reposition those lighting columns, crash barriers, and street signs. Here, if the alignment moves, the engineer just reruns the script and repositions them automatically.
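The parametric logic behind such a script can be sketched in plain Python (this is not Dynamo or Autodesk API code; the alignment here is a simple polyline standing in for a real Civil 3D alignment object). Rerunning the function after the alignment changes regenerates all positions, which is exactly the rework saving described above.

```python
import math

def place_along_alignment(alignment, spacing, offset):
    """Place assets at regular chainage intervals along a polyline
    alignment, offset perpendicular (to the left) of the local direction.
    Plain-Python sketch of the parametric logic a Dynamo script performs."""
    # Cumulative chainage at each vertex of the polyline.
    lengths = [0.0]
    for (x0, y0), (x1, y1) in zip(alignment, alignment[1:]):
        lengths.append(lengths[-1] + math.hypot(x1 - x0, y1 - y0))

    points = []
    s = 0.0   # chainage of the next asset
    seg = 0   # index of the segment containing chainage s
    while s <= lengths[-1]:
        while lengths[seg + 1] < s:
            seg += 1
        (x0, y0), (x1, y1) = alignment[seg], alignment[seg + 1]
        seg_len = lengths[seg + 1] - lengths[seg]
        t = (s - lengths[seg]) / seg_len
        # Unit direction of the segment and its left-hand perpendicular.
        dx, dy = (x1 - x0) / seg_len, (y1 - y0) / seg_len
        points.append((x0 + t * (x1 - x0) - dy * offset,
                       y0 + t * (y1 - y0) + dx * offset))
        s += spacing
    return points
```

For example, a straight 100 m alignment with 25 m spacing and a 5 m offset yields five column positions; move the alignment and one rerun repositions them all.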
Here's a similar example. In this case, a railway line rather than a road, where the Civil 3D user has placed a rail corridor, and also proposed where the station platforms might exist. Again, they're using the Dynamo script to place numerous assets such as signaling, overhead electrification, and other features.
And the model can also then be passed from Civil 3D. We can interrogate it in Civil 3D, but we can then pass that model also, as we showed previously, into InfraWorks to check it against the visual context. Does it look right? And if yes, we're all ready to go.
Now what we want to do is, we want to build station platforms. But rather than building them as a custom development, we're saying, well, let's use them as a rule as an offset from the rail alignment. And we can actually do this programmatically. So rather than modeling it independently, the Dynamo script here creates the platform feature correctly, based on the alignment. And again, if that rail alignment should move, those platforms can be repositioned automatically.
And what we've done here is, we've actually sent that set platform data, along with all of the other rail information we showed earlier, straight into Revit. So we've developed a very rich station model, completely using a mixture of applications-- Civil 3D, checking in InfraWorks, and now into Revit to do some more detailed development, as well as all of that scripting that we talked about.
So let's consider using open APIs and cloud. From an Autodesk perspective, we're really focused on our construction cloud solutions, allowing all phases of the AEC lifecycle to be integrated, connecting both the teams that are involved and the data as well, and using standardized workflows from design through to operation. It's all what we call the Autodesk Construction Cloud, with various modules for different processes and different participants in the process, but also making sure that we can link the model authoring tools and leverage collaborative working throughout.
And it's all based on our Autodesk Forge foundation, which itself runs within the Amazon cloud. Autodesk Forge is a set of components that we use, but also share with our partners. So the Autodesk Construction Cloud isn't just an Autodesk approach. It's a way that we can integrate different service providers. They can bring their data in. They can access our information, and build on a core-data-at-the-center approach, as opposed to lots of federated data silos. We're bringing it together using this construction cloud and API approach.
And just by example-- I mentioned Esri. We integrated with Esri data earlier, where the engineer in Lyon pulled some data from an Esri backend source. In this case, we're actually within an Esri application-- something they're going to be launching at some point soon, called GeoBIM. And they're linking directly into a construction cloud platform and viewing a full Revit model. That model appeared as just a basic shape in the Esri application, but here we can pull all of the records-- in this case, also exploding the building model. And we could go to any sheets and other information that's available from the Autodesk document environment.
Another quick example is from a UK company called Datum360. They similarly link into our model viewing toolkits and backend document management. And they're focused on ensuring data requirements are met. What we've done here is work with Datum360 to test using the asset data requirements I showed you earlier, from Network Rail. They were able to import, within a matter of minutes, many tens of thousands of data requirements, and set up their class library management tool, called CLS 360, so it was ready to test whether our projects deliver accordingly-- but also to give the project the tools to codify data correctly. So everything's been brought in correctly, and the users can now ensure that when they populate asset records, they're doing it exactly against the specifications that come from the client.
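The shape of that requirements-import-and-validate workflow can be sketched as follows. This is an illustration only: the catalogue columns, asset types, and attribute names are invented, and this is not the CLS 360 API or the Network Rail schema.

```python
import csv
import io

# Hypothetical requirements catalogue, one row per (asset type, attribute).
# Column names, asset types, and attributes are invented for illustration.
REQUIREMENTS_CSV = """asset_type,attribute,required
Signal,install_date,yes
Signal,lamp_type,yes
Signal,paint_colour,no
"""

def load_requirements(text):
    """Build {asset_type: [required attributes]} from a CSV catalogue."""
    required = {}
    for row in csv.DictReader(io.StringIO(text)):
        if row["required"] == "yes":
            required.setdefault(row["asset_type"], []).append(row["attribute"])
    return required

def missing_attributes(record, requirements):
    """Return the required attributes absent from an asset record."""
    needed = requirements.get(record.get("asset_type"), [])
    return [attr for attr in needed if attr not in record]
```

Loading the catalogue once up front means every asset record populated later can be checked against the client's specification automatically, rather than by manual review.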
Now moving to my final set of interoperability considerations, which is open standards. First and foremost, open standards aren't singular. There are many out there-- for sharing geospatial data, for instance, which is where OGC, the Open Geospatial Consortium, might come in. Or LandXML, focused very much on infrastructure information. And buildingSMART International-- I'll come on to talk about them extensively later on. They're very much focused on providing a data model for the construction industry.
So let's quickly look at LandXML and InfraModel. I mentioned Vayla, the Finnish transport agency, had released a specification called InfraModel, way back in 2014. They extended the LandXML data model, adding the missing information, and that became InfraModel. We were very quick to support the InfraModel extended requirements, using a plugin developed with our partner Symetri in Finland. It allows import and export of InfraModel data from Civil 3D. And Symetri have gone on to include the InfraModel 4 extensions within the local Finnish version of their Naviate plugin. So anyone working in Finland can use Civil 3D to ensure they meet the Vayla data requirements using InfraModel.
I'm not going to go through every single open standard that's out there. Suffice to say that there are many to pick from, and these are the ones I feel are most appropriate to civil infrastructure. So E57 for point cloud data, DEM for digital elevation models. CIS/2, for structural information, has been around for many decades now. ISO 15926 is an oil and gas data specification. And STEP, ISO 10303, is very prevalent within manufacturing and, interestingly, also served as the basis for what became IFC, or Industry Foundation Classes, which buildingSMART manages.
So what exactly is IFC? Well, first of all, as I mentioned, it's developed by buildingSMART. It's a data model describing architectural, building, and construction industry data. So it's a hierarchical perspective on the built environment. It is entirely platform neutral. It is an open format, so it's not controlled by any single party. And it is very much object based. So we're talking about discrete entities within an asset hierarchy for that data model.
It is also a human-readable ASCII format, using the EXPRESS language that came from the STEP standard. It is published in many forms, but principally the STEP approach. And it's also been published as an official international standard-- the most recent version, IFC 4, in 2018.
It does have some limitations, partly because it is ASCII and human-readable, requiring the file to be read from start to end each time. And it doesn't take a huge amount of imagination to realize that an IFC data model for a project such as a building could include tens, if not hundreds, of thousands of discrete objects, which, if it's all within one single file, can take a very long time to read. And actually, that's been a key restriction on IFC's wide usage.
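You can see the sequential-read constraint in the file structure itself. Below is a heavily abbreviated, illustrative STEP fragment (the identifiers and attribute values are placeholders, and real files carry complete headers and far more attributes), together with a sketch of the line-by-line scan a consumer has to make.

```python
import re

# Minimal, illustrative fragment of an IFC STEP (ISO 10303-21) file.
# GUIDs and attributes are placeholders; real files have full headers
# and tens of thousands of entity lines.
IFC_FRAGMENT = """ISO-10303-21;
HEADER;
FILE_SCHEMA(('IFC2X3'));
ENDSEC;
DATA;
#1=IFCPROJECT('0YvctVUKr0kugbFTf53O9L',$,'Demo',$,$,$,$,$,$);
#2=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',$,$,$,$,$,$,$);
#3=IFCWALL('3ZYW59sxj8lei475l7EhLU',$,$,$,$,$,$,$);
ENDSEC;
END-ISO-10303-21;
"""

ENTITY = re.compile(r"^#\d+=([A-Z0-9]+)\(")

def count_entities(step_text):
    """Tally entity types by streaming the text line by line --
    the start-to-end read that limits very large single-file models."""
    counts = {}
    for line in step_text.splitlines():
        match = ENTITY.match(line)
        if match:
            name = match.group(1)
            counts[name] = counts.get(name, 0) + 1
    return counts
```

Even this trivial query (how many walls?) requires touching every line, which is why a database-style representation, discussed next, is attractive for large models.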
And so there is a big project going on to explore alternative ways. There might be a new release of IFC coming out from buildingSMART in the next few years that really tries to address some of those performance issues, and the obligation to work through a file from start to end. And finally, there's also consideration of addressing those objects discretely-- having IFC as a database, essentially. But watch this space in the future for buildingSMART.
What versions of IFC does Autodesk support? Well, we support all of the current published versions-- IFC 4, and IFC 2x3, which is the most commonly used. We do also still support IFC 2x2 in Revit, but really only if the receiver can't use 2x3, which is extremely rare today. So the most commonly used is 2x3, with 4 increasing in usage, particularly for buildings.
One other thing to understand about IFC is that an IFC file is not really the whole schema, because you don't export the entire IFC. What we use is a model view definition, which is a subset of the IFC schema, and that's really there to satisfy a particular data exchange. There are numerous published MVDs. Some clients specify their own MVD, saying, I want this subset of data.
I mentioned COBie earlier, which the Environment Agency are using. COBie is a subset of IFC-- a model view definition really focusing on the information needed for construction operations. And you specify the MVD at the point of exporting an IFC file from a model application, such as Revit here. In this case, we're picking the 2x3 Coordination View extract.
I'm really pleased to say that Autodesk supports IFC in many of our applications. Over 14 applications support IFC 2x3. Revit is the main application supporting IFC 4, and was recertified for the architectural reference exchange and structural reference exchange in 2020. We are completing the MEP reference exchange as well as architectural import. We also supported what was called IFC 4.1 for alignment within Civil 3D in 2015. And as it sounds, the alignment was literally a string line-- perhaps the center line of a road, a curb line, or the edge of rail-- as a basic referencing approach to coordinate infrastructure.
So, is IFC applicable to infrastructure projects? Well of course, it can be used. But if we look at the green to red scale on the right-hand side, it's probably fair to say that the greatest applicability for IFC has always been in vertical construction, so urban buildings, and not so much in infrastructure. Yes, it can be used perhaps for doing stations, as I showed in my videos earlier. But we don't really have the entities to describe things like highways or railway lines. Even pipelines are very difficult to define, because a pipeline within a building is extremely different to one that might be used for water or sewage.
So there's been a missing set of that data model for IFC for many years. And I mentioned, I'm a civil engineer, and I've been a member of buildingSMART since the late '90s, certainly active in the UK and Ireland chapter, and often lobbying for improved support for civil engineering. I'm pleased to say I wasn't the only person doing that.
And in 2013, the buildingSMART InfraRoom was established, with the mission and scope that you can see there. It looks at providing the data model elements for bridges, roads, rail, drainage, and ports that have been seen to be missing and limiting the applicability of IFC for infrastructure. The Rail Room was also set up, in 2018, because there was a huge amount of interest and investment from the rail industry to define IFC for rail. But InfraRoom and Rail Room cooperate continuously, particularly now during the development of what's called IFC 4.3, which contains the infrastructure and rail extensions.
So if we look at the Infrastructure Room, it is an organized, hierarchical group led by a steering committee, with a project steering committee looking after a number of projects. Among those projects, alignment was completed in 2015, but has since been updated to include variations, particularly coming from the rail sector, along with improving definitions for bridges, roads, ports and waterways, and railways, through the Rail Room.
And there are other projects going on around tunnels. The ports and waterways project itself is looking to extend into more maritime and environmental data requirements. I'm pleased to say I was elected to the InfraRoom Steering Committee earlier this year, along with a number of other new representatives from different organizations, from industry, and from owners. And as you can tell, it's also global. So this isn't something limited to Europe; we have representatives from different geographies, as well as, of course, people from buildingSMART's own community involved in the InfraRoom.
And this group is leading that IFC 4.3 development. And who else is working on IFC 4.3? Well, here we can see a selection of the active participants. And again, you can see this is very much a global group of organizations. And if you look closely, you'll see that particularly in rail, there is a huge representation from infrastructure owner operators. So those people I talked about from the very beginning of this presentation, people who are specifying information requirements, are taking a very active and committed interest in IFC for infrastructure. Because they need it in order to receive better data coming from their project investments.
Let's have a look at how this works. We've been working with HNTB in the US, who themselves are working on behalf of the US Federal Highway Administration, who, as we saw earlier, are beginning to take a very strong interest in specifying IFC for infrastructure and making recommendations to the departments of transport. So we've got some base data. Actually, it was some PDF drawings, which aren't really that useful. So we had to take those and turn them into new models-- we had to remodel the old 2D PDF CAD data.
We were actually able to use Google Maps to have a look at the physical asset-- in this case, the bridge in question-- because we didn't even know what it looked like. But luckily, we can use these new approaches and say, well, let's go and build that bridge. And we don't even have to visit, as long as we've got the drawings as a base reference. So using InfraWorks, which is a fantastic bridge modeling tool, we were able to recreate this bridge, remodeling it with a BIM rather than a CAD approach.
Using the InfraWorks-to-Revit direct integration, which we showed earlier, we were able to take that bridge into Revit. And then we shared that Revit model using our cloud common data environment platform, because this work was being done between Canada and Germany, with our partner, the Technical University of Munich (TUM). TUM then took that data and applied their IFC 4.3 toolkit, which they developed using the scripting approach I talked about previously, using Dynamo.
So they've effectively got an IFC 4.3 toolkit that can pull IFC 4.3 out of Revit. They then generated a representation of that bridge in IFC 4.3, which is now being shown in an open-source model viewer. So literally, we've gone from CAD-based data, to remodeling in a BIM tool like InfraWorks, to then creating an IFC 4.3 export. And as you can see, I've also used pretty much all of the techniques for interoperability I described earlier-- direct integration, scripting, cloud platforms, and finally, open source.
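To give a flavor of what such an IFC 4.3 export actually contains, here is a minimal hand-written sketch in Python of the IFC STEP (SPF) text format. This is emphatically not TUM's Dynamo toolkit-- just an illustration of the file format, with invented GUIDs and names, showing the `IFC4X3` schema identifier and a new infrastructure entity such as IfcBridge.

```python
# Minimal illustration of the IFC 4.3 STEP (SPF) text format.
# This is NOT the TUM Dynamo toolkit -- just a hand-written sketch.
# The GUIDs and names below are invented for the example.

def write_minimal_ifc43(path):
    lines = [
        "ISO-10303-21;",
        "HEADER;",
        "FILE_DESCRIPTION((''),'2;1');",
        "FILE_NAME('bridge.ifc','2021-09-01T00:00:00',(''),(''),'','','');",
        "FILE_SCHEMA(('IFC4X3'));",  # the schema identifier for IFC 4.3
        "ENDSEC;",
        "DATA;",
        "#1=IFCPROJECT('0YvctVUKr0kugbFTf53O9L',$,'Demo project',$,$,$,$,$,$);",
        # IfcBridge is one of the new infrastructure entities in IFC 4.3:
        "#2=IFCBRIDGE('1kTvXnbbzCWw8lcMd1dR4o',$,'Demo bridge',$,$,$,$,$,$,.NOTDEFINED.);",
        "ENDSEC;",
        "END-ISO-10303-21;",
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

content = write_minimal_ifc43("bridge.ifc")
print("IFC4X3" in "\n".join(content))  # → True
```

In practice you would use a library or an authoring tool's exporter rather than writing SPF by hand, but seeing the raw text makes clear that IFC is just an open, inspectable exchange format.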
And one of the things I mentioned earlier is that you need MVDs to export IFC against a particular schema. So a project is just starting now to prepare the IFC 4.3 model view definitions for infrastructure, which are critical in order to create suitable and usable exports and imports, but also to allow software to be certified. And this process is expected to complete early next year, in 2022, allowing software certification to take place maybe next summer.
Just to understand, why do we need these extra MVDs? Well, buildings in particular tend to use a reference view that is coordinate-based-- so x, y, z positioning. Civil infrastructure tends to use alignment-based referencing, very much like the placing of those lighting columns and crash barriers along the road alignment I was showing, using a station-and-offset basis. So the alignment approach is really critical to satisfy infrastructure usage, and that's why we need to create these new MVDs. There will also be industry-specific MVDs developed as part of this project. But at least these two core ones will be needed in order to certify software for import and export.
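To make the difference concrete, here is a small sketch-- my own simplification, not taken from any MVD-- that converts an alignment-based position (a station along the alignment plus a lateral offset) into the x, y coordinates a building-style, coordinate-based reference view would use. It handles only a straight alignment; real alignments also include curves and spirals.

```python
import math

def station_offset_to_xy(start, bearing_deg, station, offset):
    """Convert an alignment-based position (station + lateral offset)
    to world x, y coordinates, for a straight alignment only.

    start       -- (x, y) of the alignment's start point
    bearing_deg -- direction of the alignment, degrees from the +x axis
    station     -- distance along the alignment
    offset      -- lateral distance (positive = right of direction of travel)
    """
    b = math.radians(bearing_deg)
    # Point on the alignment at the given station...
    x = start[0] + station * math.cos(b)
    y = start[1] + station * math.sin(b)
    # ...then shifted perpendicular to the alignment by the offset.
    x += offset * math.sin(b)
    y -= offset * math.cos(b)
    return (x, y)

# Lighting columns every 50 m, offset 3 m right of a due-east alignment:
columns = [station_offset_to_xy((0.0, 0.0), 0.0, s, 3.0) for s in (0, 50, 100)]
print(columns)  # → [(0.0, -3.0), (50.0, -3.0), (100.0, -3.0)]
```

The point is that the specification ("every 50 m at 3 m offset") lives naturally in alignment space, so the exchange format needs to carry stations and offsets, not just the derived x, y, z coordinates.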
Autodesk is not only involved in the IFC 4.3 development project. We're actually doing an awful lot of work at the moment to change the way we create and import IFC. Last year, in 2020, we joined up with the Open Design Alliance and agreed to use their software development toolkit to replace our existing one within Revit. And we've already done an awful lot of work to ensure that Revit-- and then follow-on applications such as Navisworks, Inventor, and our Forge platform-- use the same methodology for creating and reading IFC.
And the ODA solution, which is already in beta, can actually take the IFC 4.3 that we showed earlier from the bridge example, and read it directly into Revit, using the new ODA toolkit. Hopefully, by the time this presentation is shown live, Revit 2022.1 will be available. And that means that the ODA toolkit will be available to everybody, and hopefully you'll be able to read the new IFC 4.3 entities, because ODA supports them within their library. Even if Revit doesn't understand such things as bridge bearing plates or other bridge-specific entities, it will at least be able to read the geometry. So that's a really good bit of news that's coming out now as I present.
We're obviously not neglecting our main civil infrastructure design tools like Civil 3D. We're also developing a solution to have Civil 3D support IFC 4.3 in the very near future. In particular, it should be available once IFC 4.3 is ratified and those MVDs have been released and published next year. So Civil 3D is definitely going to have IFC 4.3 support.
And the long-term plan is that we're going to use that ODA toolkit to enable many other products to support IFC in a consistent manner. This will potentially take a number of years. But hopefully our wide portfolio of modeling solutions and viewers will support IFC 4.3, as well as ongoing investments in IFC in the future, such as the tunnel project that I mentioned, and also the IFC enhancements I talked about earlier, which might allow it to perform differently to the current EXPRESS language approach. So you should be confident that reading and exporting IFC with Autodesk solutions will be something that's available to you moving forward.
So I can now come to the conclusion of this session. First and foremost, I think I've made it clear that, in order to ensure you can deliver against the information requirements coming from clients and owners, such as those I showed at the beginning of the session, you need to plan. Planning is critical. You need to look at the options that are available to you, from direct integration to scripted integration for creating custom approaches, and leveraging the power of open APIs and cloud platforms to link into other data sets. And of course, make sure that open standards are part of the process.
And again, as I said earlier, these aren't exclusive options. These are ones that you will probably consider together on every single project, to satisfy information needs, but also the workflows between the different actors within the process. ISO 19650, I think, has become the must-do approach to delivering consistent BIM, and we're seeing its adoption in many countries and by many clients around the world. So those are the basic things that you need to look at. Please do look at ISO 19650 as a foundational approach to doing this consistently on all your projects and within your organizations.
So let's just quickly review. What does this mean for your asset data lifecycle? Well, if you do all of what I've just described, hopefully you will have the right technologies and participants in each phase. So during data definition, you'll be using the right people plus the right tools to capture legacy asset data, to read existing information records, and to start defining those information requirements correctly. And then during data creation, you'll be using the right modeling applications, and the right data formats and tools, to ensure you are providing data that meets requirements.
Then you can link that data. I think it was pretty clear that Autodesk Construction Cloud brings in many different third-party solutions. And again, IFC is absolutely core to one of the ways we can exchange information during the asset data lifecycle. And then, in use, we can use a number of Autodesk tools and third-party applications like Esri, and start building things like digital twins, where Autodesk Tandem provides a platform to bring in multiple data sets, view them, and update them during the operational phase of assets.
So, final slide. A little bit of homework for you. If you've got interoperability concerns, take some of these actions. First, make sure you're providing correct BIM. Because if you don't start modeling and move away from a CAD-based approach, none of what I've described today will be relevant. For those of you who are new to this area, rest assured that there's a lot of information out there to help you learn our applications, such as Revit or Civil 3D, as well as others.
And there's even free courses out there. So academy.autodesk.com provides a lot of this free tuition. And if you are comfortable that you've developed a basic skill set, you can even go ahead and get yourself certified, allowing you to move from a CAD provisioning approach to a BIM-based provisioning. And get your teams all working to a consistent way using the BIM tools.
Please, please, please go out and learn about ISO 19650. Obviously, you can go to ISO itself to get the standards. If you feel uncomfortable because you don't know how to apply it, then I'm very pleased to say there is some really good guidance available. The UK BIM Alliance and British Standards have been involved in, and are part of, what's called the UK BIM Framework, along with the Centre for Digital Built Britain at Cambridge University. Go to that link. There's a huge amount of guidance available to you on how to apply ISO 19650 in projects and in operations.
I also talked about the common data environment. I'm really pleased that within our Construction Cloud solutions, Docs is available. And we've had some ISO 19650 enhancements released in the last half of the year, with more to come. So do find out how you can use Autodesk Docs to provide your common data environment across your projects.
I talked about Dynamo. It might seem a bit scary, but actually, Dynamo isn't a tool that requires programming capability-- it's a scripting approach. There's a primer available. Please go to dynamobim.org. There's lots of information and examples, plus scripts that you can use within your projects without having to develop your own. So there's material that's already published, and a great community of people working on it.
If interoperability is something that you need to know more about, then we have a corporate site, autodesk.com/interoperability. We've got white papers out there that you can download to understand a bit more about what it means for you and how Autodesk is supporting it. And we've also got a set of tools that we've created-- add-ons to applications like Revit to create good outputs. I mentioned COBie; there's a COBie toolkit. And that's also available either from autodesk.com/interoperability, or from its own unique link, interoperability.autodesk.com.
So there's lots of material, and also even plug-ins and add-ons, available to help you create the right data, and check that you're creating the right data. And of course, finally, please do follow the work of the buildingSMART InfraRoom and the rail projects as well. There's the link to that. It's really important that we get more people involved in developing these infrastructure schemas, so the infrastructure industry can use IFC on projects and in operations to fulfill its needs.
On that note, I will close the presentation, and hope that I've provided some useful insight into how you can use BIM for infrastructure and make it interoperable during the lifecycle of the project and asset operations.