Description
Key Learnings
- Learn how to implement automated workflows to extract and process data from site surveys.
- Learn how to implement parametric planes in Revit using railway tracks as a path.
- Learn how to use clash detection in Navisworks to estimate work in progress.
- Learn how to automatically manage and compare surfaces in Civil 3D to calculate cut-and-fill volumes.
Speakers
- LUCA CAPUANI: I'm a GIS analyst and technical specialist. I've been at FSTechnology for two years, where I started as a consultant on Autodesk Geospatial and Collaboration products. I graduated in Geography and specialized with a Master's Degree in GIS, so I'm interested in everything that is "GEO".
ALESSANDRO DELLE MONACHE: Good morning, everyone. It's a pleasure to be here at Autodesk University 2022, with my colleague Luca Capuani. I am Alessandro Delle Monache, BIM GIS technology specialist. And in this class, we are going to explain how our workflows can increase time savings in railway site management.
Here is the agenda for the presentation. And now, a brief introduction of our company and our team.
FSTechnology is the high-tech company of the Ferrovie dello Stato Italiane Group. It was created at the beginning of 2019, and its goal is to strengthen and support digital innovation within the group.
The BIM and GIS Competence Center is a team within FSTechnology. The main objective of our team is to research and implement new technologies to improve the processes and the workflows for the management of the entire lifecycle of infrastructure projects.
Considering the processes of the group, we mainly support linear infrastructure projects. We therefore support Italferr, the engineering company of the Ferrovie dello Stato Italiane group, during the design and construction stages. We also support Rete Ferroviaria Italiana, the Ferrovie dello Stato Italiane company that owns the entire railway network. And here, in this slide, we can see our team.
Our first Autodesk University class was presented by our head, Marcelo Faraone, and Stefano Libianchi in 2019. In 2019, [INAUDIBLE] awarded us the Special Achievement in GIS. In 2020, our group started to investigate how to improve the integration of BIM and GIS with other platforms and to implement solutions for remote site monitoring.
We focused our energy on reducing the time spent on construction site management activities. We will explain where we started and what we have achieved so far.
In July 1996, the European Commission adopted a resolution to implement the Trans-European Transport Network, named TEN-T. The intent of this multi-phase project is to provide coordinated improvements to primary roads, railways, inland waterways, airports, and traffic management systems throughout Europe. When complete, the Scandinavian-Mediterranean corridor will stretch from [? NCT ?] to Valletta. The Napoli-Bari high-speed railway project is part of this corridor and started in 2015.
In this section, we will present the main focus of our activity. The project -- virtual construction site management -- started in 2019 as a proof of concept. The main goal was to remotely monitor some phases of the construction, with the help of advanced technologies, and provide support to construction site managers in some of the most expensive and critical activities, such as construction health and safety checks, work-in-progress and quality checks of the works being undertaken, and environmental inspections during construction.
The technologies we use in this project can be divided into three major groups. First, the drone survey activities, where drones carry out scheduled recording inspections of the construction site to monitor the phases and the status of the works. After processing the data collected by the drones, the outputs are orthophotos and orthomosaics, point cloud models, and BubbleViews.
The major goal was reached with a workflow that includes the post-processing of the images and models acquired during the survey, the analysis in a BIM environment, and then the publication to the ArcGIS Enterprise portal. In addition to making the data available to the whole project team, and potentially to the whole company, we also managed to estimate construction site progress and calculate cut&fill volumes, which we will explain in detail later. On the data analytics side, the integrated BIM and GIS information acquired can be used for different purposes, such as AI for automated image detection, augmented reality, and environmental control.
And now, two more aspects of managing the acquired data. AI algorithms were used to automate the analysis and reduce the time needed for image identification. We are also training the model with the orthophotos taken during the surveys for environmental inspection of the construction site, to identify illegal landfills, dangerous leakages of chemicals, and environmental contamination.
On the augmented reality and virtual reality side, with the help of Unity -- a cross-platform game engine -- and the ArcGIS SDK for Unity, we managed to integrate GIS data and BIM models to obtain an impressive solution, which is very simple to navigate, even by non-BIM experts.
Our main goal was the possibility to share this game-like application as a simple installer and allow the users to simply launch it and navigate the virtual environment with a keyboard or gamepad. On the environmental side, we are studying the possibility to analyze the ante-operam and post-operam environmental systems to verify whether they were preserved and ensure the protection of our landscape heritage. Moreover, we can detect contamination and illegal landfills through the use of AI algorithms, image post-processing, or multispectral analysis with sensors carried by the drones.
The focus of this class is to explain in detail two of the workflows we created: one to calculate cut&fill earthwork volumes and one to estimate the physical progress of the works on site. Before doing so, it's important to understand the reasons that led us to implement them, the requirements for the data and the information exchange, and the necessary level of detail in order to get satisfactory results.
For the semi-automated calculation, we needed to isolate the volume of soil involved in the earthworks on the construction site. This volume shall be generated with little processing effort for the end user, and the processing time should be short to ensure the results are timely and useful. The results should also be within a certain accuracy and be usable as data sources and databases for further analysis.
Let's now have a look at how the workflow is structured. Before demonstrating the workflow, let's review all the file types we requested at each survey. Orthophotos, then processed into orthomosaics with a minimum resolution of 2 centimeters per pixel, in TIF and TFW formats. These are intended to provide a detailed overview of the whole construction site area, and they feed the AI algorithm for automated image detection.
Point clouds in LAS and RCP formats -- classified both by WBS and by material, for example vegetation, ground, and concrete -- with a very high accuracy, with a precision better than 10 millimeters. And BubbleViews, made with a 3D laser scanner, which allow us to visually inspect and navigate the site, as well as take measurements.
As mentioned earlier, one of the requirements is the delivery of point clouds classified by WBS, which means having a single file for each element: all piers, pier caps, and beams are therefore saved in separate files.
For the same reason, we also need an additional file classified by the different materials, so that we can distinguish concrete, steel, and timber, as well as the ground and the vegetation. For this PoC, we chose a construction site [INAUDIBLE] as construction was kicking off, and the construction works include different types of structures, such as viaducts, railway embankments, tunnels, and more.
All these files are stored on ACC, organized in a folder structure agreed with the client, which makes it easy to identify the main WBS elements of [INAUDIBLE] and so on. At a lower level of the hierarchy, the name of the survey organizes the data by date. And at an even lower level, we have all the outputs listed by format and type.
As we carried out periodic surveys on three major pieces of infrastructure -- two viaducts and a portion of railway embankment -- a well-organized folder structure allows us to easily organize and find the data relating to a specific survey and area, and to carry out comparative analyses of the same area across different surveys. Coming to the stakeholders involved in this project, we have been collaborating with Autodesk Engineering, Microsoft, Seikey, and Esri.
And now, this short video shows the main construction site areas of the works on the Cancello-Frasso Telesino railway. We regularly survey the site, roughly every two months, to monitor the construction works, including two viaducts, a new viability system, and a tunnel.
And now I hand over to my colleague, Luca, for the technical description of the workflows we developed and use.
LUCA CAPUANI: Thank you, Alessandro. Hi, everyone. I am Luca Capuani, and my role in FSTechnology is GIS expert. Now I will illustrate, in more detail, the first workflow we created, to calculate cut&fill volumes.
The first step of this workflow is about the necessary data preprocessing that we will then use in Civil3D for an automated calculation of cut&fill volumes. Using one or more point clouds in LAS format, classified by material, we extract the ground class and, if needed, merge it into a single file.
We then prepare a single shapefile showing the different construction site areas, with the attribute table adequately populated. Both input datasets must reference the correct geographic coordinate system. Then we use a Python script built in an [INAUDIBLE] environment, which executes the following commands: cropping the point clouds around the boundaries of the site areas, extraction of the name attribute of the areas, setting of the output resolution, export of a georeferenced DEM for each area, and renaming of the DEMs with the attribute value of the relevant area and a suffix indicating the survey they refer to.
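As a purely illustrative sketch of this preprocessing step, a minimal version of such a script might look like the following in Python, using open-source libraries (laspy, geopandas, rasterio) and hypothetical file and attribute names; the script actually used in the project may rely on different tools.

```python
# Illustrative sketch: crop a ground-classified point cloud to each site
# area polygon and export one GeoTIFF DEM per area (hypothetical names).
import numpy as np
import laspy                      # read LAS point clouds
import geopandas as gpd           # read the site-areas shapefile
import rasterio
from rasterio.transform import from_origin

RESOLUTION = 0.10                 # output DEM resolution in meters (10 cm)

las = laspy.read("ground_T7.las")
points = np.column_stack([las.x, las.y, las.z])

areas = gpd.read_file("site_areas.shp")   # one polygon per site area

for _, area in areas.iterrows():
    # Crop the point cloud to the bounding box of the area polygon.
    minx, miny, maxx, maxy = area.geometry.bounds
    mask = ((points[:, 0] >= minx) & (points[:, 0] <= maxx) &
            (points[:, 1] >= miny) & (points[:, 1] <= maxy))
    pts = points[mask]
    if pts.size == 0:
        continue

    # Grid the points into a DEM: mean elevation per cell.
    cols = int(np.ceil((maxx - minx) / RESOLUTION))
    rows = int(np.ceil((maxy - miny) / RESOLUTION))
    c = ((pts[:, 0] - minx) / RESOLUTION).astype(int).clip(0, cols - 1)
    r = ((maxy - pts[:, 1]) / RESOLUTION).astype(int).clip(0, rows - 1)
    count = np.zeros((rows, cols), dtype=np.int32)
    sums = np.zeros((rows, cols), dtype=np.float64)
    np.add.at(count, (r, c), 1)
    np.add.at(sums, (r, c), pts[:, 2])
    dem = np.full((rows, cols), np.nan, dtype=np.float32)
    dem[count > 0] = (sums[count > 0] / count[count > 0]).astype(np.float32)

    # Name the output with the area's NAME attribute plus a survey suffix.
    transform = from_origin(minx, maxy, RESOLUTION, RESOLUTION)
    with rasterio.open(f"{area['NAME']}_T7.tif", "w", driver="GTiff",
                       height=rows, width=cols, count=1, dtype="float32",
                       crs=areas.crs.to_wkt(), transform=transform,
                       nodata=np.nan) as dst:
        dst.write(dem, 1)
```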
We started from a Civil3D template where we preset the graphic preferences for the imported DEM surfaces to be rendered in low quality -- that is, with no triangulation -- in order to ensure better performance than the one achievable with contour lines. The template also presets one of the most widely used Italian GIS reference systems as the default projection system of the project drawings.
In simple words, we open a new Civil3D drawing starting from the template, and we only change the reference system if required. We then proceed to import the DEMs generated by the Python script previously described: we use the Create Surface from DEM tool and load our reference surfaces.
Next, we load the surfaces of the same site area surveyed at different times. Once all the relevant surfaces are listed in the Toolspace, they are renamed according to the coding required by our standard procedure. Specifically, our standard requires the surfaces to compare to be named the same, but with a suffix indicating the survey, such as T6 or T7.
At this point, we can run the Dynamo script. Let's see its structure and the way it works. Exploring the sections of the script: the first one parses the names of the DEMs; the second lists all surfaces, splits them by name, and extracts the dates of the surveys. Then the data extraction process begins and produces a structured Excel file as the output.
This is the custom Python script creating the volumetric surfaces. This box generates the Excel report, which is automatically saved in the same folder as the reference DWG. We are also prompted with a dialog box informing us of the number of surfaces created and of potential warnings. The process ends by confirming OK on that dialog. The only parameter to set in the Dynamo script is whether to open the Excel file to visualize the results at the end of the process.
OK. While the volumetric surfaces are generated, the process is shown in the progress bar. Once generated, the volumetric surfaces are listed in the Toolspace within the Surfaces category. The volumetric surfaces just created keep the same name as the compared ones and get the names of both source surfaces as a suffix. This way, it's possible to keep track of the evolution of the construction site.
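Conceptually, the cut&fill numbers behind a volumetric surface boil down to a cell-by-cell elevation difference between two grids, multiplied by the cell area. Here is a minimal numpy sketch of the idea, with hypothetical file names, assuming the two DEMs share the same extent and a 10-centimeter resolution:

```python
# Sketch: cut&fill volumes between two aligned DEM grids of the same area,
# e.g. surveys T6 and T7 exported by the preprocessing script above.
import numpy as np
import rasterio

CELL_AREA = 0.10 * 0.10                      # m^2 per cell at 10 cm

with rasterio.open("AREA1_T6.tif") as a, rasterio.open("AREA1_T7.tif") as b:
    ref, new = a.read(1), b.read(1)

valid = ~(np.isnan(ref) | np.isnan(new))     # cells present in both surveys
diff = np.where(valid, new - ref, 0.0)       # positive = fill, negative = cut

fill = diff[diff > 0].sum() * CELL_AREA
cut = -diff[diff < 0].sum() * CELL_AREA
print(f"cut: {cut:.1f} m3, fill: {fill:.1f} m3, net: {fill - cut:.1f} m3")
```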
The reports extracted are structured data that can be loaded into a business intelligence tool and visually represented on a dashboard. We are currently working to incorporate this data into the main project dashboard so that the user can easily interrogate, on a map, the variation of the earthwork volumes over time.
Once the workflow was defined, as mentioned earlier, we needed usable results and limited processing times. We had to decimate the input data by using tools that reduce the density of the point cloud while preserving the geometry as well as possible. To determine the optimal resolution of the point cloud, we tested different options and settings.
The results showed that the most straightforward solution for our purpose was a 10-centimeter resolution, which, while ensuring sufficient accuracy, kept the processing time limited, the size of the resulting Point Cloud much more manageable, and the volume calculation more efficient.
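The decimation itself was done with point cloud tools; purely to illustrate the principle, a voxel-grid downsampling (one averaged point per 10-centimeter cell) can be sketched in Python as follows, with a hypothetical input file:

```python
# Sketch: voxel-grid decimation of a point cloud to a target resolution,
# keeping one representative (mean) point per 10 cm cell.
import numpy as np
import laspy

VOXEL = 0.10                      # target resolution in meters

las = laspy.read("ground_T7.las")
pts = np.column_stack([las.x, las.y, las.z])

# Index every point by the cell it falls in (a 2D grid in plan is enough
# for a terrain surface; use 3D keys for generic clouds).
keys = np.floor(pts[:, :2] / VOXEL).astype(np.int64)
_, inverse = np.unique(keys, axis=0, return_inverse=True)
inverse = inverse.ravel()         # 1-D indices regardless of numpy version

# Average the points that share a cell.
n_cells = inverse.max() + 1
sums = np.zeros((n_cells, 3))
np.add.at(sums, inverse, pts)
counts = np.bincount(inverse, minlength=n_cells)[:, None]
decimated = sums / counts

print(f"{len(pts)} points reduced to {len(decimated)}")
```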
Here is the evidence supporting our choice. As we can see from the chart, for example, the time saving in generating the GeoTIFF between the original and the decimated surface amounts to 85%. Also, for generating the surface in Civil3D, we moved from 27 minutes with a decimation factor equal to 5 centimeters to three minutes with a decimation equal to 10 centimeters, while keeping a very limited discrepancy in terms of the overall volume, which means no significant drop in the accuracy of the cut&fill volume calculation. The benchmark used was the volume of a surface with a decimation equal to 2 centimeters, which is a very high definition for the purpose of our calculation.
Now I present the second workflow we created in order to estimate the physical progress of the works on site. We have implemented this solution by means of Revit, Dynamo, and Navisworks. The workflow requires a 3D model, which, in reality, is not always made available by the designers.
Therefore, when a project follows a traditional approach, starting from a simple Civil3D path, and a 3D Revit model is not available, we can face two different scenarios. In the first case, having the required time available, it is possible to create the necessary families, which, even if time consuming, is still feasible. In the second case, only one basic family is required to build a simplified model upon.
The first scenario is the automatic production of a true BIM model. The Dynamo script generates the parts of the viaduct in an empty Revit project, sized consistently with the plan and elevation data extracted from the Civil3D project, which must stay open in the background while the script is run. It is necessary to have the required families -- piers, pier caps, beams -- ready and loaded in Revit.
Let's take the family of the typical pier of the viaduct as an example. We modeled the piers as a family having as many types as the design indicates. We then created instance parameters to be able to control features of each instance -- for example, the plan rotation of a pier in Dynamo. In simple words, the script reads the direction of the path at a specific point and assigns that direction as an angle parameter to the element placed there, with no need to manually rotate the element to align it to the track.
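To make the rotation step concrete, here is a small illustrative computation, outside Dynamo, of how the path direction at a pier's chainage can become its rotation parameter; the path function and chainages are hypothetical:

```python
# Sketch: derive the plan rotation for each pier from the tangent of the
# alignment at its station.
import math

def tangent_angle(path, station, eps=0.5):
    """Plan angle (degrees) of the path direction at a given chainage.

    `path(s)` is assumed to return the (x, y) coordinates at chainage s;
    the tangent is approximated by a small forward step of `eps` meters.
    """
    x0, y0 = path(station)
    x1, y1 = path(station + eps)
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

# Example with a toy circular alignment of radius 500 m.
def circular_path(s, radius=500.0):
    return (radius * math.sin(s / radius), radius * (1 - math.cos(s / radius)))

for station in (0.0, 100.0, 200.0):   # pier chainages along the track
    angle = tangent_angle(circular_path, station)
    print(f"pier at {station:>5.1f} m -> rotation parameter {angle:6.2f} deg")
```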
We also integrated type parameters that we can control in the family editor. These determine the nested families to be shown or hidden within each type, and therefore the characteristics of each family type. Let's open up the nested standard family and see that it has more families nested within.
There are many parameters depending on one another: when editing one parameter, the formulas determine variations such as the depth of the foundation piles, the bottom level of the pier, and the height of the pier itself, as imported from Civil3D. And the top [INAUDIBLE] is determined by subtracting a project-specific value from the [INAUDIBLE].
OK, let's start from the Civil3D project with all the necessary data. To make the process work with Dynamo, we need the path with the different reference profiles, which provide the elevation levels.
So let's pick one of the different paths. Moving on to Revit, it is important that the project shares the same coordinates as Civil3D. In this case, the coordinates had already been acquired from the CAD file. The survey point is perfectly coordinated, too.
Let's open Dynamo and analyze the script, taking the full model as an example. At a glance, we can see the sections of the script: the first part is about the input data; the second is about the parameters the user needs to set through the graphical user interface; then there is a section for modeling the necessary elements; and lastly, one to fill in the parameters.
Precisely, the input section executes the following commands: it extracts the name and the code of the elements to place; applies the rotation value, as calculated from the information extracted from Civil3D; and extracts other information, such as the dates necessary for the 4D programming and the WBS codes. Lastly, for coordination purposes and a correct spatial placement, it shares the same coordinates as in Civil3D.
The section handling the graphical user interface is made up of a series of Python scripts prompting the user with windows and menus to select the relevant families for the elements to model. The last one is the interface to select the path.
In this section, the geometrical characteristics of the Revit elements are defined. There are three different groups of nodes, each one related to a type of functional element of the viaduct, entering the data to place and rotate the family and its nested families.
The last part relates to the manipulation of the data extracted from Excel -- therefore, the input of parameters such as WBS, activity ID, date, et cetera.
Let's run the Dynamo script. As mentioned already, the user is prompted with a series of graphic interfaces. Here, we can see the user selecting a viaduct family and pairing the codes with the respective families loaded in Revit. Then we have to do the same for the families of each type -- the second viaduct and its beams, piers, et cetera. Lastly, the user is required to select the relevant path from Civil3D.
Here, we can see the results of the Dynamo script. A true BIM model is created. Every family got placed at the right location, with the correct rotation, dimensions, and elevation, and with the WBS parameters already populated with the relevant codes, such as the WBS identifying, unambiguously, the elements of the infrastructure. Each element has an ID for both activity and date, which can be retrieved by selecting the different features. The file is then exported to be loaded in Navisworks and clash detected against the point clouds.
This video opens on a Revit view, showing the families created by the script and the progress of the construction -- the difference between time two and time seven in this example. What is yet to be built is shown in a transparent green color, while the older elements are in dark red, and the elements built more recently in lighter red.
The other option is building a simplified model. In this case, there are no complex families in Revit, only boxes with a limited depth, which we'll call planes -- or horizontal planes, for simplicity. There is only one family, with as many instances as necessary. It has parameters that allow any family instance to adapt to different project needs -- for example, different dimensions of the piers.
This family has only one family within: a smaller box that is repeated many times to determine a volume made up of progressive slices. By opening up the family, we can indeed see that it is made up of nested family boxes. It has parameters controlled at the family level, so that every parameter variation within the family is reflected in each instance.
A Dynamo script allows us to apply, to the slices, a name which combines the WBS code and the progressive elevation of the element. Therefore, the process is comparable, but the WBS values, with their spatial and dimensional information, are different, as is the use of boxes acting as progressive slices across the elevation.
These are useful to estimate the percentage of completion of the vertical elements once related to their overall design height. The zero datum is the top of the head of the rails, and the datum grows downward, giving an unambiguous name to each element. Lastly, the Dynamo script generates the planes and renames them according to the Excel file, after selecting the relevant path. The user only needs to interact manually when some unforeseeable events happen -- for example, multiple tracks or interchanges.
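To illustrate the naming convention, here is a small sketch that generates the slice names for one element; the WBS code, slice thickness, and name format are hypothetical examples consistent with the description above:

```python
# Sketch: generate progressive slices for a vertical element and name each
# one with the WBS code plus its depth below the rail datum.
SLICE = 0.50                       # slice thickness in meters

def slice_names(wbs_code, top_datum, bottom_datum, step=SLICE):
    """Zero datum is the top of the rails; depths grow downward."""
    names = []
    depth = top_datum
    while depth < bottom_datum:
        names.append(f"{wbs_code}_{depth:06.2f}")
        depth += step
    return names

# Example: a pier whose top sits 2 m below the rail datum and whose
# foundation bottom sits 14 m below it.
for name in slice_names("VI01.P03", 2.0, 14.0):
    print(name)
```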
Before exploring the process in Navisworks, let's have a look at the logic behind it. In this example, during the first survey, the drone scans the elements under construction, and we get an idea of the physical progress made. Infrastructure projects being mainly linear, with punctual piers spanning vertically and beams spanning horizontally, we thought we could use control planes -- boxes, or slices, along the length of each element -- to estimate the relative progress of each WBS element. In this example, after post-processing, the results of the first survey show that two foundations have been built 100%, and two piers 50%.
During the second survey, we can see that some progress has been made on all planes and on three piers. The results of the post-processing show the quantification of the physical progress on each of the WBS elements surveyed.
Finally, during the third survey, the drone scans the same elements again. And the results show that two vertical elements previously surveyed have now reached completion.
This is the script for the simplified model. The result is a series of blocks made up of the planes necessary for the clash detection in Navisworks, instead of having a tailored project modeled. The volumes representing the elements are therefore identified -- sliced at each level -- with a unique code combining WBS and elevation. Lastly, the model is exported to NWC for the clash detection.
This is the workflow for the railway embankment, starting from a Civil3D project. We need the path and the profile, plus a series of cross-sections.
Moving on to Revit, we need to make sure the project shares the same coordinates as Civil3D. In this scenario, the Dynamo script is quite different and more complex, because the planes for each block are not sufficient and some sort of control over the sides is required, too.
It's necessary to have blocks along the elevation as well as on the sides to check the extent of the work executed. Therefore, we need to model objects according to the embankment's typical cross-section. These objects are currently modeled in Dynamo memory through the input of some parameters. However, we are aiming to model a series of typical sections to standardize the process and allow the user to simply select the relevant typical cross-section from a database.
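As a toy example of deriving slice extents from a typical cross-section, the width of a trapezoidal embankment slice at a given depth below the crest follows from the crest width and the side slope; the dimensions here are hypothetical:

```python
# Sketch: width of an embankment slice at a given depth below the crest,
# for a typical trapezoidal cross-section (crest width + 2 * slope * depth).
def slice_width(crest_width, side_slope, depth_below_crest):
    """side_slope expressed as horizontal/vertical (e.g. 1.5 for 3:2)."""
    return crest_width + 2 * side_slope * depth_below_crest

for depth in (0.0, 1.0, 2.0, 3.0):
    print(f"{depth:.1f} m below crest -> width {slice_width(13.0, 1.5, depth):.1f} m")
```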
Here, with no further graphic interfaces, we run the Dynamo script. The result is similar to the viaduct. The difference is that there is not one single slice for each level; each level is split into more slices. This is useful to check whether the sides of the embankment are consistent with the project section.
Here, each element is identified unambiguously. As in the other cases, the model is then exported to NWC to run a clash detection.
Once one of the models is built and exported to NWC, we can open up Navisworks. Here, we will demonstrate the process on the simplified model of the viaduct as an example. Let's add, to the selection tree, the point clouds, with all the RCS files classified both by WBS and by material.
Then we add the simplified Revit model, with its horizontal progressive boxes or slices, and the vertical ones, too, if necessary. Once imported, we verify, with a visual check, that all the point clouds fall within the volumes of the planes. With an XML import, we load the preset and automated clash matrix, with tests organized by WBS and material.
The clash matrix is built from a series of preset clash tests, saved in an XML file and loaded in Navisworks to run the clash detection. The tests were set up to validate the geometry as classified by type and level. Basically, a clash test validates the selection set of each level of the model against the selection set of a WBS type.
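The matrix itself can be thought of as the cross product of WBS-type and material search sets against level selection sets. As a hypothetical sketch (the set names are invented, and the actual export is Navisworks' own clash-test XML):

```python
# Sketch: enumerate the clash-test matrix -- every WBS type and material
# against every elevation-plane set -- before exporting it for Navisworks.
from itertools import product

wbs_types = ["piers", "pier_caps", "beams", "foundations", "slabs"]
materials = ["concrete", "steel", "timber"]
plane_sets = [f"level_{i:02d}" for i in range(1, 11)]   # model slices

tests = [
    {
        "name": f"{wbs}_{material}_vs_{planes}",
        "left": f"SET:{wbs}_{material}",    # search set on the point cloud
        "right": f"SET:{planes}",           # selection set of model planes
        "type": "hard",
    }
    for wbs, material, planes in product(wbs_types, materials, plane_sets)
]
print(f"{len(tests)} clash tests defined")
```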
In this video, we are looking at the pier caps, for example. We keep creating a clash test between the levels and the selection set of every WBS type and material. We then export all the tests, so defined, as a standardized clash matrix to be able to reuse it in the future.
The selection sets are created as the results of a search set filtering on a specific WBS type -- for example, the piers, identified by their code -- across all point clouds. The same process is followed for every WBS type -- pier caps, slabs, beams, foundations -- for every elevation plane of the model, and for every material: timber, concrete, steel.
Since the naming convention and the file formats are standardized, the setup just illustrated has to be manually performed only once. When all the sets are defined, we can indeed export them as a standardized XML that can be reimported in Navisworks to easily perform the same exercise with a new survey, or even a new project.
Once the clash detection is run, if the points of a point cloud intersect a block or a slice, we have a clash. Most tests will result in a series of clashes that are not easily readable, as the points clashing against the planes are not very meaningful when reviewed in isolation: they only mean that an element intersects one or more slices.
Therefore, it is necessary to export an XML file to feed a bespoke algorithm able to organize and interpret the results and extract useful information. The script combines the clashes with the information of the Revit planes involved in each clash to determine the maximum progress of each WBS element and material.
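A sketch of how such a post-processing script might work, assuming the slice naming convention illustrated earlier and a simplified, hypothetical report structure (the real Navisworks XML schema differs in its details):

```python
# Sketch: parse a clash-report XML, group the clashes by WBS element, and
# take the highest clashed slice as the current top of the built element.
import xml.etree.ElementTree as ET

def max_progress_per_element(xml_path):
    """Return, for each WBS code, the highest built point found by clashes."""
    built_top = {}                     # WBS code -> smallest clashed depth
    for result in ET.parse(xml_path).iter("clashresult"):
        plane_name = result.get("name", "")
        try:
            # Planes are assumed to be named "<WBS>_<depth>", e.g.
            # "VI01.P03_004.50" (see the slice naming sketch above).
            wbs, depth_txt = plane_name.rsplit("_", 1)
            depth = float(depth_txt)
        except ValueError:
            continue                   # skip results we cannot interpret
        # Depths grow downward from the rail datum, so the smallest clashed
        # depth marks the highest point built so far.
        if wbs not in built_top or depth < built_top[wbs]:
            built_top[wbs] = depth
    return built_top

for wbs, top in sorted(max_progress_per_element("clash_report.xml").items()):
    print(f"{wbs}: built up to {top:.2f} m below the rail datum")
```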
One goal of this workflow is to automatically fill in the project schedules used to formally record the progress of the work on site and, consequently, pay the due invoices to the main contractor and the trades. It's easy to understand that the added value of the workflow is that it simply automates existing procedures and blends in with the existing documents, with no disruption or changes for the users.
So, to fill in the schedule, we need to elaborate the XML generated in Navisworks -- to simplify it by grouping the clashes and removing any duplicate or redundant clashes -- and then a second procedure fills the schedule in, adding the values of the top of every WBS element, as partially built and surveyed, and the percentage calculated against the project's highest elevation for each element.
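For instance, the percentage for a vertical element can be derived from its surveyed top against its design elevations, together with the kind of consistency warning described next; the names and values here are hypothetical:

```python
# Sketch: turn the surveyed top of each element into a completion
# percentage against its design elevations, and flag regressions against
# the previous survey.
def completion_pct(surveyed_top, design_top, design_bottom):
    """Depths measured downward from the rail datum, bottom > top."""
    total = design_bottom - design_top
    built = design_bottom - surveyed_top
    return max(0.0, min(100.0, 100.0 * built / total))

design = {"VI01.P03": (2.0, 14.0)}           # WBS -> (top, bottom) depths
previous = {"VI01.P03": 50.0}                # percentages from survey T6
current_tops = {"VI01.P03": 5.0}             # surveyed tops from survey T7

for wbs, top in current_tops.items():
    pct = completion_pct(top, *design[wbs])
    if pct + 1e-6 < previous.get(wbs, 0.0):  # progress should never shrink
        print(f"WARNING {wbs}: {pct:.0f}% < previous {previous[wbs]:.0f}%")
    print(f"{wbs}: {pct:.0f}% complete")
```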
Recently, we also added warnings for any unexpected results, to make sure the user double-checks the specific element and analyzes the reasons for such results -- for example, anomalies such as progress values inconsistent with the results of the previous survey, or materials surveyed in a sequence unexpected for the project.
OK, that's all. Thank you. And I'll give the floor back to my colleague, Alessandro, for the conclusions.
ALESSANDRO DELLE MONACHE: Thank you, Luca. And now to summarize all the progress we made with this workflow.
We estimated, in this case study, an overall time saving of 40% in site supervision activities and processing times. In particular, we increased the cut&fill accuracy with the use of scripts that helped us automate the entire process, so as to obtain faster and more reliable results and planning, less prone to human error. Another important goal is the possibility to save time in physical progress monitoring with remote inspections, reducing site visits and, therefore, costs and potential accidents.
As a further benefit, we increased the data digitization efficiency, thanks to the possibility to store all the files of the surveys in the same place, and thanks to the use of AI algorithms to automatically analyze and classify the content of thousands of site pictures. We are also studying new steps to integrate into this workflow in the near future.
On the AI side, we are introducing the possibility to summarize and publish data through dashboards or Power BI. On the augmented reality and virtual reality side, there is the very interesting possibility to use new technologies for immersive experiences. In the environmental context, we have the possibility to verify, with multispectral analysis, the presence of underground waste and the health state of vegetation.
Thank you, and I hope you enjoyed the presentation and our work. Thank you for watching.