Key Learnings
- Learn about the business initiatives and pain points, KPIs for projects and users, and value added to customers.
- Learn how to use Autodesk Construction Cloud Connect and Autodesk Platform Services to extract Autodesk Construction Cloud data and create data pipelines.
- Learn how to drive BI insights from Autodesk Construction Cloud modules like RFIs, Issues, Submittals, Forms, Assets, and more.
- Get recommendations on product strategies for diagnostic and predictive analytics.
Speakers
- Steve Ruff: Steve is a self-taught software developer with a background in control systems programming, audiovisual systems design, systems commissioning, BIM, and data engineering. He specializes in developing personalized software solutions and automation workflows that enhance the functionality of Autodesk products, enabling clients to achieve significant efficiency gains. Steve holds a Bachelor's Degree in Electrical Engineering from the Georgia Institute of Technology and is a LUMA-certified practitioner of human-centered design.
- Liang Gong: Liang is a structural engineer by training (PE) with a background in preconstruction/estimating, construction management, BIM/VDC, and data science. He helps customers leverage the data they produce through the design and build process to generate actionable insights, including forecasting and scalability. He also automates customized workflows with ACC Connect and Autodesk Platform Services. A graduate of Duke University, Liang is currently working on his second master's degree in Applied Data Science at the University of Chicago, focusing on AI/ML as a part-time student.
ANDREW DAVIS: Hi, everybody. Welcome. This is Andrew Davis here, with the team from Jacobs and Autodesk. And we're going to be looking at some use cases for automation and analytics today. Next slide, I guess.
This is a general safe harbor statement, I guess, that we'll put forth from Autodesk to cover off the content we're dealing with today. I think you're all familiar with it. Next slide then.
There I am. So this is my bio here. My name is Andrew Davis. I work with Jacobs as their regional technology lead in Canada, managing a team of digital delivery leads. We look to advance Jacobs' digital solutions in a variety of markets and to support our project delivery, construction, and clients in their digital journey. Maybe next slide.
HODA SAFFARI: Hi, I'm Hoda, a BIM manager and digital delivery leader at Jacobs. I've been working in the AEC industry for more than 13 years, and over the years, I've had the opportunity to lead and manage various BIM teams and projects in sectors like healthcare, institutional, and mission-critical industrial facilities. My expertise lies in implementing BIM standards and digital workflows while working with the project team and client in order to elevate projects with the latest technology. Next slide, please.
STEVE RUFF: Hi, everyone. I'm Steve Ruff. I'm a technical consultant with Autodesk based in Atlanta, Georgia. Prior to joining Autodesk, I worked as an audiovisual consultant within the AEC industry, where I gained experience working with AutoCAD, Revit, and Autodesk cloud platforms and their APIs.
Since joining Autodesk two years ago, I've created custom solutions and integrations for customers that enhance the capabilities of Autodesk desktop products and cloud platforms, enabling them to draw meaningful insight from the data stored in Autodesk cloud infrastructure. Next slide.
LIANG GONG: Hello, everyone. This is Liang Gong, senior business consultant focusing on analytics and automation at Autodesk. I'm a professional engineer licensed in civil engineering, but now I work more in the tech sector, specializing in analytics and automation for Autodesk Consulting. Meanwhile, I am a part-time candidate in the Master of Science in Applied Data Science program at the University of Chicago, focusing on AI and machine learning.
ANDREW DAVIS: Excellent. Thanks, everybody. So this is Andrew Davis back here again, and I'm just going to run through the first few slides. Looking at data and analytics in our environment, there are some general industry trends that are pushing our environment in this direction, and some of those are here on the screen.
So 95% of data goes unused on engineering and construction projects. This is from some industry reports that have been put out over the years. So there's a lot of opportunity to collect data in a different way, as structured data, and to leverage it on large-scale projects for analytics. Basically, we're trying to put systems and processes in place to capture structured data and leverage it on project delivery-- a lot of opportunity.
58% of owners are said to be looking at design-build approaches in the future, instead of the standard design-bid-build. This also opens up a lot of opportunity for data and for leveraging it through the course of a project. With design-build and EPCM projects, there's a lot more opportunity in that kind of partnership arrangement between the different design and construction groups, enabling us to share data across those platforms.
Also, 81% of owners and operators are looking to make better data-driven decisions and to leverage data for their own use. So it's a trend we're seeing with owners as well, where they're seeking not just the project reporting that we're giving them, but also the ability to analyze and interact with that data. And that's something that is also pushing us in this direction.
In general, with big data and analytics in the AEC industry, it's very much about being able to leverage that information and reporting. There have been examples in the industry of somewhere in the neighborhood of an 8% increase in revenue when companies focus on that type of effort. And that's where Jacobs is heading as well, looking to digitize our systems and move in this direction. So maybe we'll move to the next slide.
So why does the AEC industry really need a data strategy or a drive for improvement? The reality is that we're trying to leverage technologies and leverage data, but the systems aren't really in place to use them. A strategy is needed to break this pattern of traditional data collection.
The strategy is really needed to push us toward digital transformation in general, and to look at the traditional approaches in design, procurement, and construction and maximize the effect of collecting data in those environments. So ultimately, standardizing on a digital transformation approach is going to get us to that goal. We can look at the next slide here and see how that might work through.
So as we go through this slide, there's a variety of opportunities, I guess we'd say, in the different sectors to capture data. Again, these are all things that are being collected during normal operations or normal project delivery, but in an unstructured or manual way.
So we're really focusing on these different environments-- whether it's design, pre-construction, or construction itself, and even through to the owners and operators at handover-- to start transitioning from traditional approaches of collecting information to leveraging data and collecting things in a digital environment. So by doing that-- we'll move on to the next slide.
Yeah, so really trying to focus on what that looks like. One of the big problems is that there's a lot of opportunity to collect things in a more digital way and to manage that data and use it for analytics, but you need to whittle that down to a point where you can actually have some predicted or focused outcomes.
So really looking at, what do you want to do? Do you just want to manage data in general? Yes, that's true, but you really want to focus on a certain aspect, like safety, and then narrow that down further to what you're really focusing on within the data-- in this case, maybe incidents that you're searching for. And then the real goal, the outcome that you're really trying to transform with this digital approach, is that specific category around safety incidents.
So by doing this for all the different approaches of traditional data collection or information management, you can really start to whittle down to the point where you're starting to define usable outcomes or challenges that you can accomplish. So we'll move to the next slide then, I guess.
And really, with Jacobs, and working through this team as well, on this digital journey that we're on-- trying to capture the content we deal with every day on projects in a more digital environment-- we kind of cycle through these different stages. A key point is that, in the past, a lot of our teams started out with CAD and moved into BIM, and by leveraging those methodologies, we've been able to really advance our delivery.
But moving past that into this environment of working with data, you're really starting to look at new skills in your groups and your teams. The skills that are required more now are things like C# and Python programming, working with the APIs of the technologies we're dealing with to access the data, and database administration and visualization.
So by leveraging those types of resources in a team, we're able to really explore the data. I would say, too, that a big part of the journey we're on is working with discipline champions-- representatives from various disciplines of design and construction, such as document control, project controls, cost estimating, construction management, and safety.
We work with representatives in all those groups to define the key aspects of what we do traditionally with regard to information management, and how to digitize that environment so that we can actually use the data for analytics.
And so a process is written out here on the screen. It's working through and asking, what do we need to digitize? Once we get to the stage of digitizing it, we're able to collect it and keep it organized, standardizing on how we do that and how the data is collected-- in a certain way, in a certain format. That gives us our efficiencies in how to use the data.
And then the last three sections-- three, four, and five-- are about analyzing that data, and then being able to go back and forth, integrating it and predicting with it. That keeps cycling until, eventually, we get to a point where we're able to really start refining the process.
But I would say a big part of the work-- and, I think, part of the examples we're going to show today-- is really around looking at what we traditionally do manually, digitizing it, and then leveraging the systems we work with to standardize and actually work in a more data-analytical environment.
So with that, I think we have a few examples here that we'll walk through with the team, demonstrating some interactions in the platform with Autodesk Construction Cloud, and also some digitization that we did around our administration tasks and some of our construction management processes. With that, I think Liang is on the next slide. You can take it away.
LIANG GONG: Yeah, absolutely. Thanks, Andrew. So Andrew talked about the process of digitalization. On this page, we're looking at analytics only, because this presentation has two big parts: automation and analytics. For analytics alone, these are the stages of the evolution process.
As you can see here, you go from basic descriptive analytics to diagnostic, then evolve to predictive, and the final phase is prescriptive. It's like building a house: if you have a great foundation-- which is the descriptive and diagnostic analytics here-- it's going to help you build the higher structures and the building facade, which are the analogy for the predictive and prescriptive analytics.
And in this presentation, we're going to mainly cover the fundamental phases, which are the descriptive and diagnostic, but I'm also going to give a few examples of predictive and prescriptive, which are more related to AI and machine learning algorithms.
OK, in order to build those analytics phases, we need a solid data storage environment, because nowadays a lot of folks are talking about the CDE, the Common Data Environment. A very simple example of a CDE is a database-- the SQL database you're using, the Snowflake database you're using, whatever database you're using. A lot of the foundations are the same.
As you can see on the screen, on the left side, in the data storage environment, you're probably very familiar with the cost modules, operations modules, schedule modules, and design modules that you use on a daily basis on the construction site or in the engineering office. But you're probably not aware of how this data is stored in a professional storage environment.
A lot of this is data engineering work: how to turn these normalized tables for the different silos, for the different services, into consolidated tables that are ready to be consumed by the end users on the right side. The right side is targeted toward the end users, who use this data for forecasting purposes, diagnostic purposes, and other purposes.
But in order to reach the right side, you've got to have a very solid foundation on the left side, which is turning the siloed data into a database, consolidating it, and performing the data engineering so that it's ready and organized to be used.
For the Data Connector, using a very simple example from the previous page-- like right here, the different siloed data-- in our environment under Autodesk, if you're familiar with our ACC Data Connector, on the right side all of our tables are organized by services, matching the front end of the product. We call all of these normalized tables.
These tables are not ready to be consumed by the end users, but they are the professional and most efficient way to store this services data in a database-- for example, a MySQL database.
And we also attached the links here. You can open these links to understand more about these workflows: how to use our Data Connector, and how to use the Power BI templates that we provide together with the Data Connector. This is a piece-- one very small aspect-- of the common data environment, the data storage environment, that I was referring to on the previous page.
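For readers who want to experiment programmatically, here is a minimal Python sketch of queuing a one-time extract through the Data Connector API. The endpoint and body fields follow the public APS documentation, but treat the exact field names and values as assumptions to verify against the current docs.

```python
# Minimal sketch: queuing a one-time Data Connector extract via APS.
# Assumes a valid access token with data:read scope; the endpoint and body
# fields reflect the public Data Connector docs and should be verified.
import requests

APS_BASE = "https://developer.api.autodesk.com"

def request_extract(token: str, account_id: str, project_id: str) -> dict:
    """Ask the Data Connector to produce a fresh extract for one project."""
    url = f"{APS_BASE}/data-connector/v1/accounts/{account_id}/requests"
    body = {
        "description": "Weekly admin/issues/submittals extract",
        "scheduleInterval": "ONE_TIME",
        "effectiveFrom": "2024-01-01T00:00:00Z",
        "serviceGroups": ["admin", "issues", "submittals"],
        "projectId": project_id,
    }
    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()  # includes a request id you can poll for job status
```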
Now, we're going to go through three real problems that Jacobs has been solving with Autodesk. I'd like to invite Hoda to talk about the first problem, the problem statement.
HODA SAFFARI: Thank you, Liang. So in this case study at Jacobs, we have numerous projects on ACC and BIM 360, and we constantly work with countless users, internally and externally. We wanted to have a complete overview of user services, roles, and affiliated companies across all our projects.
This has been quite a challenge for us, because currently, the only way you can do that is to go to each individual project one by one and get the required data, which can be really time-consuming and can make it really hard to keep track of everything.
So we decided to create a centralized system-- a one-stop hub where all user accounts could be managed and tracked. We wanted to make our workflow more streamlined and manageable. To make this happen, we collaborated closely with the Autodesk team to overcome these challenges and develop a high-level report. You can go to the next slide. Thank you.
LIANG GONG: Yeah, thanks to Hoda for stating the problem that Jacobs was facing. We understand there are limitations in the product UI, the product user interface, on ACC. However, there are always workarounds when working with Autodesk Consulting, because we understand how the back end works.
So in order to solve Hoda's and Jacobs' problem here, what we relied on is the Data Connector, which I just showed you in the previous slides-- using those normalized tables and consolidating them. We figured out the relationships between those tables, especially the project user services, project user companies, and project user roles tables, because these hold the information we want to retrieve from the account level down to the project level.
That's why you can see here, for the account user list in this table, by clicking on each user, that specific user's project services, project roles, and project companies information pops up for that one user.
So that's how efficient it is. You don't have to go through each project and click through each user to figure out the corresponding information from the account level down to the project level, which saves a lot of time for our end users.
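As an illustration of that consolidation step, a minimal pandas sketch might look like the following. The CSV file names and column names are hypothetical stand-ins for the actual extract files.

```python
# Minimal sketch: consolidating normalized Data Connector tables into one
# account-wide user view. File and column names are illustrative; substitute
# the actual names from your own extract.
import pandas as pd

users     = pd.read_csv("admin_users.csv")                   # account user list
services  = pd.read_csv("admin_project_users_services.csv")  # user x project x service
roles     = pd.read_csv("admin_project_users_roles.csv")
companies = pd.read_csv("admin_project_users_companies.csv")

# Join each project-level table back to the account user list, so selecting
# a user in the report surfaces all their projects, services, roles, and companies.
overview = (
    users.merge(services,  on="user_id", how="left")
         .merge(roles,     on=["user_id", "project_id"], how="left")
         .merge(companies, on=["user_id", "project_id"], how="left")
)
overview.to_csv("account_user_overview.csv", index=False)
```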
In order to achieve that, as I was saying earlier, we relied on the Data Connector. And what is the Data Connector? Just reviewing it again: it extracts all that siloed services data. And we can schedule the runtime for the Data Connector, because you're always feeding new engineering information into ACC. Whenever you have new information input on the UI, after the extraction runs on the automated schedule, all the new data pops up in Power BI or in your database.
We also have an Autodesk Construction Cloud connector for Power BI, certified by Microsoft, so through that, you can bring the data from ACC directly into Power BI for analytics.
Next, I'd like to invite Hoda back to talk about the second problem. This one is more specific, and we're also going to cover more API-related aspects of it. Over to you, Hoda.
HODA SAFFARI: Thanks. So here at Jacobs, in our large-scale projects, we deal with a diverse mix of users and companies, each needing specific roles and permissions across countless folders and subfolders within ACC or BIM 360. The challenge arises in keeping track of all these user permissions across all these folders and subfolders through the whole length of the project.
Currently, within ACC, the only way to get all this information is to go through the folders one by one and get the permission data for each folder. This can be really inefficient and time-consuming, and it can make it really hard to keep track of all the new users and existing users across all these different folders.
Recognizing this challenge, we partnered with Autodesk, while collaborating with our document control team and our project managers, to come up with a central hub so we could track and manage all the user roles and permissions. We also wanted some sort of central location with a search bar to search across different folders-- search by user, search by role.
So in order to achieve that, we collaborated closely with Autodesk, and we wanted to come up with a solution that worked for our project and scope of work. We wanted to make sure that everyone had the right access to the necessary folders based on their project responsibilities, and we wanted to be able to manage everything within one central hub. Thank you. Next slide, please.
STEVE RUFF: Yeah, so to solve this problem, we created a web application that harvests all of the ACC project folder permission data and exports it in a format that can be visualized in a Power BI dashboard. This web application is written in Python using the Flask web framework, and it's hosted in Azure.
It uses the credentials of an associated APS (Autodesk Platform Services) app, which has been added to the customer's ACC hub, for authentication and for access to project data.
Once the application is authenticated, we can call the APS permissions API to collect all the folder permissions data. This API endpoint is compatible with both BIM 360 and ACC projects, so we can use it on previous-generation BIM 360 projects as well as the latest ACC projects. Next slide.
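As a reference for the authentication step Steve describes, here is a minimal sketch of a two-legged OAuth token request against the APS authentication service, assuming the app's client ID and secret are available to the web application.

```python
# Minimal sketch: exchanging APS app credentials for a 2-legged access token.
import base64
import requests

def get_token(client_id: str, client_secret: str) -> str:
    """Request a 2-legged APS token for an app added to the ACC hub."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    resp = requests.post(
        "https://developer.api.autodesk.com/authentication/v2/token",
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/x-www-form-urlencoded"},
        data={"grant_type": "client_credentials",
              "scope": "data:read account:read"},  # scopes vary by use case
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```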
So here you can see a lot more detail about the steps the automation performs. First, when configuring the web application, the user specifies whether the data should be exported to a SQL database or to a CSV, which will then be downloaded by the user's browser.
Once the automation authenticates in the background using the associated app credentials, the user navigates to the web address of the application and selects a project on which to run the automation. When the user clicks run, the application recursively collects user permissions for all the folders in the project.
And there's a difference I want to point out between how the user permissions are named on the ACC website and how they're returned from the permissions API endpoint. To make the terminology understandable for the users of the dashboard, we translate the combinations of permission actions returned from the API endpoint into the equivalent user-facing permission level names used on the website.
Finally, the data is output either in CSV format, downloaded by the user's web browser, or to the configured SQL server, and the process is complete. As a proof of concept, the solution operates on a single user-selected project. However, it could easily be modified to run on a schedule, with no user interaction, to harvest all of the data from all the projects in the account. Next slide.
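A minimal sketch of the recursive harvest and the permission-name translation might look like the code below. It assumes the BIM 360 Document Management permissions endpoint and the Data Management folder-contents endpoint; the action names, level mapping, and project-id prefixes are illustrative and should be checked against the APS docs (pagination is also omitted for brevity).

```python
# Minimal sketch: depth-first harvest of folder permissions for one project.
import requests

APS_BASE = "https://developer.api.autodesk.com"

# Illustrative mapping from combinations of API permission actions to the
# user-facing level names shown on the ACC website; extend as needed.
LEVELS = {
    frozenset({"VIEW", "COLLABORATE"}): "View",
    frozenset({"VIEW", "DOWNLOAD", "COLLABORATE"}): "View + Download",
    # ... remaining combinations map to Upload, Edit, and Manage
}

def harvest(token: str, project_id: str, folder_id: str, rows: list) -> None:
    """Record this folder's permissions, then recurse into its subfolders."""
    headers = {"Authorization": f"Bearer {token}"}
    perms = requests.get(
        f"{APS_BASE}/bim360/docs/v1/projects/{project_id}"
        f"/folders/{folder_id}/permissions", headers=headers).json()
    for p in perms:
        actions = frozenset(p.get("actions", []))
        rows.append({"folder": folder_id,
                     "subject": p.get("name"),
                     "level": LEVELS.get(actions, "/".join(sorted(actions)))})
    contents = requests.get(
        f"{APS_BASE}/data/v1/projects/b.{project_id}"
        f"/folders/{folder_id}/contents", headers=headers).json()
    for item in contents.get("data", []):
        if item["type"] == "folders":
            harvest(token, project_id, item["id"], rows)
```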
And I just want to touch on an alternative approach to solving this problem that we initially attempted. Initially, we tried using ACC Connect, which is built on Workato. ACC Connect is a low-code platform-as-a-service environment that allows users to create automations using pre-configured apps that implement cloud platform APIs.
It's a great platform, and it's a good fit for a lot of problems. The limitation we initially ran into, which caused us to abandon this approach, was that recursion through an unknown depth of folders was not obviously possible at the time using Workato.
Since implementing the solution in Python, we have found a workaround for this recursion problem in Workato, and we now believe the solution could have been successfully implemented in the ACC Connect environment. If you're not a programmer and you're interested in what types of automations you can create using the ACC Connect platform, please reach out to me or the ACC Connect product team. Shout out to them-- they're all awesome. Next slide.
LIANG GONG: Thanks, Steve. So Steve described a great ETL-- Extract, Transform, Load-- process to harvest the data from the Autodesk cloud through the APS APIs. After all that data gets into a SQL database, the next mission is to visualize the data in a way that is easy for the end user to digest.
As you can see here, we tried very hard to mimic the user interface you're probably familiar with. This visual looks similar to the folder structure in the ACC Docs Files section. If you click through these folders-- if you click on one of them-- it will tell you whether there are more subfolders, which users can access that specific folder, and what their user permissions are.
That's one way you can filter, but you can also filter directly by folder name. For example, if I'm interested in a folder whose name contains DWG, I put that keyword here to find the folder. Or I might be interested in one specific user, to see which folders that user has access to.
For example, if I put my name here-- GONG-- my name pops up. If I click on my name, all the folders I have access to show up here, along with the corresponding user permissions, as you can see. It's exactly the same as what you see on the ACC Files user interface-- permission levels like View, Download, Upload, Edit, and Manage.
So this one single page dramatically decreases the time Hoda and her team have to spend on the user interface, clicking through each folder to figure out who the users are-- where you don't know whether a user also has permissions on other folders until you click through those folders, too.
And in a smarter way, we also built this folder tree structure. It's basically another version of the folder hierarchy; it's just more straightforward as a tree. You can keep clicking through, and it expands into the subfolders. Once you click on a folder, all of its users' permissions show up.
And this is a tabular view, which gives you all the information-- the folder information, the folder path, and the associated users and their permissions under that specific folder-- in a basic table format. I'd like to invite Hoda back to talk about the third problem we've been working on with Jacobs, which is a very interesting one.
HODA SAFFARI: OK, so here at Jacobs, within our large-scale projects, we have numerous submittals, and our team had to manage all of them weekly. For one of our projects, we worked with ACC Submittals, and our team needed to generate a detailed weekly report to keep track of all the submittals.
But each of those submittals had to include a specific Bluebeam link, and we wanted all those links as hyperlinks within the exported file. So we encountered a challenge in integrating these links into the individual submittals and ensuring a seamless export into our weekly report.
To tackle this, we explored many different options: how we could achieve this within ACC, what options we needed to use in Submittals, how we could produce reports with the Bluebeam links assigned to all these submittals, and how we could automate the process.
So we partnered with Autodesk and our data analytics experts, and through this collaborative effort, we overcame the hurdle and successfully created an automated report. Next slide, please.
LIANG GONG: Yeah, thanks for stating the problem, Hoda. For the solution we identified here-- again, admittedly, there are limitations in the Submittals interface. There's no way to create a custom attribute for submittals, unlike Issues. If you recall, you can create custom attributes under Issues, but there's no way to do that under Submittals.
That's the unfortunate part, but the fortunate part is that you can add references very easily under a submittal. So here, we add an issue as a reference, and we embed the Bluebeam session link in the issue's description. That's how we figured out a way to input the Bluebeam session link on the UI.
Another fortunate thing is that we can capture this information through the Data Connector, and we can also harvest the relationship between the issue and the specific submittal item on the back end through the Data Connector. So this is the workflow for how we solved the problem.
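For illustration, a minimal sketch of that back-end join might look like the following, assuming the extract exposes an issues table, a submittal-items table, and a relationships table linking the two. The table names, column names, and URL pattern here are hypothetical.

```python
# Minimal sketch: attach the Bluebeam session link embedded in each issue's
# description to its related submittal item. Names are illustrative.
import re
import pandas as pd

issues     = pd.read_csv("issues_issues.csv")
submittals = pd.read_csv("submittals_items.csv")
relations  = pd.read_csv("relationships.csv")  # submittal id <-> issue id

# Pull the embedded session URL out of each issue description.
URL = re.compile(r"(https://studio\.bluebeam\.com/\S+)")
issues["bluebeam_link"] = issues["description"].str.extract(URL, expand=False)

# Walk submittal -> relationship -> issue to attach the link to each item.
report = (
    submittals.merge(relations, left_on="id", right_on="entity_id")
              .merge(issues[["id", "bluebeam_link"]],
                     left_on="related_entity_id", right_on="id",
                     suffixes=("", "_issue"))
)
report.to_csv("weekly_submittal_report.csv", index=False)
```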
And this is the dashboard. After harvesting the data through the Data Connector, we link the issues and submittal items together and grab the Bluebeam session links. As you can see from this screenshot, you can link back to the ACC submittal item, hyperlink to the Bluebeam session, and see all the related information for a specific submittal item listed here: the spec name, the priority, the submittal due date, the expected-on-job-site date for the material, and so on.
It's only because of space limits here that everything isn't listed-- you could literally harvest every single data point from the UI onto this dashboard. That's the powerful part of this automated dashboarding workflow, leveraging ACC Submittals and ACC Issues on the product UI.
And not only for one project: after we tested this workflow on one project, we were able to expand it to 65 other projects under Jacobs' hub. You test it out on one project, it works well, and then you expand it to all the other projects.
And the time needed to implement this workflow on the other 65 projects was cut down to a large extent. So it's very efficient and very scalable-- that's the point. Once we implement it on one project, it scales to a lot of other projects, which brings a lot of value.
Now that we've walked through these three problems, we're going to make a few additional points before closing out the presentation. First, statistical analysis. The reason I'm pointing out statistics is that, it's true, there's a trend: we're talking about AI, machine learning, and the common data environment all the time.
But come back to the foundation. Again, it's math, right? And within math, stats is the most important thing. Using the analogy here, it's like playing with building blocks.
In the beginning, the blocks are scattered everywhere. Then we sort them, we arrange them, we present them visually, and then we build a beautiful house with them-- with a data story. We're able to contribute to each phase of this evolution, and I'm going to give a few examples on the next page.
One of the phases is getting the data sorted and arranged. That's why we need powerful data storage. If you come back to the earlier page, you'll remember we had a page talking about the common data environment and how the data comes together in a database.
So right now, one lesson learned: for the previous problems, we relied on the Data Connector, which is the route from the ACC cloud to Power BI. But in the next phase, working with the Jacobs stack, we're going to build a semantic layer between ACC and Power BI-- a data warehouse, for example a SQL database or Snowflake-- in order to improve the efficiency of the whole workflow.
Because Power BI is not really a data storage tool, if we organize and consolidate the data-- do all the data engineering-- in Power BI, it significantly slows down Power BI's performance. That's why we're adding the semantic layer here. The benefits are listed here; you can read them.
After we put all the data into the SQL database-- we're using Snowflake as an example-- instead of performing these data engineering steps in Power BI, we do them in Snowflake.
To give a very quick example: the Apply step here in Power BI could take hours when your data is huge, as you wait for each step to finish before proceeding to the next step of the data consolidation procedure. In Snowflake, it takes around 30 seconds. So it's really 30 seconds versus hours. You can imagine how much time is saved during development and later operations by using this workflow I'm hovering over here.
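To make that concrete, here is a minimal sketch of pushing one consolidation step into Snowflake with the snowflake-connector-python package, so Power BI only reads a finished table. The connection parameters and table names are illustrative.

```python
# Minimal sketch: run the consolidation in Snowflake instead of Power Query.
import snowflake.connector

SQL = """
CREATE OR REPLACE TABLE analytics.account_user_overview AS
SELECT u.user_id, u.name, pu.project_id, pu.role_name, pu.company_name
FROM   raw.admin_users            u
LEFT JOIN raw.admin_project_users pu ON pu.user_id = u.user_id;
"""

conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                   password="***", warehouse="ETL_WH",
                                   database="ACC_DATA")
try:
    conn.cursor().execute(SQL)  # seconds in the warehouse vs. hours in Power Query
finally:
    conn.close()
```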
OK, the final page is going to talk more about predictive analytics. After building these solid foundations with the added database, we can think more about the real data science work.
For example, we can predict issue priority levels. There can be a lot of issues under ACC Issues, and much of the time the superintendents on site just pick whichever issue feels most urgent to them, in a subjective way.
But this classifier machine learning model tries to predict, in an objective way, which issues are really high priority-- so that you solve those issues first-- by leveraging these eight parameters here. These are the predictors, and this is the factor we want to predict.
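As a hedged sketch of that idea, a simple scikit-learn classifier trained on historical, already-triaged issues might look like this; the eight feature columns are hypothetical stand-ins for the parameters on the slide.

```python
# Minimal sketch: predict issue priority from historical issue data.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

issues = pd.read_csv("historical_issues.csv")  # illustrative input
FEATURES = ["issue_type", "location", "assignee_trade", "days_open",
            "linked_docs", "photo_count", "comment_count", "phase"]

X = pd.get_dummies(issues[FEATURES])  # one-hot encode categorical predictors
y = issues["priority"]                # label assigned by human reviewers

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```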
This was a real use case with a different customer, predicting issue priority levels. We can also predict a model's discipline without opening the model-- or when the model's name doesn't include the discipline-- by using the model's parameters and running a clustering algorithm in Python.
And we can also leverage NLP and LLMs, which stand for Natural Language Processing and Large Language Models, to interpret text-- especially the many descriptions on our issues-- to predict an issue's risk level based purely on the text information.
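A minimal sketch of the text-based approach, using TF-IDF features and a linear classifier rather than a full LLM, could look like this; the labeled input file is an assumption.

```python
# Minimal sketch: predict issue risk level from description text alone.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

data = pd.read_csv("labeled_issue_descriptions.csv")  # columns: text, risk

clf = make_pipeline(TfidfVectorizer(min_df=2, ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(data["text"], data["risk"])

print(clf.predict(["water intrusion observed near electrical room"]))
```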
You also put a lot of photos into our new ACC Photos module. There are already some embedded AI functionalities, but our consultants can help you add more. For example, we could identify whether a worker is wearing boots on the construction site, given more definitions and constraints in the modeling, especially using computer vision algorithms with an ANN.
So it's going to automatically tell you, under each picture, whether the worker is wearing boots on the construction site, without you having to determine that manually by looking at it. It gives you a report automatically.
Trend analysis has a lot of applications. For example, we can predict token consumption. If you're an EBA customer with us, you buy tokens every three years. If you want to predict the next 12 months of token usage-- like here, where the orange line is the predicted next 12 months of usage-- you can come to us, and we can do this work for you.
This is related to double exponential smoothing, triple exponential smoothing, and the ARIMA and SARIMA statistical models, which we can run in either Python or R. Or say you want to predict labor hours based on the past five years: if you use Forms, there is an area in ACC Forms where you can input labor hours. We can use that historical data to predict worker hours, so you can plan resources ahead of time.
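For reference, a minimal sketch of that 12-month forecast with triple exponential smoothing (Holt-Winters) in statsmodels might look like this, assuming a CSV of monthly token consumption as input.

```python
# Minimal sketch: forecast the next 12 months of token usage.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

usage = pd.read_csv("monthly_token_usage.csv",
                    index_col="month", parse_dates=True)["tokens"]

model = ExponentialSmoothing(usage, trend="add", seasonal="add",
                             seasonal_periods=12).fit()
forecast = model.forecast(12)  # the "orange line": next 12 months
print(forecast.round())
```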
There's a lot-- I could talk on and on for days about predictive analytics in the industry. But if you have a specific case, please bring it to us, and we'll help you analyze it. With that said, I'm going to hand the presentation back to Andrew, and he's going to go over some closing points.
ANDREW DAVIS: Yeah, thank you, Liang. That was excellent-- a really good look into the future of the capabilities. To summarize how things have evolved here with Jacobs and in our environment: you saw a couple of examples today around our administrative capabilities and customizations, and one around submittals, making sure our construction team has the right data in the right format.
All of this works through APIs and connectors to our connected data environment-- in this case, Autodesk Construction Cloud. As a summary of getting there, it's what we were talking about at the beginning of the presentation: we really do need to focus on managing traditional information in a connected data environment solution, whatever that's going to be for you.
Working with different disciplines to transition to digital processes is a key driver, along with defining the key data in those processes to ensure it's captured and actually trackable-- these are the items that can be leveraged for data analytics-- and supporting users to work with the new technologies and actually capture data digitally. There needs to be an effort to make sure all of this is happening inside the environment, and without that, we don't really have much to start from.
I think today, from the examples we've seen, you can get an idea of what's possible, and a few cases that have actually come to pass with our team, helping us through some challenges. Moving forward, we do need to leverage new skills within digital delivery teams. We need to look past the user interfaces and start working with solutions like the APIs and data connection points that are available.
Seek support from vendors and colleagues or other resources you have in your company-- in our case, specifically Autodesk Consulting's expertise-- to really walk through the capabilities of the tools we're working with and understand what can be done. Then we can apply those to the solutions we're looking for.
And then integrate solutions that help users with daily tasks. That's a big driver: getting people to adopt these capabilities and generate more data in the systems is where we'll really start to expand and leverage data analytics capabilities in the future.
I did want to note here, too-- we're seeing a lot of trends, as we saw at the very beginning of the presentation, around owners' interest in this type of data, analytics, and reporting. So don't just scope digital transformation within your own environment, design and construction perhaps; also seek to share it with clients. They're looking for solutions like this, too.
So look to identify traditional delivery methods in the owners' requirements-- in the specs they give us for delivery-- and then demonstrate digital methodologies that might improve them, or change the workflows from traditional unstructured data to more structured, organized data, as Liang was showing.
So with that, I guess we'll leave it open for some questions with the team, and we can move on to the last slide there. Thanks again-- I really do want to mention that I appreciate the help from Autodesk on this journey we're working through. Solving some of these small case study problems was a big task, but I think you can see, and the team here at Jacobs has seen, the ability to leverage these solutions in much bigger ways in the future. So thank you.