Description
Key Learnings
- Discover the ROI of operational digital twins for the water industry.
- Learn about collaborating with operational analytics and AI for problem-solving.
- Learn about your journey to adoption of water digital twins.
- Learn about integrating your multiple data models for use across your organization.
Speakers
- Nathan Gerdts is a product manager in the Water Infrastructure team focusing on Water Distribution products. Nathan has led implementation projects and advised on sales with Innovyze for 9 years, spanning model building to real-time operational modeling.
PATRICK BONK: Hello, and welcome everybody. Our industry talk today is around digital twins, and our platform with Info360-- and how the models are used for the past, present, and future state of networks and plants. I'm Patrick Bonk, the Product Marketing Manager globally for our operational analytics and the AI solutions. Eland, did you want to say hello?
ELAND AFUANG: Hello, everyone. Eland here. I'm the Solutions Engineer Lead for Asia-Pacific.
SAMI CHAUDHRY: Hi, everyone. I'm Sami, Solutions Architect for Imagine AI, APAC team and Innovyze.
JAVIER CANTU: And I'm Javier. I'm the Global Solution Architect for artificial intelligence.
PATRICK BONK: Thanks, guys. So as we're getting started, the presentation during this event may contain forward-looking statements about our outlook, future results and related assumptions, total addressable markets, acquisitions, products and product capabilities, and strategies. These statements reflect our best judgment based on currently known factors. And the forward-looking statements made in this presentation are being made at the time and date of their live presentation.
So during our session today, our objectives as a team for this industry talk-- we're going to journey through the Info360 platform solutions for ops analytics and AI. We'll set some context upfront on our portfolio and the significance of the Innovyze and Autodesk combination.
We're going to look at legacy problems and evolving modern problems within the water industry, align industry standards with the definitions around digital twins, and look at model combinations to solve problems and achieve actual project outcomes. We'll journey through these model use cases and how it's done in our software. And we're also going to look at a new product here that we're launching at Autodesk University.
So just in short, for more than three decades, Innovyze has been a leader in software and analytics for the water industry. And we've been helping over 6,000 customers in 60 countries to plan, design, analyze, simulate, operate, and maintain water infrastructure.
In March of 2021, we were acquired by Autodesk for $1 billion, representing Autodesk's incredible commitment to building a more sustainable world through end-to-end water solutions. So that leads us to the Innovyze plus Autodesk portfolio and the significance here.
So with Innovyze solutions, Autodesk now covers the entire water cycle, from rainfall to ocean-- ensuring delivery of clean drinking water, safe sewage, protection from floods, asset management and risk mitigation, and optimal water infrastructure performance. So all of these stages involve the general project phases listed below, from planning to real-time asset management and maintenance of your built infrastructure. So Innovyze and Autodesk now touch all of these areas.
So let's get into it with digital twins and digital transformation. Digital twin models are reaching a point where they can be deployed even in operational use cases-- where accuracy, timing, and speed matter. So for this session, we'd like to invite you to identify a problem you're currently focusing on and attempting to solve in your day-to-day.
So from the beginning to the end of time, utilities will always be addressing the degradation, capacity, and operational effectiveness of their assets. Yet the state of the assets themselves has reached a critical point. In America, where pipes laid more than 100 years ago are still in use, a water main breaks every 2 minutes, and leaky pipes lose 4.6 billion gallons of drinking water every day. That adds up to 1.7 trillion gallons per year, or $7.6 billion of treated water lost annually.
So to add to these legacy challenges, data collection in the water industry is increasing exponentially. So not only does the industry have to address these legacy challenges, they also have to look at the emerging and evolving challenges with the immense amount of data being collected. And this is, as I mentioned, only increasing exponentially.
So utilities rely on a wide range of data sources to make important business decisions. And depending on the stats you read from various industry sources, only around 5% to 20% of the data being collected is actually used for decision-making. Further research shows that about 20% of an engineer's time can be wasted looking for data or duplicating effort.
The data being collected is our opportunity here really to contribute. So from a digital twin perspective, as a company, we love data-- AI, machine learning, or models. We love data. So this is a good thing. So raw data, software systems, and workflows must be combined to maximize the effectiveness of their on-site operations and management teams. What we're building out here is a platform with workflows that give you your attention and your focus back.
And if you think about it, there are only so many trained mathematicians and data scientists in the industry. With our platform, any role and skill set can now benefit from plotting a data stream and cross-referencing a location of interest-- in the context of a role and workspace individually relevant to them, their greater team, and their organization.
So just quickly, a question our team would like you to contemplate during this session is-- how far are you from deploying a digital twin in the operations space? So we'd like for you to keep that in mind through the session.
So to introduce Info360-- a lot of you may be aware of some of the applications associated with Info360, but we wanted to cover the platform with you today. It's a cloud solution that offers a pathway for utilities on their digital twin and transformation initiatives, giving teams the workflows they need to operate, maintain, and optimize their assets across the water cycle.
So customers can gain the benefit of collaborating across their organization with these bidirectional workflows to support capabilities around things like compliance reporting, performance management and monitoring, workforce onboarding and training, and data standardization.
So because this is an industry talk, we wanted to speak to you with industry definitions. And so here's SWAN's definition of a digital twin. If you're unfamiliar with SWAN, they're the Smart Water Networks Forum, a prominent global non-profit organization working to accelerate the adoption of digital twins within the water industry.
So a couple of things to highlight in this definition: the purpose of a digital twin is to enable insights and to drive actionable and improved outcomes. Our run-through today looks at how digital twin technology has been adopted to meet the day-to-day needs of end users and their workflows.
And then furthermore, SWAN's digital twin architecture is an industry standard and a reference point for aligning the key components of a digital twin. So referring to the architecture here at the data analytics layer, you will see the model types diagrammed alongside the other key elements of the architecture-- from data integration through to analytics, where we have data-driven plus physics-based models, through to visualization and user experience, and then how the user interacts with the software.
So whether the team you're on is engineering services, operations, response crews, management, or executive-level decision-makers, they all have their unique way of interacting with the digital twin platform.
So the best way to align the core concepts of a digital twin is to contemplate the time horizon in which you're solving your problem and relate it to the actual problem objective you're looking to solve. As outlined in this figure, digital twins have matured to a point where predictive alerts actively support responses to planned and emerging incidents.
And referring back to that digital twin architecture, the concepts discussed here were inspired by much of the talk in the industry around the trade-off between data-driven and physics-based models. A data-driven model can be anything from applying a simple mathematical function-- say, a variance check to detect a threshold exceedance on a single sensor-- up through predictive AI with recommendations on how operators run a given asset.
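To make that concrete, here is a minimal sketch of the simplest kind of data-driven model just mentioned: flagging a single sensor's readings when they stray too far from a rolling baseline. The window size, band width, and sample data are all illustrative assumptions, not the actual analytics inside Info360.

```python
from statistics import mean, stdev

def exceedance_alerts(readings, window=24, z=3.0):
    """Flag readings that exceed a rolling mean +/- z * stdev band.

    A minimal data-driven check: no physics, just the recent history
    of a single sensor stream (e.g. hourly pressure or flow values).
    """
    alerts = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) > z * sigma:
            alerts.append((i, readings[i]))  # (index, anomalous value)
    return alerts

# Usage: two noisy weeks of a flat signal, then one sudden spike
stream = [50.0, 50.4, 49.6, 50.2, 49.8] * 8 + [80.0]
print(exceedance_alerts(stream))  # → [(40, 80.0)]
```

The same idea scales from this toy threshold up to learned models; the point is that it needs only the sensor's own history.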
So the question here is, if AI is so advanced or has so much potential, then why would you use a hydraulic model that simulates based on the physics of the system? Essentially, sensors aren't available in all parts of your network, and with a live physics-based model you can test operational strategies before you implement them in the network. The platform now enables these model types to share the workflows where they're used, so that they supplement each other and mitigate those trade-offs.
So I'm going to hand it over to Eland now, who's going to discuss the Stantec and Water Corporation use cases. Water Corporation is out of Western Australia-- they're the main utility there-- and we're going to discuss how they've used the digital twin platforms.
ELAND AFUANG: Thanks, Patrick. In this use case, we have Wellington Water digitizing preventive asset maintenance with their consultant, Stantec. As part of its digital pump monitoring study for preventive asset maintenance, Stantec utilized an operational analytics application for continuous monitoring and analytics of pump performance over time-- metrics such as head difference versus flow rate, pump efficiency versus flow rate, and shaft power versus flow rate. Info360 Insight is the data modeling application, with workspaces open to end users' expertise and unique data needs, to inform operational and network performance decision-making.
So what are the outcomes? Stantec estimates that savings of up to 8% on energy cost may be achieved by responding to the significant drops in pump efficiency via targeted pump refurbishment and another 5% just from improving pump sequencing within its operations. The figure on the slide is a scatterplot of pump efficiency versus flow rate as calculated within the data analytics software used by the consultant.
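The talk doesn't show the formula behind that pump efficiency chart, but the standard calculation from flow, head, and shaft power signals is straightforward; the figures in the usage line below are invented for illustration:

```python
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def pump_efficiency(flow_m3s, head_m, shaft_power_w):
    """Overall pump efficiency: hydraulic power out / shaft power in."""
    hydraulic_power_w = RHO * G * flow_m3s * head_m
    return hydraulic_power_w / shaft_power_w

# Usage: 0.05 m^3/s lifted 30 m of head with 20 kW at the shaft
eta = pump_efficiency(0.05, 30.0, 20_000.0)
print(f"{eta:.1%}")  # → 73.6%
```

Plotting this value against flow rate, point by point from SCADA history, is what produces a scatterplot like the one on the slide.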
So in terms of digital project delivery going forward, Wellington Water will be looking into scaling the analysis to over 300 pumps. They want to continue to utilize Stantec's expertise for the next phase of this work. This includes typical actions listed according to Stantec's approach-- for example, pump overhaul timing based on actual condition, energy consumption, and the cost of refurbishment or replacement.
Another is to preferentially operate the most efficient pump combinations and speeds. Another is to identify pump problems-- for example, suction, discharge, recirculation, blockages, leakage, and others-- so that pump replacement selects the most efficient pump to suit the measured pump station system curve, as a like-for-like replacement may not be the best option. Also considered are frequent on-off switching or speed control to help clear blockages from the impeller, impeller trimming, and reducing frictional losses in the pipework. And lastly, the ability to optimize the time of day for pumping. So I'm just going to quickly show the workspace of the software user interface. Next slide, please, Patrick. Thank you.
So the main intent of the Info360 Insight analytics application is to provide purpose-built KPIs and custom analytics to minimize energy cost, water loss, and leakage. It also provides a unified incident dashboard for easy handover between operations, and an at-a-glance view for all stakeholders. And lastly, the ability to report and communicate pertinent and justifiable decisions from the application going forward.
So now I'm going to transition to an operational model with the next use case, which talks about how you can perform what-if scenario hydraulic analysis at the asset level.
In this use case, we are presenting on behalf of Water Corporation. Water Corporation is the second largest water utility in Australia, servicing about 1.3 million homes and businesses, about 2.3 million customers.
The utility's main objective for its integrated water supply scheme is to reduce leakage through proactive approaches to managing pressure and flow across networks-- reducing the frequency of leaks and bursts, reducing water loss and customer interruptions, and potentially extending asset life.
The project is part of their smart water technology use case, with funding provided by the Water Research Foundation. The project contains a digital twin combining the data-driven model and the physics-based model, as articulated by Patrick based on that SWAN digital twin architecture.
So what we did is deploy a platform that includes a performance dashboard and a live hydraulic model covering four large areas-- DMAs or PMAs. I'll cover the two components. The first is the data-driven model with the performance dashboard, intended for driving asset management decisions. The dashboard effectively queries field and SCADA data to determine key performance indicators, such as minimum night flow and the infrastructure leakage index.
The calculations provided deep insights on seasonality and demand changes during the day and across different seasons of the year. The dashboard also informs which areas are performing better or worse than others, which allows Water Corporation to prioritize and target their active leakage control activities to reduce non-revenue water.
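The talk doesn't define those two KPIs, so as a reference point: minimum night flow is simply the lowest flow in the small hours, and the infrastructure leakage index (ILI) compares current annual real losses to the IWA's "unavoidable" losses estimate. The formula below is the commonly published IWA approximation, and all the numbers are invented, not Water Corporation's actual configuration:

```python
def minimum_night_flow(hourly_flow, night_hours=(2, 3, 4)):
    """Minimum night flow: the lowest flow during the night window
    (typically ~2-4 am), when legitimate use is at its lowest."""
    return min(hourly_flow[h] for h in night_hours)

def infrastructure_leakage_index(carl_l_day, mains_km, connections,
                                 private_pipe_km, avg_pressure_m):
    """ILI = CARL / UARL, using the IWA unavoidable annual real
    losses approximation (result of the formula is in litres/day)."""
    uarl = (18 * mains_km + 0.8 * connections
            + 25 * private_pipe_km) * avg_pressure_m
    return carl_l_day / uarl

# Usage: one day of hourly flows (L/s), and a DMA with 40 km of mains,
# 2,000 connections, 1 km of private pipe, and 50 m average pressure
flows = [30.0] * 24
flows[2], flows[3], flows[4] = 9.0, 8.0, 10.0
print(minimum_night_flow(flows))  # → 8.0
print(round(infrastructure_leakage_index(350_000, 40, 2000, 1, 50), 2))  # → 2.99
```

An ILI near 1 means losses are close to the technical minimum; higher values flag areas worth targeting for active leakage control.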
And finally for this component, an analytic approach to burst detection was also field-tested-- by simulating a pipe burst, opening a hydrant while targeting a flow within 10% of the minimum night flow. Results indicated that the burst detection approach was relatively accurate, which will potentially help detect more leaks in the network.
The other component is the live hydraulic model-- the physics-based model, and the second half of the digital twin. It is intended to model what-if scenarios, such as testing further pressure reduction within the PMAs in the short term, and contingency planning in the long term.
The live network model has built-in demand forecasting using machine learning. It forecasts demand for the next 24 to 48 hours-- very relevant for operations-- fully integrated with reliable hydraulics and SCADA data for predictive modeling.
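The talk doesn't describe the forecasting models themselves, so here is only the naive seasonal baseline that such models are typically benchmarked against: predict each future hour as the average of the same hour of day over recent days. Window lengths and data are illustrative assumptions.

```python
def forecast_demand(hourly_history, horizon=24, season=24, lookback_days=7):
    """Naive seasonal baseline forecast.

    Predicts each future hour as the mean of the same hour-of-day over
    the past `lookback_days`. Assumes len(hourly_history) is a whole
    number of days. A trained ML model should beat this baseline.
    """
    forecasts = []
    n = len(hourly_history)
    days = n // season
    for step in range(1, horizon + 1):
        hour_index = (n + step - 1) % season  # hour-of-day being predicted
        same_hour = [hourly_history[d * season + hour_index]
                     for d in range(days - lookback_days, days)]
        forecasts.append(sum(same_hour) / len(same_hour))
    return forecasts

# Usage: two weeks of a simple daily pattern (low night, high day, medium evening)
history = ([20.0] * 6 + [60.0] * 12 + [30.0] * 6) * 14
print(forecast_demand(history)[:3])  # → [20.0, 20.0, 20.0]
```

Comparing a model's error against this baseline over a 24-to-48-hour horizon is a quick way to judge whether the machine learning is earning its keep.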
For the short term, the live hydraulic model was utilized to validate and perform model calibration, and for proactive pressure management-- simulating pressure reduction within the PMAs by further dropping nighttime pressures to reduce background leakage, using pressure regulation and flow modulation control.
And for the long term, there's an opportunity to assist operations more seamlessly for contingency planning, wherein an operator can model incident scenarios and manage the model directly. This, in turn, will further accelerate incident response in the field.
So to summarize, Water Corporation has learned how to create and maintain a digital platform by combining a hydraulic model and SCADA-based telemetry data to improve operational and asset management use. Data is collected continuously to quantify the leakage reduction, and the live hydraulic model is promoted to the operators for planned maintenance works.
This will require establishing a workflow where the live models are created and maintained within the planning group and then handed over to operations, who are trained in and use the software to assist them in their daily activities. Similarly, to provide a quick overview of Info360 Insight, we have here the incident manager workflow, where we can document every step of responding to a particular incident.
We have a hydraulic model fully integrated with SCADA and time series data, and the ability to perform incident management by simulating, for example, a pipe burst and showing the number of affected customers and the extent of the incident-- as well as performing mitigation and operational response by isolating pipes to ensure the number of customers affected is minimized.
So I'll pass over to Sami to provide the use case for the future predictive recommendations for optimization.
SAMI CHAUDHRY: Thanks, Eland. So I will be presenting Imagine AI, a cloud-based platform offered by Innovyze. It is an artificial intelligence-based tool that helps optimize process plants and networks. Its capability is to predict plant operations and to give recommendations for improving chemical dosing and energy consumption at process plants, and energy consumption across distribution networks.
So far we have discussed the tools described by Eland, which are focused on the past and present. This tool is based on prescriptive analytics, meaning we utilize model predictive control to generate recommendations for optimizing plant parameters.
In prescriptive analytics, we use models developed from the plant's historical operating data. They are trained to predict what the plant output will be in the next few hours-- or over whatever prediction horizon is selected.
So the main deliverables from this prescriptive analytics are to reduce energy and chemical OPEX for the plant, improve network resilience, and increase the transparency of complex operational decisions for networks. The areas of application for this tool are wastewater treatment processes, water treatment plants, and water distribution networks.
So in the utilities world, it applies wherever chemical dosing or energy consumption is involved-- the pumps, say, or the blowers in the bioreactors, or pumping networks where different pumps have to work in combination to meet the demand of the network, and we need an optimized pump schedule to get the minimum kilowatt-hours-per-megaliter cost for pumping around the network.
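As a toy illustration of that pump-scheduling idea, the sketch below computes a pump's specific energy (kWh per megaliter) and brute-forces which pump combination meets a demand at the lowest power draw. The pump catalogue is invented, and Imagine's actual optimizer works over time horizons, tariffs, and storage rather than a single hour:

```python
from itertools import combinations

# Illustrative pump catalogue: (name, flow in ML/h, power draw in kW)
PUMPS = [("P1", 1.0, 60.0), ("P2", 1.5, 100.0), ("P3", 2.0, 120.0)]

def specific_energy(power_kw, flow_ml_per_h):
    """kWh of energy needed to pump one megaliter."""
    return power_kw / flow_ml_per_h

def cheapest_combination(demand_ml_per_h):
    """Brute-force the pump subset that meets demand at the lowest
    total power. Real schedulers optimize over a whole horizon with
    tariffs and storage; this shows only the per-hour selection idea."""
    best = None
    for r in range(1, len(PUMPS) + 1):
        for combo in combinations(PUMPS, r):
            flow = sum(p[1] for p in combo)
            power = sum(p[2] for p in combo)
            if flow >= demand_ml_per_h and (best is None or power < best[1]):
                best = ([p[0] for p in combo], power)
    return best

print(cheapest_combination(2.4))  # → (['P1', 'P2'], 160.0)
```

Even in this toy form, the answer is not obvious by eye-- the "biggest pump first" instinct picks P3, but P1 plus P2 meets the 2.4 ML/h demand for 20 kW less.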
So those are the application areas. As for how the tool works, the picture on the right shows how this data-driven tool operates. It can gather data from different sources-- SCADA, CMMS, or a laboratory information management system-- and all of the data relevant to predicting the plant's performance is collected.
This data is sent to Imagine, which lives in the cloud, where it is analyzed by Imagine's machine-learning models. The predictive recommendations, or the predicted parameters of the plant, are generated, and those recommendations are presented on a dashboard for the operator to implement.
So there are two parts to prescriptive analytics: one is to predict what will happen based upon current conditions; the other is not only to predict, but also to prescribe what should be done in the present to bring about what we want to happen in the future. As we move to the next slide, we'll see how prescriptive analytics makes that possible for a particular plant.
So this use case is related to a sludge dewatering system associated with a pulp and paper mill. The main objective was to optimize chemical consumption and also the quality of the sludge being used as a biofuel for steam generation downstream. In the downstream processes, these biosolids complement the energy requirement, which is otherwise met by using oil to generate steam.
So improving the quality of this sludge reduces the moisture content of the biofuel-- in other words, it reduces the energy needed to evaporate that extra water within the sludge, improving the energy delivered to the system and reducing the energy wasted in evaporation.
So the main objectives or KPIs that we wanted to achieve were to optimize polymer dosage and sludge quality and optimize energy usage by improving steam production-- by maintaining and improving sludge quality or sludge consistency.
So in this case, the polymer dosage ratio that we see in the table on the right-hand side was reduced from 0.019 to 0.018. The overall savings was 0.001, which is around 5% in terms of chemical dosing.
And the sludge dryness-- the total solids content-- improved from 33.42% to 36.64%, around a 3-percentage-point reduction in the moisture of the sludge, which meant less energy wasted in the process.
So the polymer OPEX, which was around $230,000, was reduced by around $12,000. But the main savings were in the oil: around $1,000,000 of oil was being used, and around $99,000 of that could be saved as per our evaluation.
And the approach for this optimization was, again, to take historical plant data and train models to predict what the sludge quality will be under different operating scenarios-- and also to come up with the most optimized polymer dosage rate to achieve the best outcome for sludge dryness, and in this way improve the overall process.
So this is one of the typical areas. This example is from the pulp and paper industry in South America, but similar sludge dewatering systems are also used at wastewater treatment plants and at cogeneration plants associated with wastewater treatment plants.
So this next slide shows the current operating set points in black-- the black bars indicate what is operating right now. The light orange bar shows what was recommended in the past, compared to those black operating parameters.
The bright orange bar is the near-future value recommended by the platform-- the set point to maintain over the next one or two hours-- whereas the blue lines show, further into the future, how the system is going to behave and what the future set points will be if that orange recommendation is followed.
So this particular prediction page relates to a pumping sequence: it shows the current set point for the pumps, and what the set points will be over the next hour and the following hours if the recommendations in orange are followed-- how the system is going to behave over up to 24 hours in this particular scenario.
So this is one typical example of the Recommendations page, where the software tells the operator how to operate, predicting over the model's prediction horizon. Yup, next, Patrick. So coming back to the maturity levels-- as we saw before, we have seen digital twins of the system that help with the past and present.
And this prescriptive analytics sits at the top of the evolution of digital twin maturity. We'll see on the next slide how prescriptive analytics tops the evolution of digital solutions. In this particular example, the picture we are looking at is a typical illustration of how AI-based platforms help operators make decisions.
In this particular example, it represents a process operation bounded by these gray lines-- the bounds of the operating envelope. In the center, we can see two circles: one big yellow circle and one small green circle. In the top-right corner is the point which is the most optimal-- the economic optimum-- for the plant to operate at.
So based upon their training, skill set, and experience, different operators have their own risk appetite, so they may operate far away from the bounds. Overall, operators may end up operating within this big yellow circle-- and a large part of that yellow circle is away from the true economic optimum.
Whereas if these operators are given prescriptive recommendations by an intelligent tool like Imagine, or by AI-based models, they already have pre-cooked instructions to operate the plant within the particular narrow range we see in the green circle-- which not only keeps them away from the bounds of operation, but also ensures the best economic outcome for the plant.
So it basically standardizes the approach for operators. All operators-- experienced or inexperienced, well trained or not-- can work from a uniform set of decisions. This tool helps operators make quicker, data-driven, and accurate decisions, which result in the most economic outcome. So that's the description of this tool's capability and how it can help operators.
And if we look at the analytics evolution, at the bottom-left corner there is data. If we have data and can see what happened in the past, that is called descriptive analytics. If we dig deeper and can find out why it happened, that is diagnostic analytics. What happened, and why did it happen-- so we are on the diagnostic analytics rung of the ladder.
And then, if we are able to see what happened and why it happened, and are also able to predict what will happen in the future, we are on the rung of predictive analytics-- capable of looking at the past and converting that learning, the insights from the data and the history, into a prediction of what will happen in the future.
And if we are able to predict what is going to happen in the future, and to optimize in a way that brings about what we want to happen, then it is called prescriptive analytics. That's the optimization part on top of predictive modeling.
So that's where AI-driven tools like Imagine can help: they can not only predict what will happen, but also suggest or prescribe what should be done now to bring about what we want to happen. Not only predicting the future, but making it happen-- that's the top of the analytics evolution, and that's where AI-driven tools like Imagine sit. Yep, I'll hand over to Javier to go through this one.
JAVIER CANTU: Yes, thank you very much, Sami. I think that's a very important story you just told about the pathway to optimization. It's a phased approach, and it takes time to get there. But before you can get to a spot where you're asking your artificial intelligence, "Hey, how do I reduce my cost for running my system?" or anything like that, there are certain elements that need to happen first.
So as Sami mentioned, we've got descriptive and diagnostic analytics, then predictive and prescriptive, right? One can't happen without the other. And it takes time for a facility-- and not just a facility, but an organization-- to prepare itself. APM solution providers-- APM being asset performance management-- give a lot of that basis: hey, how are my assets performing? Can I start diagnosing problems?
Those predictive tools really get into the details of machine learning-- can I forecast out something in particular? And prescriptive tools like Imagine are already working their magic and reducing costs for our customers. What we do know is that there's been a gap in helping people move through that journey.
So I'm happy to announce-- if it hasn't been announced already by Amy, either in a talk before me or after me-- that we are introducing a new product: Info360 Plant. Info360 Plant is really meant to prepare all of our existing customers, regardless of what industry they are in, who have facilities that touch water, to get to that stage of prescriptive analytics-- having something like our Imagine platform.
So what is Info360 Plant exactly? Well, it's part of our cloud offering-- part of the Info360 platform. With Info360, we can capture the entire water cycle and ultimately help close the gap, helping people on their journeys to prescriptive analytics by giving them the tools to do descriptive and diagnostic analytics, monitor their performance, and increase their level of service.
And overall, because these tools really become a powerful way to reduce the hours spent on certain workflows, they also work as a wonderful way to provide consistent and efficient reporting. Go ahead into the next slide, if you don't mind, Patrick.
So what can we do with this new product, Info360 Plant? I am so excited for this, guys, because as far as I know-- last I saw from Water For People-- 1 in 3 people still suffer from water scarcity. So when I look at why we need Info360 Plant, I 100% zoom in on this capability [INAUDIBLE] of performance tracking.
I like efficient systems. I want efficient systems. And the only way to get them as efficient as possible-- delivering quality water at the least possible cost, so that it can go to everybody-- is to understand where my water is going: develop a mass balance where I can understand if I'm losing water.
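A mass balance of that kind can be sketched in a few lines. This follows the familiar top-down water balance shape-- system input, less authorized consumption and apparent losses, leaves real (physical) losses-- with entirely invented volumes:

```python
def water_balance(system_input_ml, billed_ml, unbilled_authorized_ml,
                  apparent_losses_ml):
    """Simple top-down water balance.

    Whatever system input is not billed is non-revenue water; whatever
    is not accounted for by any authorized consumption or apparent
    losses (metering errors, theft) is a real, physical loss.
    """
    non_revenue = system_input_ml - billed_ml
    real_losses = (system_input_ml - billed_ml
                   - unbilled_authorized_ml - apparent_losses_ml)
    return {"non_revenue_ml": non_revenue, "real_losses_ml": real_losses}

# Usage: 1,000 ML in, 820 billed, 30 unbilled but authorized, 40 apparent losses
print(water_balance(1000, 820, 30, 40))
# → {'non_revenue_ml': 180, 'real_losses_ml': 110}
```

The value of tooling here is less the arithmetic than keeping every term continuously updated from live data instead of reconstructed once a year.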
And at the same time, see where I can optimize-- whether it's an energy reduction, a waste reduction, or even something as simple as using fewer chemicals, so we don't have to dose so much into our waters. But also of importance, we've got cost management. You want to track costs. How important is it to understand, hey, maybe my facility is spending around $200 per million gallons a year-- and that's down 5% year over year.
It often takes weeks, if not months, of projects and data collation to understand that about your facility-- to develop reports, project out, and start planning the future of your facilities. But now, with a tool like Info360 Plant, all of that can be done in a matter of clicks, ready to share across an organization, delivering justifiable results for everybody.
We can go ahead and move on to the next slide, where I can talk a little more about why it's important as well. I think I've covered a bit of why I'm passionate: I can create more efficient systems, give better water to more people, and hopefully do a little more about some of the problems we have right now in the world when it comes to water scarcity.
But as for the outcomes particular to Info360 Plant, we look at the ability to improve performance-- as I mentioned, the level of service. Not only can we improve performance by utilizing these analytics and diagnostic tools, but we can find new ways to treat water better, and we can even look into getting better-quality water at specific types of plants.
Whether you have a wastewater treatment plant running an A2O process and are trying to reduce your phosphorus discharge into the environment, or you're reducing the overall chlorine feed into your water distribution network. And of course, reducing overall process and workflow complexity is another reason why we all need this.
We know that it takes time and effort to collect information, develop a report, and deliver it to our boss, or for our boss to deliver it to their boss, or to justify rates, or anything of that regard. What if we had one source of truth? What if those workspaces only had to be developed once and were automatically updated? Then those projects no longer take forever. Now, at the click of a button, you have them up and ready, and they're continuously updated.
And because all of these things are so easy to do, we get the added benefit of being able to automatically generate compliance reports, track sustainability metrics like emissions, waste, and energy use, and audit our systems in a way that lets us target specific sustainability goals and just do better by the planet overall. Pat, I think I'm going to let you finish talking about what we've got here.
PATRICK BONK: Thanks, Javier. So to bring this all home, how do we adopt digital twins? And what are the main factors? We have a convergence of technology, and we want to just take this last part to convince you that the time is now to be implementing these solutions and how they're available to you.
So to recap, we're establishing a digital platform. It integrates all these model types that we've covered. Again, to align the core concepts, let's think about our time horizons as well. And instead of focusing on the problems we're solving, let's think about how the user interacts with the software relative to the combinations of models used.
So we have our past scenario: that's our historical data, the state of the network or the treatment plant. And that informs the performance of the network, using these user-built workspaces with the performance KPIs and metrics. So that's on your left side.
The present, in the middle of your screen, is the present time horizon, what's happening now. We use that to generate alerts based on the incidents occurring in our networks and our treatment plants. Then those can be used to generate what-if scenarios and contingency responses utilizing those particular models. And again, these are all built-in, designed workflows.
So if we look at our future scenario on the right side, those are the forecasts and predictions that Sami and Javier were covering, where we can look at future set points and how they're generated from the AI, which sends predictive recommendations to the end user.
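To give a feel for the "present" alerting idea described above, here is a toy threshold rule evaluated over a batch of sensor readings. The sensor names, values, and limits are invented for illustration; a real system like Info360 evaluates its own configured rules against live telemetry:

```python
# Toy alert rule: flag any reading that falls outside its (low, high) limits.
# All sensor names, values, and thresholds here are invented examples.
readings = [
    {"sensor": "tank_level_ft", "value": 12.4},
    {"sensor": "tank_level_ft", "value": 9.8},
    {"sensor": "chlorine_mg_l", "value": 0.15},
]

thresholds = {
    "tank_level_ft": (10.0, None),   # low limit only
    "chlorine_mg_l": (0.2, 4.0),     # regulatory-style low/high band
}

alerts = []
for r in readings:
    low, high = thresholds[r["sensor"]]
    if (low is not None and r["value"] < low) or (high is not None and r["value"] > high):
        alerts.append(f"{r['sensor']} out of range: {r['value']}")

print(alerts)  # flags the low tank level and the low chlorine residual
```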
So let's think and reflect-- have we answered that question at the front end of our session? How far are you from deploying digital twins in the operations space? Well, let's reiterate that a convergence of technology of cloud-based solutions has happened where they're scalable and elastic. You can simply add more data and processing capacity as you go.
We have technologies like AMI and IoT. Those are already pushing data to the cloud. So the Info360 platform easily grabs that data. It integrates across business areas, like billing and work management and planning. And then, there's also newfound accessibility to all these things, like we've covered with the workspaces, but from a mobile and remote operations standpoint.
And then, our cloud offers capabilities like these models: machine learning, scaling up simulations, and things like Monte Carlo analysis. So from an adoption standpoint, in getting started, we invite you to build from what you're already doing.
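To illustrate the kind of question a Monte Carlo analysis can answer in this space, here is a minimal, self-contained sketch. The demand statistics and tank sizing are invented for the example and are not tied to any Innovyze model:

```python
import random

# Hypothetical question: given uncertain daily demand, how often does a
# 5 MG storage tank (starting full, refilled at 4 MG/day) run dry in a week?
random.seed(42)  # fixed seed so the experiment is repeatable

def week_runs_dry(mean_demand=4.2, sd=0.8, capacity=5.0, refill=4.0):
    level = capacity
    for _ in range(7):
        demand = max(0.0, random.gauss(mean_demand, sd))
        level = min(capacity, level + refill - demand)
        if level <= 0.0:
            return True
    return False

trials = 10_000
risk = sum(week_runs_dry() for _ in range(trials)) / trials
print(f"Estimated probability of running dry in a week: {risk:.1%}")
```

The same pattern, many randomized runs of a simple simulation, scales naturally in the cloud, which is why elastic compute makes this kind of analysis practical for operations teams.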
So individual models can be accessed from the platform for the problem you're already working on. Then focus on the problem you're solving: you can start with a simple use case, and from there, understand how the user is going to interact with your digital twin on a daily basis.
And so what we've seen is that workflows now exist across the operations space, from rapid operational decision-making needs through to the high-level decision-making needs of executive management. The use cases go beyond just our engineering teams, and the workflows are accessible to that wider group, a wider range of personas within a utility or a council, as well as, very importantly, the consultants and external experts serving utilities and councils.
So to close: innovation and adoption are accelerating from a pure capability standpoint, but emerging remote-working team formations and our globally connected environments are also leading to this convergence of factors. So really, all you need is a login to get started.
And our team, we just wanted to thank you so much for your time and attention to our session. Please make note of our names. I believe you can message us on the Autodesk University platform, or feel free to reach out on LinkedIn with any particular questions. Thanks again from all of us, and we hope you're enjoying the conference. Take care.