Description
Key Learnings
- Learn how to implement a digital governance strategy to define standards and procedures that optimize the use of digital tools and output.
- Learn about integrating multiple source systems (BIM, EAM, GIS) into a common data environment visible via a single pane of glass interface.
- Learn about establishing best practices for creating and maintaining a BIM-powered digital twin that embeds a standardized asset ontology.
Speakers
- Aracely Thompson is the Director of Digital Solutions for Parsons' innovation division, Parsons X. She develops innovative workflows between asset management and BIM platforms to integrate mega data into a digital twin. She is a certified ISO 19650 professional implementing ISO 19650 standards at Parsons, and has been instrumental in moving Parsons forward in digital twin and BIM technology. She works alongside Autodesk to develop new tools and processes for Parsons. Aracely's key responsibilities include digital design management and developing BIM project work plans that encompass the clients' project goals, schedule, and budget. She has thorough knowledge and expertise in transitioning and integrating an office to BIM: implementation, management, standards, logistics, software training, cost estimates, and specifications. Aracely, an architect by education, has close to 30 years of experience in the design, management, and development of a wide range of projects. Her architecture experience spans transportation, mixed-use, planned unit development (PUD), educational, retail, custom furniture, and interiors, through all phases of design and construction. She is a recurring speaker at Autodesk University, RTC BILT, and IFMA.
- Howard Shotz: I have 30 years of experience implementing digital twins, enterprise asset management (EAM) systems, integrated workplace management systems (IWMS), computer-aided facilities management (CAFM) systems, and computerized maintenance management systems (CMMS). Projects range from less than 500,000 square feet to over 75 million square feet. I collaborate on exciting and forward-thinking technology projects to provide efficiency and cost savings. Client organizations have saved millions of dollars through the implementation of these valuable systems.
HOWARD SHOTZ: Hello, everyone. My name is Howard Shotz. I am the director of our digital twin practice at the Parsons X group within Parsons Corporation. My background is as an architect. And I have been implementing a variety of digital twin, enterprise asset management, and computer information management systems for about 25 years.
The presentation today is a project we've been working on for about a year. And as you can see, it's implementing an operational digital twin at a major international airport. And part of the presentation will be walking through the business drivers, as well as the project we're working on, and then next steps as a bit of a case study.
Talk about Parsons and Parsons X. Parsons is a multinational architecture, engineering, construction, and innovation company. And as you can see, we work across a variety of sectors and different types of infrastructure projects, ranging from federal work that we do with the United States government and others globally, to a variety of transportation projects, aviation projects like we'll talk about today, as well as rail and education, across a variety of sectors. And part of what we do in our Parsons X team is implement innovation in technology and process within Parsons, and then with our various customers around the world.
Talk a little bit about the project and the business drivers. A lot of what we're talking about today is how we implemented a digital twin. And as you can see on the screen, it's implementing a digital twin to provide end-to-end visibility: the idea of a digital twin as a virtual representation of a real object, asset, system, or system of systems. And part of the goal was to bring all of that information together into a single pane of glass, and to increase efficiency.
And what you can see on the right side of the screen are the various digital twin concepts and drivers that came from our customer: a digital strategy and governance plan. Really, the heart of understanding how to gain the efficiency and economy of scale is to have a digital twin governance plan that works with your assets. And through these various pieces, you'll see another big component that separates a digital twin from, for example, a simulation or a virtual representation: real-time information gathering, Internet of Things sensor devices, other operational systems that are feeding information, and then the ability to develop predictive alerts based on that information. So really, you're creating a living representation, and it will provide information through these various sensor packages.
So talking a bit about the airport project itself, this project came about through a large aviation customer. The focus was on asset management, energy efficiency, and situational awareness. So what you're seeing here is the ability to understand the assets that are within this airport. There are also some projects we'll talk about coming from NREL, the National Renewable Energy Laboratory: a project called Project Morpheus and a project called Project Athena, which are looking at energy-efficiency algorithms.
And part of the idea of the platform was to provide information to work on those algorithms, and certainly situational awareness: can the customer have much more information about the real-time activities going on, both on the airfield and within the terminals, or even on the land side of the airport, relative to parking and access, the various means of ingress and egress of the passengers, and how that would work.
The main idea was to implement a digital infrastructure platform based on a commercial off-the-shelf product. And the idea with that commercial off-the-shelf product was to allow a rapid deployment without custom creation. The assets that we were working on were a newly rehabilitated runway, the central utility plant, and the recently expanded terminal. So these three assets were put together.
We started with the runway, which we believe is one of the first runway projects, perhaps the only one that we know of, to be digitally twinned. And along with that, you're seeing really that NREL capability, as well as the integration with automation systems, such as building management, a variety of sensor information, and enterprise asset management, EAM, systems. And the ultimate goal of this project was to optimize the operation and planning of the airport in a variety of ways.
So what is an operational digital twin? What you're seeing here is a representation of that single pane of glass. And the single pane of glass can combine spatial information from a variety of sources, such as AutoCAD certainly, and BIM, it could be AutoCAD Civil 3D, there could be GIS information from other systems, as well as a variety of static information that may come in, such as scanning information for example.
And what you're seeing across the bottom are a variety of systems that were brought in: runway information coming from AutoCAD Civil 3D; GIS information; Part 139 inspections, which in the aviation world are the airfield inspections done for the FAA; runway sensors, so weather stations and sensors embedded within the runway, which provide information; the enterprise asset management system; 3D scanning, certainly, which was done for this terminal; and Johnson Controls Metasys information. And then there's a combination of EAM systems for a variety of reasons, some for the gates, some for baggage handling, some that were done for aviation partners, separate and distinct. So a lot of different information was brought in and combined into the single-pane-of-glass concept.
And you'll see across the right here, in these black boxes, the additional drivers. It provides visibility into both historical information that's brought forward in a static way and the current information that is coming from the assets: both those Internet of Things, IoT, readings and information that's coming in through work order ticketing, inspections, a variety of things.
And then building the knowledge base. So the idea is that there are various systems which all continue to be operational. The idea behind the digital twin was not to replace these systems of record across the bottom. It was to lightly interact with them in ways that wouldn't impact or impede their progress, because they are viable production systems.
And they're critically important to the airport. But it was really to bring that together so that you could have a cohesive picture of what was happening in that integrated solution. And then, as we talked about, the performance and efficiency: can the owner/operator see what's happening from one pane of glass versus having multiple different screens and multiple different systems?
Part of the GIS integration, as well, was to provide-- this is an earlier view. And this is a five-year project; we're just coming through year one. So we had some of the earlier proofs of concept that we're seeing a bit of here, which is a GIS integration, where we're looking at the runway, we're looking at the whole airfield in different colors, and we brought in the Esri AGOL information that was provided. So you're seeing that kind of integration as a start to build the platform from a highly interactive data source.
Part of the runway, as I mentioned, where we think this is a very unique capability, is that there was an initial AutoCAD Civil 3D set of drawings. We modeled those and also brought in a variety of information. So what you're seeing is a composite picture that is inside the twin. And what it allows is a view of the various slabs that were created, both on the runway, the touchdown zones, and the aprons that you're seeing, as well as runway markings.
In the distance there, you'll see some signage, which we'll talk about, as well as runway lighting. So really, the major critical components of the runway have been brought into the twin. And again, that comes from a variety of systems, so we're able to really see visually what's happening, and we're also able to look at the information.
So I talked earlier about the weather station information. What you're seeing here is a model of the weather station. And inside the interface, in the left panel, you're seeing what we call the ontology. That left panel shows where that piece of information, that asset, I will say, sits. So it's a piece of equipment, it's a sensor, it's a weather station. And we can categorize it with a unique ID that is now available to all systems.
What you're seeing in the middle panel, right above the 3D image, is the information coming from the temperature sensors. And there's a variety of sensors coming in, such as temperature, resistance, humidity, surface temperature, and subsurface temperature, for example. These are critical to the operations, and they're now available through the IoT integration into the twin.
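To make that pattern concrete, here is a minimal Python sketch of how a twin might keep the latest readings per metric, keyed by the ontology's unique asset ID. All class names, IDs, and the ontology path are illustrative assumptions, not the actual platform's API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One telemetry point from an airfield sensor (illustrative schema)."""
    asset_id: str        # the unique ID from the twin's ontology
    metric: str          # e.g. "surface_temp", "humidity"
    value: float
    timestamp: datetime

@dataclass
class TwinAsset:
    """A node in the twin: an ontology path plus the latest reading per metric."""
    asset_id: str
    ontology_path: str   # e.g. "Airport/Airfield/Runway/WeatherStation"
    latest: dict = field(default_factory=dict)

    def ingest(self, reading: SensorReading) -> None:
        # keep only the most recent value for each metric
        current = self.latest.get(reading.metric)
        if current is None or reading.timestamp > current.timestamp:
            self.latest[reading.metric] = reading

station = TwinAsset("WX-001", "Airport/Airfield/Runway/WeatherStation")
station.ingest(SensorReading("WX-001", "surface_temp", 3.2,
                             datetime(2022, 1, 1, 6, 0, tzinfo=timezone.utc)))
station.ingest(SensorReading("WX-001", "surface_temp", 2.8,
                             datetime(2022, 1, 1, 6, 10, tzinfo=timezone.utc)))
print(station.latest["surface_temp"].value)  # prints 2.8, the newer reading wins
```

The point of the sketch is the categorization: once every system refers to the same unique ID, any panel in the interface can look up the same asset.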
And then, what you're seeing here, in terms of analytics, is a pretty straightforward time series where the information is coming in. In this environment, certainly, whether it is temperature, or ice or water, water film thickness is an issue that affects the ability for takeoff and landing. That is something that is in the system, and users can access it from their desktops.
The other piece that we're showing is a deeper dive and a longer time series of that information from the weather station. You're seeing, on the left, the legend of film heights, temperatures, conditions, friction, and what's happening with them. And you're seeing, depending on the time, if you have an event, whether it's weather or temperature or maintenance, for example, the sensors react. That's now available as time series information being tracked, coming back in at whatever increment is decided. It could be every 10 minutes or 15 minutes; it can be brought in at different frequencies. And then the graphics are provided to the users.
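Bringing time series data in at a decided increment can be sketched in a few lines of Python: raw samples are grouped into fixed-width buckets, 10 minutes, 15 minutes, whatever is chosen, and averaged. This is a hypothetical illustration of the aggregation step, not the product's implementation:

```python
from datetime import datetime, timedelta
from statistics import mean

def bucket_series(readings, start, interval):
    """Group (timestamp, value) pairs into fixed-width buckets, averaging each.

    The twin can ingest at whatever increment is decided, so raw sensor
    samples are aggregated down to that cadence."""
    buckets = {}
    for ts, value in readings:
        index = (ts - start) // interval          # which bucket this sample falls in
        buckets.setdefault(index, []).append(value)
    return {start + index * interval: mean(values)
            for index, values in sorted(buckets.items())}

start = datetime(2022, 1, 1, 6, 0)
raw = [(start + timedelta(minutes=m), temp)
       for m, temp in [(0, 2.0), (4, 2.4), (11, 3.0), (14, 3.4)]]
series = bucket_series(raw, start, timedelta(minutes=10))
for ts, avg in series.items():
    print(ts.time(), round(avg, 2))
```

Changing the `interval` argument is all it takes to re-ingest the same feed at a different frequency.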
So this is a pretty important part from an airfield operations perspective: lighting and signage. What you're seeing on the left is a similar concept, the ontology. What we did in this project is create an airport ontology, which is available publicly through Microsoft. That ontology allows all users to organize assets from an airport perspective in a way that can be represented across the industry. So we put that first version of the ontology out there in the public domain.
You're seeing a 3D representation of that object where it sits. And then, on the right, where this becomes also very important, is the ability to see multiple sources of information at the same time. So the twin is bringing in multiple systems. You see there's a lot of metadata that the systems carry. And now they're able to interrogate that information, both in a passive way and in an analytical way, depending on the algorithms, of what's available within those various pieces of metadata. So this is a beginning stage of that single pane of glass and the analytics that are happening.
In addition to the runway, we mentioned the central utility plant. So the modeling that's happening for the central utility plant is bringing in the core elements into Revit. The Revit models are ingested into the digital twin to provide the spatial component as well as the core relationships. And then what you're seeing on the right side of the screen are the IoT sensors coming off of Johnson Controls and Metasys.
That information is going into the twin. It's being tracked, and that's providing similar time series information. In terms of the users' understanding and training, it's a pretty easy interface to understand and learn. It has a consistent user interface and user experience, which helps users work across different assets because everything works in a similar way. And again, you're seeing the time series data for that cooling tower fan, and that information is then brought in through the integration.
And then from the terminal, what you're seeing on the left side is a massing model, a layered massing model coming from the BIM side. And that, again, is brought into the model so users can drill into it to whatever depth of fidelity they need. On the right side of the screen, what you're seeing is a combination of the IoT integration, what are called [AUDIO OUT]. So in the red, you're seeing 23 insights. They are essentially coming back, whether they are fault detections, which could be coming from temperature, electrical faults, sensor issues, whatever information is coming back. That's being ingested into the twin based on the parameters that are established by the control points, or by machine learning algorithms, for which, certainly, you need to build the data set. And that is then going into the twin, where it can be analyzed.
And what you're seeing in the blue are the tickets, and then the other pieces that are available. So depending on the insights and tickets, users can create tickets in the work order system, or tickets are created in the work order system and brought into the twin. It's a bidirectional capability. So the systems are reporting, and the digital twin may automatically start creating tickets that can be sent over to the EAM system, or vice versa. And again, you're able to prioritize; users can see everything that's happening in one interface.
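A minimal sketch of the insight-to-ticket direction of that bidirectional flow, assuming a hypothetical EAM work-order API (the class and method names here are stand-ins, not a real vendor interface):

```python
from dataclasses import dataclass

@dataclass
class Insight:
    """A fault or anomaly surfaced by the twin (illustrative)."""
    asset_id: str
    kind: str        # "fault_detection", "sensor_issue", ...
    severity: int    # 1 (low) .. 3 (high)

class FakeEAM:
    """Stand-in for the EAM work-order API, for illustration only."""
    def __init__(self):
        self.tickets = []
    def create_ticket(self, asset_id, description):
        ticket_id = f"WO-{len(self.tickets) + 1:04d}"
        self.tickets.append((ticket_id, asset_id, description))
        return ticket_id

def raise_tickets(insights, eam, min_severity=2):
    """Turn high-severity twin insights into EAM work orders."""
    created = {}
    for insight in insights:
        if insight.severity >= min_severity:
            created[insight.asset_id] = eam.create_ticket(
                insight.asset_id, f"{insight.kind} detected by digital twin")
    return created

eam = FakeEAM()
insights = [Insight("AHU-07", "fault_detection", 3),
            Insight("AHU-09", "sensor_issue", 1)]
created = raise_tickets(insights, eam)
print(created)  # only AHU-07 crosses the severity threshold
```

The reverse direction, pulling existing EAM tickets into the twin for display, would be the mirror image of the same mapping.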
Jumping inside the terminal, specifically in terms of HVAC: I mentioned earlier Project Morpheus, which is coming from NREL. Part of what that project is based on is the ability to receive information from the building management systems, to put that information into the twin platform, and then, through an API integration, to have NREL develop algorithms that pull the information, process it looking for efficiencies and optimization patterns, and then push the results back into the twin for analysis. So a lot of what the initial stage of the project is about is providing this basic information about the asset's performance for NREL to develop against.
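That pull-process-push cycle can be sketched as follows. The algorithm shown is a toy stand-in, not an actual NREL algorithm, and the twin's API is mocked with a plain dictionary:

```python
def optimization_round(twin_store, algorithm, asset_id):
    """One pull-process-push cycle: fetch BMS telemetry from the twin,
    run an external optimization algorithm, write the results back.

    twin_store is a plain dict standing in for the twin's API; the real
    project exposes this through an API integration."""
    telemetry = twin_store.get(asset_id, {}).get("telemetry", [])
    result = algorithm(telemetry)
    twin_store.setdefault(asset_id, {})["analysis"] = result
    return result

def setpoint_suggestion(telemetry):
    """Toy stand-in for an efficiency algorithm: suggest trimming the
    setpoint when the average supply temperature runs high."""
    if not telemetry:
        return {"action": "none"}
    avg = sum(telemetry) / len(telemetry)
    return {"avg_supply_temp": avg,
            "action": "lower_setpoint" if avg > 14.0 else "hold"}

store = {"AHU-07": {"telemetry": [14.5, 15.1, 14.8]}}
result = optimization_round(store, setpoint_suggestion, "AHU-07")
print(result)
```

The key design point is that the twin only stores telemetry and results; the analytics themselves live outside and are swappable.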
Similar interface, again, what you're seeing, left panel, is the ontology. It's an HVAC equipment. You're seeing that unique asset ID in a common format. In the center is the graphical information about this air handler and where it lives. So it can also be brought in using the Forge Viewer technology from Autodesk.
And then, on the right, what you're seeing is, again, a comprehensive set of metadata that is now available for the users, which is very detailed. Part of the idea of the data governance is to see the various systems of record and to start to work in a data governance platform, so that information supports the common data environment, the CDE, and the source systems start to streamline. This is, again, early in year one. So part of what we're doing is gap analysis and data analysis of the different systems. And as we move forward, the systems will start to align themselves even more over time, deep into their various data sets.
So this is a time series. What you're seeing here is coming from the air handler: the high-speed energy sensor in orange, at the bottom, as well as the power sensor in green. So what you're seeing in that IoT integration, that time series information, is what's happening with that particular sensor.
And as we're mentioning, this provides analytics and control points. Depending on how the system is set up, it can send tickets off, and it's accumulating information and also interacting with the algorithms that are provided, depending on the analytics platform; multiple analytics platforms can be used, depending on the various user needs and the type of implementation that you're looking for.
So talk a little bit about how this was put together from an implementation and integration standpoint. Digital governance standards, again, are very critical to the success of a twin project. You don't necessarily need to have them fully in place to start, but it's pretty important that throughout the implementation journey there is an understanding, across the organization and the various stakeholder groups, that digital governance is a supporting feature. And moving forward with that, you'll want to talk about it within your organizations.
Cloud-based platform: what we're working with is a commercial off-the-shelf system. It's cloud-based, and it allowed for rapid deployment, support, and scalability. It was also built with the capability for integration points, some that were existing, and new integration connections that could be built by the implementation team, as well as the ability to link GIS, CAD, BIM, and other systems into it.
And then, what you're seeing in the diagram, on the left, is the spatial information, such as CAD, GIS, BIM, and other mapping information, and then static information. Certainly, organizations are going to have a variety of static information from systems; it could be historical information, it could be current asset registries, and that needs to be brought in. And then live data. Certainly, the power of the twin, as I mentioned, is its ability to ingest live information, preferably through IoT-type integrations and APIs. It could also be batch systems that approximate real time, depending on the maturity of the source systems.
However, the idea is really to bring that into the core platform. And the core platform has a couple of different components to it: the twin interface, that single pane; data intelligence and analytics; and then the marketplace, which is the idea of integrations. Those are really the key components of a twin platform.
And then on the right, what you're talking about are various ways that information can be sent and provided to the stakeholders within the organization. So it could be the development, the planning, from a design, construction, renovation perspective, so really, design and construction planning, operations, which is the focus of the twin we're talking about in this example, which is really how to optimize what is happening in the various parts of the portfolio and the airport, and then the experience.
So what you're able to also support with a twin, and that would be the evolution of what we're looking at with this organization, is how can you support the passenger experience; how can you understand the planes that are arriving and when they're arriving; who are the passenger bases, for example, are they coming from Disney World, are they coming from Europe, are they business travelers; and how can the airport, for example, adjust how it responds to those various groups and concessions in where they're brought into the airport for the easiest ability to move through the airport, all those kinds of things.
Talk a little bit more again about the implementation process. From a technical implementation standpoint, this is a relatively straightforward concept for an IT project: six steps that we move through. Step one was really the information gathering, starting with the gap analysis and uploading the information as available. Step two, the spatial data set: taking the tabular asset information that we received in step one and combining it with the spatial data through GUIDs and whatever other information was available, so that we can marry graphical information to the static sets.
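Marrying tabular asset information to spatial data through GUIDs is essentially a keyed join. A small Python sketch, with made-up records, might look like this; the unmatched lists on each side are exactly what feeds the gap analysis:

```python
def marry_records(tabular, spatial, key="guid"):
    """Join the tabular asset registry to spatial model elements by GUID.

    Returns the merged records plus the GUIDs found on only one side,
    which feed the gap analysis."""
    t = {row[key]: row for row in tabular}
    s = {row[key]: row for row in spatial}
    merged = [{**t[g], **s[g]} for g in t.keys() & s.keys()]
    return merged, sorted(t.keys() - s.keys()), sorted(s.keys() - t.keys())

assets = [{"guid": "a1", "name": "Cooling Tower Fan 1"},
          {"guid": "a2", "name": "Weather Station"}]
model = [{"guid": "a1", "x": 120.5, "y": 33.1},
         {"guid": "a3", "x": 98.0, "y": 12.7}]
merged, only_tabular, only_spatial = marry_records(assets, model)
print(merged)         # a1 joined with its coordinates
print(only_tabular)   # ['a2'], an asset with no geometry yet
print(only_spatial)   # ['a3'], geometry with no registry entry
```

In practice the join key could be a GUID, a barcode, or any shared identifier; the pattern is the same.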
And then the preintegration step, which is really about the existing documentation and integration. Through steps one and two, we have a platform. That platform is somewhat static; it's based upon the information we've received to date. That could be from the latest scans, from BIM models that have been built, or from CAD models and GIS. But at this point, it is a static representation.
Moving forward into steps three and four, we're really looking at the integration: how we bring that information in, how we send information back to the source systems, and how we build the connectors, the live data, and the systems. And the truth is that, in step four, for example, these existing systems might act differently through a live integration, and you might see different types of data in a live integration than you would from a static export in step one.
So by step five, the goal of the twin is to take the static information you received in a batch file or folder, and the spatial data you've received, which may be current, may be slightly older, and may have varying degrees of completeness and accuracy, and to work through steps three and four to marry it all up and create the most accurate picture of what's there. That's, again, part of the governance. By the time we get to step five, that's what we were showing earlier: those analytics capabilities. Can we send information up to analytics packages, can we build control points and personas into the workflows, so the users start to really understand what information is coming back and can act on it?
And certainly, the idea of all of this is to share that information and to provide knowledge transfer so that the airport themselves understand how to use the system that is being implemented, how to support the system, and then certainly have an understanding of what they can do next, what capabilities can be added, what are the limitations of what can be done for a variety of reasons, such as, again, the accuracy and completeness of information, the ability to connect to other systems. For a variety of reasons, both technical as well as contractual, there's all kinds of different things that can come together in a large implementation project.
So within the implementation activities, we really focused on the digital strategy and governance: enhancing the strategy to use the digital data and capture the information coming from current projects, current as-built information, all the different things that were available. There was a BIM interoperability and Dynamo model checking phase. The idea was to develop, from that digital governance strategy, the acceptable standard that provides a baseline, so that as information is brought in, it can be checked, it can be determined if it's viable, and if there are things that need to be updated within the metadata or within the spatial data and configuration, so that the twin has a high degree of accuracy and can really be relied upon by the airport going forward.
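That model-checking idea, validating incoming asset records against a governance baseline before they enter the twin, can be sketched like this. The required fields and ID prefixes are invented examples of a coding standard, not the project's actual rules:

```python
# Hypothetical governance baseline: required metadata and an ID coding scheme.
REQUIRED_FIELDS = {"asset_id", "category", "location", "manufacturer"}
ID_PREFIXES = ("AHU-", "CTF-", "WX-")   # invented example prefixes

def check_asset(record):
    """Validate one incoming asset record against the governance baseline,
    in the spirit of the model-checking phase (rules here are examples)."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    asset_id = record.get("asset_id", "")
    if not asset_id.startswith(ID_PREFIXES):
        problems.append(f"asset_id '{asset_id}' not in approved coding scheme")
    return problems

good = {"asset_id": "AHU-07", "category": "HVAC",
        "location": "Terminal B", "manufacturer": "Example Corp"}
bad = {"asset_id": "unit7", "category": "HVAC"}
print(check_asset(good))  # passes the baseline: no problems
print(check_asset(bad))   # two problems flagged
```

Running every incoming batch through a check like this is what keeps the twin's accuracy high enough to be relied upon.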
And another big component was security. Certainly, in today's day and age, with the cyber incidents and hacking of the past several years, it is important to put together a secure system: from the initial commercial off-the-shelf product, to the cloud system it works with, to the integration concepts and how information is ingested and sent back to the source systems. It's important to match up with the latest cyber standards, whether those come from the federal government, the local government, or the customer themselves.
The other piece was installing the platform. As I mentioned, we stood the platform up and then started to load information in, from the runway to the central utility plant to the terminal, which is ongoing, expanding the square footage of information within it, and then providing training. So a simple, straightforward, conceptual implementation. We've been working through that over the past year, and that will continue.
So this is interesting. These lessons learned came from the customer. And I think it's important, as we've been at this for a year, to see the customer's perspective. The first one you see is: 3D is expensive.
So it is not required to start off a digital twin by having everything scanned, everything in BIM, everything completely done to the highest LOD. 2D can work; 3D at a lower level of fidelity can work. The important part is that the information gathered meets the user requirements. So one of the early criteria of moving forward is to understand what is available and what you can start with, so that you're not essentially waiting for tomorrow when you could be starting today.
Realistic expectations. Defining realistic expectations is certainly very important. This is where there were expectations of what could be done, and over the course of the year there was an understanding that a lot of this is very dependent upon the information that we have: the quality, quantity, and accuracy of the spatial information, and the accuracy of the source systems. And we need to set the expectation for large implementations that this might be a multi-phase process.
And depending on how we go, we might do this in an iterative way with a low level of fidelity for the first iteration of the digital twin. And we will add more detail to it. Or we're going to start with a smaller square footage, not try to do everything at once, and put a lot of information in. But we will slowly increase the square footage over time with multiple systems, so a variety of ways to move forward. And that really needs to be worked out either through a proof of concept or through a bit of upfront conversation.
Certainly, number three: there are a lot of existing systems that need to be connected to develop the power of a digital twin. As I mentioned, from our perspective, the distinction between a twin and a representation is that integration. Those systems might have different data sets; they may or may not have the same asset coding; there could be contractual issues that preclude other vendors from providing proprietary information; or they might simply not be able to connect for a variety of reasons, depending on security or the technology that's being used. So going in early and understanding what can be done, and when, is certainly important.
Here's another interesting one, which is part of a conceptual shift from the customer: to treat digital twin implementation as an activity within capital construction. For years, as construction projects are built, we've defined the as-builts and the technology by which information is provided to the customer at the end of the construction handover process. The shift is to establish a subset of digital twin requirements as well, so that the work is delivered in a way that can be brought into a twin, and possibly even stands up the twin, the technology and the implementation, alongside the construction project, so that when it's completed, it's handed to the customer. I think they're realizing there are advantages to blending that into the capital side.
Implement an integration platform. This one ties back to the earlier point about connecting existing legacy systems. In terms of implementing, there's what's known as a point-to-point integration. That's where the digital twin talks to each system separately: it might talk to the information management system with one connection, to the enterprise asset management system with a second, and to the building management system with a third. That's point-to-point. It is good in some ways, but it also means you have multiple different connections that require a certain amount of maintenance.
An integration platform, by contrast, is sort of like a data lake or data warehouse capability, where the customer themselves, or a vendor, has brought in these various bits of information, and then the digital twin platform, and other platforms, can integrate with that single point. So from the vendor perspective, the twin is simply asking the integration platform for information, or sending information back to it, over a single connection. Everything else happens in the integration platform.
It's a simpler architecture, but it does require that the customer, or the vendor supporting the customer, have an integration platform. That's a key component. If there isn't an integration platform available, then you will probably need to do point-to-point. Then, once an integration platform comes online at some future state, you can remove the point-to-point connections and move them into the single integration. But it's a key point.
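The difference between the two architectures can be sketched in a few lines. With the platform, every source system publishes into one place and the twin reads through that single connection; all names here are illustrative:

```python
class IntegrationPlatform:
    """Single connection point: source systems publish into it, and the
    twin reads from it, instead of one connector per system (illustrative)."""
    def __init__(self):
        self._topics = {}

    def publish(self, system, payload):
        """A source system (EAM, BMS, GIS, ...) drops data into its topic."""
        self._topics.setdefault(system, []).append(payload)

    def read(self, system):
        """The twin pulls everything through this one interface."""
        return list(self._topics.get(system, []))

# With point-to-point, the twin would hold one client per source system,
# each with its own authentication and maintenance burden. With the
# platform, every system publishes to one place:
platform = IntegrationPlatform()
platform.publish("eam", {"ticket": "WO-0001", "asset": "AHU-07"})
platform.publish("bms", {"sensor": "supply_temp", "value": 14.8})

# ...and the twin pulls it all through a single connection:
print(platform.read("eam"))
print(platform.read("bms"))
```

Swapping a point-to-point connector for the platform later only changes where the twin reads from, not what it reads, which is what makes the future migration practical.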
Then there are sensors. Sensors are great. However, they do need to be connected to something within their system. For example, if you have parking or stall sensors, occupancy sensors, light sensors, all of those sensors are probably connected to some control point, and then that control point and the system that supports it need to be connected. So, similar to that previous point about an integration platform, the power of IoT sensors is their connectivity. And that is a process that needs to be managed and understood early, especially in a construction project, so that sensors are built, placed, and connected, and the information is then made available to the twin platform.
The last two are, again, super critical: data quality and data governance. The whole capability of analytics is built upon data quality. Is it current? Is it complete? Does it span multiple systems? If you don't have a good understanding of the assets in the field, through asset tagging, for example, and barcoding or QR coding, or if there isn't a lot of metadata built into the models, whether BIM or other, you might have a limitation of what the twin can provide.
Technically you can provide it. But if the data isn't available or is of questionable quality, then, as in any other data analytics project, the information coming out is only as good as the information going in. And that's certainly part of a process that needs to be managed along the way.
Data governance as well. As information is brought in from a variety of vendors and put together, there really needs to be a strategy for how that data is brought in, how it's collected, and how it is governed. There could be a variety of coding methodologies in use, and those really need to be consistent so that the information being viewed and analyzed can be used across the whole portfolio. If everything changes all the time, or is very different from source to source, it's very hard to gather, especially for machine-learning algorithms, for example. Without a sense of consistency, the data mining isn't as valuable or as powerful, unfortunately.
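One concrete form such governance can take is a validation gate: before vendor data enters the shared portfolio dataset, each record is checked against one agreed coding scheme. The field names and asset-class codes below are hypothetical examples, not any particular standard.

```python
# A small, illustrative governance check: validate incoming records
# against one agreed coding scheme before they join the portfolio
# dataset. Field names and codes are hypothetical assumptions.

REQUIRED_FIELDS = {"asset_id", "asset_class", "location_code"}
VALID_ASSET_CLASSES = {"AHU", "PUMP", "CHILLER", "LUMINAIRE"}

def validate_record(record):
    """Return a list of governance violations (empty list = clean)."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if record.get("asset_class") not in VALID_ASSET_CLASSES:
        issues.append(f"unknown asset class: {record.get('asset_class')}")
    return issues

# A record that follows the agreed scheme passes cleanly...
good = {"asset_id": "AHU-01", "asset_class": "AHU", "location_code": "T1-L2"}
# ...while one using a vendor's own coding is flagged before ingestion.
bad = {"asset_id": "X-99", "asset_class": "AirHandler"}
```

Running `validate_record(bad)` flags both the missing location code and the nonstandard asset class, which is exactly the kind of inconsistency that would otherwise degrade portfolio-wide analysis and machine learning later.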
So what's the roadmap? Over the past year, the customer has certainly seen how the platform operates. Moving forward, the focus will be on user experiences, personas, and workflows. So let's now, from the user perspective, take the case studies, the workflows, and the personas that are valuable to the customer and start to marry those up with the needs. The platform is there. It has capabilities. How do we marry the two together?
Expanding the system. We talked earlier a little bit about the different ways to expand the system: take one system at a time and do all the mechanical systems, for example, or take one particular asset, like another runway. That's a little bit of what's happening. There is a plan to keep expanding the system incrementally, so that at the end of the five years, the entire airport is available in the twin. And that's part of the ongoing conversation about the best ways to do that.
Continuing the central utility plant integration, as we were mentioning, so that the core piece of infrastructure that feeds the entire airport is available and that information is connected to the algorithms from NREL. Part of what has been ongoing in terms of data quality and governance has been moving towards bidirectional integration. Right now, a lot of the information is coming into the twin.
It is not going back out at this point. That is the other half of bidirectional integration, and it does require the source systems of record to have a little more ability to understand the information that has been brought together.
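The gap between today's one-way flow and the bidirectional future state can be sketched like this. All names here are hypothetical: the key assumption modeled is that writing back only works once the source system of record is prepared to accept updates from the twin.

```python
# Illustrative sketch of one-way vs. bidirectional integration.
# Class names are hypothetical, not a real twin or EAM API.

class SystemOfRecord:
    def __init__(self, records, accepts_writeback=False):
        self.records = records
        self.accepts_writeback = accepts_writeback

    def write_back(self, asset_id, attrs):
        if not self.accepts_writeback:
            raise PermissionError("source system is read-only to the twin")
        self.records.setdefault(asset_id, {}).update(attrs)

class Twin:
    def __init__(self):
        self.data = {}

    def sync_in(self, source):
        # Current state: information flows into the twin only.
        for asset_id, attrs in source.records.items():
            self.data.setdefault(asset_id, {}).update(attrs)

    def sync_out(self, source, asset_id):
        # Future state: the twin pushes updates back to the record system.
        source.write_back(asset_id, self.data.get(asset_id, {}))

eam = SystemOfRecord({"PUMP-7": {"status": "in service"}})
twin = Twin()
twin.sync_in(eam)
twin.data["PUMP-7"]["status"] = "needs inspection"

try:
    twin.sync_out(eam, "PUMP-7")     # fails until the EAM allows write-back
except PermissionError:
    pass

eam.accepts_writeback = True         # the readiness work described above
twin.sync_out(eam, "PUMP-7")
```

Until the source system opts in, the twin's updated status lives only in the twin; after write-back is enabled, the system of record reflects it too.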
Data analysis and reporting is really the next step. As the data set is built up from this variety of sensor information and other information, the move is into analysis and reporting: how to build business intelligence into the system, as well as the reporting capabilities that allow the user to see the next steps.
Improving the roadmap. How will we get from the current level of maturity of the model and its capabilities to a system that marries up with the first bullet, the experiences and use cases that are most valuable? And then, down here, you see use cases. Again, that's the big part of what the customer is now seeing: the tool is available, so how do we add use cases and capabilities that are very valuable to us?
And the last two, especially the second to last, is an important one: the staffing plan. They have actually brought in, by role, an employee who is the digital twin manager, which I believe is a new role within an organization. You might see this going forward. And then, from this, there will be a small organization within the airport that is managing the digital twin, and they're gathering information about how they'll staff it. It's a little different than just one role because there are multiple systems, so it's quite interesting.
And then the return on investment, the ROI. How does a twin implementation develop return on investment over the years? What specifically are the areas where it can provide the most benefit to the customer? That's part of what they're working on in terms of the value of the twin. As I mentioned, a lot of it is based upon the data capabilities being brought in, and that will be part of the journey going forward.
I'd like to thank you all very much for spending some time to learn about this project. And certainly feel free to reach out to me with questions. I enjoy hearing from you and learning about the different ways others have implemented, or are implementing, digital twin technology.