AU Class

Standardize, Automate, Generate: Transforming Delivery for the Digital Age


Description

Digital transformation in the construction industry is inevitable. At Mott MacDonald, we strive to deliver in a digital way as our default across our 16,000-strong global business. Over the last 18 months, the structural engineering practice has undertaken a top-to-bottom review of our way of working, with the aim of standardizing our processes, automating our common types of design, and using generative-design approaches to deliver efficiencies. This class will explain how we set about this, highlighting the problems we have faced and the challenges we have overcome. We will cover how we have identified our common activities, how we selected our preferred software tools, and what we are doing to embed more-advanced workflows as our common way of working.

Key Learnings

  • Understand the reasons behind digital transformation
  • Learn how to evaluate different software tools for suitability
  • Learn how to identify opportunities for improvement in your own business
  • Learn about the need to manage change as well as technical skills

Speakers

  • Ian Besford
    Ian Besford is an associate structural engineer at The Mott MacDonald Group with over 15 years of experience gained across multiple sectors. This experience included the then-pioneering use of digital modeling and visualization tools from the start of his career. Ian was instrumental in driving adoption of modern processes and technology across his team’s delivery, leading to him taking the role of Building Information Modeling (BIM) champion for structural engineering. These skills and processes have most recently been applied to the Leeds Station Southern Entrance for which Ian is project director responsible for Mott MacDonald's delivery.
  • Matthew Pearce
    Matt is a principal structural engineer and digital design leader at Mott MacDonald, with 12 years of experience in designing complex structures across the globe. He started with Mott MacDonald in 2007 and has since worked in the UK, Hong Kong, Macau, China, Singapore, and now Canada. Matt has worked extensively in the steelwork design field, having worked on the structural designs for Waterloo Station, the London 2012 Olympics Shooting Venue, and Twickenham Stadium (East Stand Extension) in the UK, the Jakarta Velodrome in Indonesia, and Wynn Palace in Macau. He currently leads digital design for building structures at Mott MacDonald. He was co-speaker at Autodesk University in 2018.
Transcript

IAN BESFORD: OK, everyone, they've shut the doors so it's time to kick off. Good afternoon to you all. Thank you for coming along to this. I hope you've all had a fantastic first morning at AU 2018. I guess the idea for this class came out of the number of times we've been here and we've seen a lot of great things that go on. We take that back to work and then sometimes we struggle to embed that back into the business as business as usual. And that's been quite a challenge for us. So we were putting some thought into how we could actually articulate that and how we could try and make sure we get more value out of what we do.

And I think there was a particularly relevant point in the keynote this morning about how technology is about removing repetitive tasks from what we do, meaning less friction, and allowing us to improve productivity. And I think that's a theme which hopefully will come out of what we're going to talk about today.

So in terms of the objectives for the class, this is what we set out. And really it's a story of our journey going through how we understand our business from a structural engineering perspective, how we do things, what the skills of our people are, and how we can make them more efficient and more productive and ultimately happier in their jobs.

So this is me. I'm a structural engineer. I have worked for Mott MacDonald for about 20 years now, and have seen a load of different changes come and go over the time that I've been in the organization. I lead a team of about 60 engineers in the eastern part of the United Kingdom. And then, with another hat on, I've also got a role working at the IStructE on their BIM panel as part of their digital transformation team. I'll let Matt introduce himself.

MATT PEARCE: Hi everyone. So yeah, I'm Matt Pearce, principal structural engineer at Mott's. I started with them in 2007, so 11 years ago. I worked firstly in Croydon in South London. I was based in Hong Kong for three years. And for my sins, a steel factory in China for three months. I worked in Singapore and now back in London.

IAN BESFORD: Does anybody know who this is?

AUDIENCE: Simon Sinek.

IAN BESFORD: There we go. So that's good. So his book, Start With Why, I think, is a really good book. And if you haven't read it, I recommend you read it or at least watch his TED Talk. It's not a plug. I've got no relation to him at all. But the reason I mention it is that his premise behind the book is that if you want to do anything significant in the way of change, you need to be clear about the purpose of what you're doing.

So you need to understand your why in the middle of the picture. And it's only when you understand why you need to do it that then you can look at how you might go about achieving it and be clear on that. And then from the how flows the actions that you take, the what that you actually do.

So this is kind of the process that we've gone through to try and work out what we actually need to do in order to embed the learning and improve the way that we deliver our business. So we're going to take you through this why, how, and what through this talk.

So I guess that is the question, isn't it? It's why are we doing this and why do we need to change? And it is a story of change and transformation. And it's based on the need that we need to change the way that we run our business. We need to change and develop the skills that our people have. We need to develop our processes to align with new ways of thinking. And we need to develop and improve the technology that underpins all of that to allow us to work more efficiently. And if we don't do that, the risk is that other companies will come along and they'll overtake us, and effectively we'll become irrelevant.

So we've been through change before, you know, so-- I've only been around 20 years. But in 20 years, a lot has changed. And I suppose the last significant change is if you go back 10 years or so when we started to move from working in AutoCAD to Revit, we saw a massive increase in productivity, which was great for a relatively short period of time. But then the recession hit in the UK and the bottom dropped out of the market. And fees that were relatively comfortable dropped 20%, 30%. And we found that the only way that we could continue to deliver the same quality of service or the only reason that we could was because we'd invested in transforming the way that we'd worked and moving from AutoCAD to Revit. So the fact that fees have gone down, actually, we were still able to deliver a similar quality product and a similar quality of service.

It's a different set of challenges now. I don't think fee levels in the marketplace have come anywhere near back up to where they were, but we've got another wave of change coming. And this change is driven in a slightly different way. It's been driven by clients wanting more for less. They want better asset data at the end of the day. It's driven by them wanting more complicated structures, and architects having the tools to actually design these structures in the first place means that we then have to reflect that. And so we have a more complicated job to do. We don't get any more money for that, but we have to do that within the fee scope that we've got. So again, we're looking at improving efficiency, not because we have to, in this case, but because we need to in order to be able to deliver the stuff that we have to do.

The other challenge that we've got is our organization. We've got about 16,000 people that we employ across the world. Within that organization, we've got a lot of different disciplines. And this diagram here reflects where the structural engineering teams are located. So we've got a big concentration back in the UK and also out in India, but they exist on, I think, five continents, everywhere other than South America and Antarctica. So when you've got people existing in all those different places because they've been acquired by the business over a number of years, they're still quite siloed in the way that they think. They each have their own different approach to doing things. And that's the challenge that we need to try and break down.

So if we're looking at the reason we're doing this, what our why is, then it's about us being able to deliver a leading edge service applying our technology but applying it consistently, and creating people or developing people that have got the right skills to deliver the right project at the right time.

So how do we tackle this change as a group of structural engineers? So some of the problems that we face are these sort of things. So we have 500 engineers, and when you've got 500 people and give them the tools to do the job, people will pick the tool that they prefer. Sometimes people will use a sledgehammer to crack a nut, which generates inefficiency, or it takes too long to do a process and people overthink it. And the classic example of that, other than some idiot with a pillar drill on a seat like that, is graduates that come in and start trying to use a complicated analysis tool in order to deliver a simple structural framework. They spend time and effort doing something which isn't necessary.

We also find that the deliverables that we produce vary significantly in the time that it takes to produce them. Quality is reasonably consistent, but somebody will produce something a lot quicker than somebody else. And maybe that's tied back to point one, in terms of not actually using the right tool for the job. And then when it comes to tools, we did an audit of software installs and we worked out that we had over 1,500 different tools across a practice of 500 structural engineers delivering reasonably common work. So obviously a lot of tools means a lot of effort to manage them. It's very difficult to keep them patched and updated and it's not an efficient way of working.

I mentioned siloed thinking before. The challenge that we get as you get-- and maybe it's a British thing, maybe it's an engineer thing, I don't know-- but you've got people doing good stuff but not actually realizing that what they do is good stuff and so they don't talk about it. So somebody else goes and reinvents the wheel and does the same piece of work over and over again. So creating a culture where people are willing to share those things and think about things is a challenge.

Hardware was also a real issue for us in terms of standard PC specs. So again, we have a library of machines that have been built up over a number of years, and those machines all have varying standards, processors, and quality. So when you're trying to standardize what you do, having a common hardware platform to work from is important.

And then the last thing on there is around process and checking. And this was a slightly odd thing because it's something which comes naturally to a lot of people, but actually thinking about how we need to check and approve documentation and how we standardize that so we can evidence it if we need to has been a challenge to do consistently.

So I guess summed up, there's sort of four points that we were looking at here. So how do we do more with less, how do we deliver more consistency, how do we work more closely and collaborate more closely across the organization, and ultimately, how do we transform the way that we do things?

So what did we do to start addressing this? In terms of the scale of the problem, I think when you've got any sort of tricky problem you need to fully understand the baseline that you're working from. So we did a reasonable piece of work looking at what software was used at different places, understanding that actually, there is a need to use different software in different places for statutory reasons so we aren't forcing people to do things that go against what they need to do. Different parts of the business work in different areas and different structural materials are common in different places, so we needed to consider that. And also different bits of the business work in different sectors. So whether you're doing a hospital project or a stadium, you've got a very different set of requirements that you need to accommodate. So we aren't aiming to try and standardize all of that. We accept that you can't do that. But what we do want to do is pull out the common tasks and the common activities and the common processes and apply those consistently.

And then once we had an understanding of those elements of what we do, we looked at the models that we actually use to deliver things. And generally that falls into two categories. So there's the idea of specialty work, which is sort of more complicated, higher risk work. It's work where we can't standardize it as much because it is more specialist and unique, generally it's higher value work, and there's fewer people are capable of delivering it. And then at the other end of the page, you've got sort of scale work, which is your commodity work. It's the standard stuff that we do day in and day out, things where repetition is commonplace. You can pull ideas off other projects. We can use standard libraries of components to make things easier. And it's generally purchased on the basis of cost and fee rather than actually quality of service, because it's fairly standard stuff.

So now that we understand our aims and the fact that we've got two different business models, we developed a sort of a three-stage process to think about how we can undertake a transformation in each of those different areas. So the first two, this idea of standardize and automate, sit with the scale work. And really that's about gaining efficiency from adopting standard processes and standard tools to deliver a standard way of working. And once we've got that standard way of working it's much easier, then, to automate it as a process. So we went through that exercise and we'll talk through this in a minute, about doing that, creating these standard processes and then looking at what we can automate and how we can gain efficiency from it and how we can use the power of the computing that we've got available to us to do it faster and easier than before.

Then the third bit of it is this idea about generating new design techniques and new solutions to things. So these more difficult projects require us to think a little bit outside the box and come up with new ways of thinking. Thinking about how we can apply technology differently or how we can generate new technologies to apply to problems to solve them. And then ultimately with that, the aim is how we turn that into business as usual and roll it out across the business. And all of this is sort of framed around this idea of delivering more efficiently and more effectively for our clients.

So we're going to talk through each of these three stages now and give you some examples of some of the things that we've done. And I'll hand you over to Matt, who'll start taking you through that.

MATT PEARCE: Thanks, Ian. So the first, as Ian said, the first stage was to standardize things. So that's not just standardizing tools, but at the beginning we looked at the whole lifecycle of the project. So we started with basically like a helicopter view of the project lifecycle, which can be seen as a series of processes of initiating the project, planning the project, executing it, monitoring and controlling it, and then closing the project and learning lessons from it. So we looked to the management of these steps to see what the common requirements are for successful projects, what makes each stage successful and it follows the best practice for that stage.

So what we did at Mott MacDonald was produce a standard project control system called STEP, which provides a standard workflow for the overall running of a project, allowing the project manager and project director to see what the standard control methods should be at each stage of the project. This includes tasks such as developing a project risk register that's then continually monitored through the lifecycle of the project. It also includes project reviews to be carried out at key milestones in the project, such as bid reviews, technical reviews, and data management reviews.

Drilling down, then, to the project execution stage, we then looked to standardize our overall design programs for a given asset type, such as a stadium or a hospital or railway station. The programs are populated with the standard tasks, and also highlighting the incoming information required to produce deliverables. For example, key stages where we will need an architect's Revit model or external survey information, client approvals.

The programs also helped to show the dependencies of the different engineering disciplines within Mott MacDonald, which then enables us to prioritize the different engineering activities to keep everyone on track.

So if we drill down again, further, the integrated design programs need to be based on standard design workflows. So we looked at what are the most common structural building types, such as concrete frames, steel frames, and composite frames. We then identified what are the common key design activities that need to be carried out at each design stage. This example is for a steel frame design. So at scheme stage, the starting point would be the concept design from the previous stage, which may or may not be a full BIM model at that stage. Depending on the complexity of the frame it might not all be modeled. So then we develop that into a draft schematic model, which would then be analyzed, updated, and finally reviewed, and issued as a stage deliverable.

We developed these workflows initially for the design of the more common frame types, but then expanded it to include other activities such as site inspections, construction stage analysis, connection design and fabrication modeling. We started with the scale workflows, if you like, and then tried to even standardize some more of the specialist services as well.

So now that we have standard processes, we can then identify what we need our software tools to do. So I think this is a real change in our mindset, as we could basically write our own performance specification for the design tools for each given building type. So rather than looking at what's on the market and what the stuff on the market can do, it's more looking at what it can do for me, because we already know what the processes are. So what you can see on the screen is a tool we use to compare and select our structural analysis software. So on the left-hand side we input the requirements, which are split into the analysis elements, analysis types, the design standards that it could support, and its interoperability with other standard BIM tools such as Revit or Tekla or Rhino.
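The requirements-first comparison described here can be sketched as a simple weighted scoring matrix. The criteria, weights, tool names, and scores below are all illustrative placeholders, not Mott MacDonald's actual figures:

```python
# Hypothetical weighted-scoring sketch of a requirements-first software
# comparison. Each criterion gets a weight; each tool is scored 0-5
# against it; the weighted total ranks the candidates.
CRITERIA = {
    "analysis_elements": 3,   # beams, shells, cables, ...
    "analysis_types": 3,      # linear, P-delta, dynamic, ...
    "design_standards": 2,    # Eurocodes, national annexes, ...
    "interoperability": 2,    # links to Revit / Tekla / Rhino
}

def score(tool_scores: dict) -> int:
    """Weighted total for one tool; missing criteria score zero."""
    return sum(CRITERIA[c] * tool_scores.get(c, 0) for c in CRITERIA)

# Invented scores for two hypothetical tools
tools = {
    "ToolA": {"analysis_elements": 5, "analysis_types": 4,
              "design_standards": 3, "interoperability": 5},
    "ToolB": {"analysis_elements": 3, "analysis_types": 3,
              "design_standards": 5, "interoperability": 2},
}

ranked = sorted(tools, key=lambda t: score(tools[t]), reverse=True)
```

The weights encode the practice's priorities, so the same matrix can be re-run per building type by swapping the weight set.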

So this is carried out for a wide range of building types and allowed us to get a picture of which software was appropriate. It also allowed us to build up a profile of how commonly we would need to use certain tools. As Ian was saying, we want to be using the simplest tool for the job, but the right tool. For example, we didn't want to be using something like [INAUDIBLE] for a simple office building.

So this is the selection that we came up with. So the weird brain thing in the middle represents data, the I in BIM, which all these tools need to transfer, modify, and analyze. The orange circle represents our standard scale tools. We have Revit and Dynamo, and Rhino and Grasshopper, for model authoring; Robot and Tekla Structural Designer for structural analysis, Tekla for simpler buildings and Robot for moderate-complexity buildings. We then have ReCap for managing incoming point cloud survey data. The rest of the tools are basically model review tools: Navigator for reviewing Bentley models, Navisworks for coordination and clash detection, Design Review [INAUDIBLE] for reviewing models, and Tekla BIMsight for reviewing fabrication models.

On the outer circle are the optional tools. These are tools more linked with specialty or client-driven requirements. So you have ETABS for high-rise design, Midas for underground structures and complex roof structures, and the Bentley tools, AECOsim and GenerativeComponents. That's typically where the client is specifying it, which is quite common in the UK, especially in the rail sector. And then Tekla Structures for fabrication modeling, because at the moment that has the market share in the UK.

Then we finally looked at standardizing our digital deliverables. So this included developing authoring standards in Revit by developing a Mott MacDonald template. We also developed a digital component catalog to host standard components, such as Revit families for annotation, structural framing, and drawing frames. So that's where we got to with standardization.

So now that we've identified the processes and the tools, it's now time to look at what processes can be automated. So for this we have developed a strategy for implementing innovation. It's basically-- what's that? Oh, there you go.

At Mott MacDonald, we've developed a strategy for implementing innovation, which design automation falls into. So the key stages of the implementation are: identify-- looking at the design process maps and identifying areas that can be automated-- and assess-- having identified the automation idea, we need to establish how much it would cost to implement, how much time and cost it will save, whether it can be scaled up around the business, whether it can be sold as an external tool to other companies, and who or what we need to deliver it.

So if the idea passes this assessment the next stage is to incubate the idea. So now that's developing a pilot implementation of the idea, developing the automation to a level where it can be tested on a selected number of projects, and then again, assessing whether the tool meets the expectations and is it a viable tool that can benefit the wider business. So again, if it passes this stage, we will develop the tool into its final form and then publish the tool to the wider business to allow installation and use.

So just looking at the first step of identifying the automation idea, we created a digital design idea portal on our company intranet system, SharePoint, which enabled everyone in the company to register an idea. This was great as a first step, but we came to realize that there were so many ideas we needed a formal system to manage and track them. So we are now developing a tool that links into Yammer, which is our company-wide platform for communication of ideas, problems, queries, anything that is not confidential to a project. So using the hashtag #idea, people can simply post an idea onto Yammer and the tool will start tracking the idea, linking up the relevant parties.

For example, if it's an idea related to structures, it will notify our structures practice leaders. Once the idea's got enough backing, likes or positive comments on Yammer, the idea poster will be requested to submit a mini business case for the idea, which would then be reviewed by the relevant practice and knowledge and innovation managers.
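The hashtag capture described above can be sketched as a simple scanner over posts. The platform API is faked here, and the field names and matching rule are illustrative assumptions, not the real integration:

```python
# Toy sketch of an "#idea" capture: scan a batch of posts for the
# hashtag and register each hit in a tracking log.
import re

IDEA_LOG = []  # stand-in for the idea-tracking store

def capture_ideas(posts) -> int:
    """Register every post containing '#idea' and return how many were found."""
    found = 0
    for post in posts:
        if re.search(r"#idea\b", post["text"], re.IGNORECASE):
            IDEA_LOG.append({"author": post["author"], "text": post["text"]})
            found += 1
    return found
```

From the log, a follow-up step could notify the relevant practice leaders and watch the post's reactions before requesting a mini business case.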

So should this idea be successful, we would take this to the next stage of incubation. For this we have developed a tool called ScriptHub, which allows anyone to upload work in progress scripts for Dynamo, Grasshopper, VBA, Python, C#, pretty much everything. These ideas will get reviewed by code chiefs who will help develop the script into a working proof of concept for testing.

Here's an example of Dynamo scripts for building structures, which can be downloaded by anyone in the practice. We have built into these scripts a tracking tool that allows us to monitor how many times the scripts are downloaded, how many times they are run, the location of the user that has downloaded it. This allows us to see how popular the tools are and monitor how much time is being saved using the tools.
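The usage tracking described above can be sketched as a small event log: each script posts an event (name, event type, user location, timestamp) to a central store, and counts are aggregated from it. The event fields and the in-memory list standing in for the central database are illustrative assumptions:

```python
# Minimal sketch of script-usage telemetry: record download/run events
# and aggregate run counts per script.
from collections import Counter
from datetime import datetime, timezone

EVENTS = []  # stand-in for the central tracking database

def track(script: str, event: str, location: str) -> None:
    """Record one usage event for a script."""
    EVENTS.append({
        "script": script,
        "event": event,            # "download" or "run"
        "location": location,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def run_counts() -> Counter:
    """How many times each script has been run."""
    return Counter(e["script"] for e in EVENTS if e["event"] == "run")

# Hypothetical events from two offices
track("CopyLevels", "download", "London")
track("CopyLevels", "run", "London")
track("CopyLevels", "run", "Singapore")
```

In a Dynamo context the `track` call would live inside a custom node, so every run of the script reports home automatically.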

So finally, if the automation tool proves popular, we develop it into a full application or plug-in. This example is our Mott MacDonald Revit toolbar, which has been developed to provide automation and efficiency tools for Revit based on user demand. Some of these examples can be seen here. So there's a tool to import and export Excel data, because all engineers love Excel, and a tool to embed suitability codes into the Revit elements, to enable people using the model to easily see the status of the model.

So then that's the sort of everyday stuff. So what about the more interesting generative design? So this is really how we can use technology to come up with new solutions and really push the boundaries of design. So this is an example that we developed: a tool to generate and optimize the design of piles. So it links together borehole data, soil topology, and structural loads from the Robot analysis model to carry out the design and then generate the piles into the Revit model. So this allows us to very quickly design and optimize the piles. We developed this on the back of a power station project in Singapore, where we had an incredibly tight deadline to deliver the structural quantities.
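As a rough illustration of the kind of sizing loop such a tool automates, the sketch below picks the shortest standard pile whose capacity meets each column load. The diameter, soil parameters, capacity model, and length range are invented placeholders, far simpler than the real tool's use of borehole data:

```python
# Crude pile-sizing sketch: capacity = shaft friction over the pile
# surface + end bearing on the base, both with placeholder soil values.
import math
from typing import Optional

DIAMETER = 0.6            # m, assumed pile diameter
SHAFT_FRICTION = 50.0     # kPa, averaged shaft resistance (placeholder)
END_BEARING = 1500.0      # kPa, base resistance (placeholder)
LENGTHS = [6, 9, 12, 15, 18, 21]  # m, assumed standard pile lengths

def capacity_kn(length: float) -> float:
    """Ultimate capacity of one pile of the given length, in kN."""
    shaft = SHAFT_FRICTION * math.pi * DIAMETER * length
    base = END_BEARING * math.pi * DIAMETER ** 2 / 4
    return shaft + base

def size_pile(load_kn: float) -> Optional[int]:
    """Shortest standard length carrying the load, or None if none does."""
    return next((L for L in LENGTHS if capacity_kn(L) >= load_kn), None)
```

The real tool wraps a loop like this around every pile position, pulling loads from the analysis model and writing the chosen piles back into Revit.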

Another example is a generative design tool for steel trusses. So this workflow shows the steps, the stages in this process. Firstly, the roof profile is taken from a surface in Revit. We then create the boundary conditions in Grasshopper, using Karamba to optimize the truss topology. And then this is pushed back to Revit. So I'm hoping this will work. So this shows the surface being cut and extracted into Dynamo. So this is one option to take it from Dynamo into Robot for analysis.

And then this is an example of using Karamba and Grasshopper to generate the optimum topology for a truss. If you stare at it too long you go mad.

Another example of generative design that we've been working on is the optimization of geometry to minimize steel weight. So these examples from left to right are a stadium project in China; Leeds Station, which was presented here a couple of years ago; and an airport project in Asia. So here we're using Grasshopper linked directly to Robot, which allows us to quickly assess a large number of options and come up with the optimal solution in terms of steel tonnage, number of elements, and the complexity of connections. I'll hand it back to Ian.

IAN BESFORD: Cheers, Matt. And there's other ideas which come out of other areas that we work in as a business. So we have a large coastal engineering team, and there's things that come out of that which might have potential to be applicable to the structural engineering field at some point in the future. So coastal engineers do a lot of numerical modeling and it's a very time-consuming exercise. The models have a very long run time. And so they came up with a way of using neural networks to simulate the output of a numerical model. The benefit of that is rather than taking 16 days to run a preliminary numerical model, they can get a similar result in four and a half seconds, which sounds like a really good increase, but it's not without its restrictions as to what it can do.

So the challenge with neural networks is that their output is only as good as the input you train them with. So in this particular case, the network was trained with the same input data as the numerical model and, therefore, as it's iterated through, the output is very similar. If you present it with a slightly different problem it will give you a reasonable result. If you present it with a very different problem it will give you a result which is complete rubbish. But the principle behind using different tools and different technologies like this to generate different design results, I think, is something we're keen to investigate, to see how we can apply it further outside of the sphere that this was developed for.
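The surrogate-model idea above, a cheap emulator fitted to a slow model's outputs that is only trustworthy inside its training domain, can be illustrated with a toy stand-in. Here a piecewise-linear interpolator plays the role of the neural network, and `slow_model` is a placeholder for the 16-day simulation; everything is invented for illustration:

```python
# Toy surrogate: fit a cheap emulator to sampled outputs of a slow model.
# Inside the training range it is accurate; extrapolation is unreliable,
# which is exactly the limitation described for the neural network.
from bisect import bisect_left

def slow_model(x: float) -> float:
    # stand-in for the expensive numerical simulation
    return x ** 2

# "Training": sample the slow model over its intended input domain
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [slow_model(x) for x in xs]

def surrogate(x: float) -> float:
    """Piecewise-linear emulator; only trustworthy for x in [0, 4]."""
    i = min(max(bisect_left(xs, x), 1), len(xs) - 1)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```

Inside the training range, `surrogate(2.5)` is close to the true value; far outside it, say at `x = 10`, the emulator's linear extrapolation diverges badly from the true answer, mirroring the "complete rubbish" failure mode.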

The other bit that we were looking at as well is how we can use augmented and virtual reality technologies to actually improve our workflows and our processes from a building engineering point of view. And all the great examples and visualizations are fantastic, but from a design point of view, they don't actually help you progress the design. So we have an in-house team that has built our own immersive experience, with overlays, so that we can actually do 3D design mock-ups and redlining and model reviews in a virtual environment. And we can take that information back to the desktop and then start to review those models in the same way you would do analysis work, which is useful in some situations, because it's only when you actually start to look at something that you can see things.

We also have other modules built in there as well, which are useful from a builder's point of view. So we have a fire engineering module which looks at fire escape and smoke modeling, and so we can sit down with the statutory authorities and review with them the solution that they're going to get so at the end of the day, there's no surprises.

And then the other example on there as well is how we can use the same sort of thing to engage with clients to do O&M reviews so that they actually understand the maintenance implications of what they're getting. Ultimately the output that we're looking for is that we get more effective and more efficient data management handover and we get safer buildings to operate.

So I guess all of this is fine, but having been through the why, the how, and the what, the question is: so what? What's the output? What impact has it actually had on the business at the end of the day? And it's important that we can measure that impact, because we spend money doing this type of stuff. And if we don't know whether or not it's been successful or had any savings, then we can't justify investing in further things. So we try and put as many metrics behind this as we can. Engineers like metrics. If we can measure it, then that's good. And we try and measure what's important to us. So as a business from this, ultimately, it's about saving money and doing things more efficiently.

So a good example is the Dynamo scripts, which Matt mentioned before. So each of those Dynamo scripts has a custom node in it which reports back to a central database how many times it's been used. So if we can estimate the saving that node generates or that script generates when it's in use, then we can very easily produce metrics which demonstrate how much value it's added to the business. So then we can dashboard it up and we can get figures like we saved this amount of money using this particular thing.

The other thing that's come out of this is it's given us an ability to reflect on what we've done and decide whether or not we focused in the right area. So we started off by looking at sort of big ideas, which had a big impact. But actually, the number of times they were used wasn't great. So we generated a good saving on it. But when we went back and looked at some of the smaller ideas we'd come up with and implemented, we discovered that actually a small idea that's quick to implement and used a lot of times has a much bigger impact in terms of bottom-line benefits.
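The comparison described above comes down to simple arithmetic: total saving is time saved per use multiplied by number of uses (and a charge-out rate). The rate and all the figures below are invented for illustration, not real Mott MacDonald data:

```python
# Back-of-envelope saving comparison: a small, frequently-run idea can
# beat a big, rarely-used one on total saving.
HOURLY_RATE = 80.0  # GBP, assumed charge-out rate (placeholder)

ideas = {
    "big_rare":    {"hours_saved_per_use": 40.0, "uses": 12},    # big idea
    "small_often": {"hours_saved_per_use": 0.25, "uses": 4000},  # small script
}

def annual_saving(idea: dict) -> float:
    """Total saving in GBP: hours per use x uses x hourly rate."""
    return idea["hours_saved_per_use"] * idea["uses"] * HOURLY_RATE
```

With these invented numbers the big idea saves 40 x 12 x 80 = 38,400, while the small script saves 0.25 x 4000 x 80 = 80,000, which is the pattern the usage metrics revealed.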

So the other bit of the impact is-- obviously there's financial, but then what about the impact on the people we're working with? Because we are a people-based organization. We don't make widgets and sell widgets. We market the skills and experience of our people. And I guess what we're trying to do is move on from-- well, it's a while since we moved from the top area, where we had the current workforce with a sort of digital layer wrapped around them of people who had the knowledge and the experience. We've moved that down now to where we've got that digital talent spread better across the business, but it's still in pockets. It's still in individual people, and you've got a lot of people who don't have the same level of knowledge and skills to do the role that we need them to do.

And so where we're aiming for is down at the bottom, where everybody in the business has the right level of skills, knowledge, and experience in the digital sphere to do the role we need them to do. And I guess when you get to that point, the whole digital transformation thing is complete and it is just business as usual. And it's really important that we get to that stage as quickly as we can, because I think in this marketplace change happens so quickly that it's not going to be a case of a big company coming along and gobbling up a small company anymore. It'll be the faster, more agile, more nimble companies which will out-maneuver the slower, old-fashioned companies.

And I mentioned at the beginning that it is about people. And as engineers, we naturally want to innovate and we naturally want to try and optimize what we do and be as efficient as we can. And I guess all of this is about giving them the tools to do that so that they can focus their time and their effort, which is expensive time and effort, on the more interesting stuff, on the fancier stuff. So that they don't have to waste their time or spend time which is less productive on simple tasks which we can get automated. By doing that, we hope that we keep people engaged and we keep people challenged. And if people are engaged and challenged and happy at what they do, they deliver a great service and they deliver a great product. And at the end of the day, if you've got good people delivering a great product, that gives us happy clients. And that's what we're looking for, at the end of the day, because happy clients give us repeat business, and then the circle is complete.

I think it's fair to say we've shown you some of what we've done. We're not anywhere near the end of the journey. We're well on the way and we know where we're heading, but maybe in two or three years' time we might have reached the end of it.

So that's pretty much the end of what we wanted to talk about. It would be nice to just give you a reminder to fill in the class survey in the app-- feedback is appreciated. Also, a reminder about the socials and the meetups that are ongoing throughout the rest of the day, which you can find out about in the app. And if you want to come and talk to us and find out more, or you've got any questions you want to ask us individually, that's fine. Or if you've got any questions, we're happy to take them now.

[APPLAUSE]

AUDIENCE: When you standardize software and push it out across [INAUDIBLE] device, is it felt like [INAUDIBLE] reviews, is the pushback aggressive, or not much? [INAUDIBLE]

IAN BESFORD: Yeah, different people respond in different ways, I think. There's some people-- and it's typically the same people all the time-- who are more embracing of change and don't mind changing what they do. I think it's important that we don't just push it out and don't tell them we're going to do it. You've got to engage with them and make sure they've got the training and the support that they need to be able to transfer. We're just about to the point now where we want to push a standard set of software out to every structural engineer so they'll all have the same stuff. It'll be interesting to see what the pushback from that is.

AUDIENCE: Do you have a feel? Is it 10% people pushback and 10% love it? [INAUDIBLE]

MATT PEARCE: I'd say certainly most of the engineers and graduate engineers are happy to get the tools and the training. It's an opportunity for them to use the best tools and get trained on them. So I think there's pushback from maybe some of the senior staff, but again, it's more of a-- there's some internal marketing we have to do to show that this does help you with your work. There's a bit of a learning curve at the beginning, but--

IAN BESFORD: And I think you reach the point as well where-- it may sound blunt, but if we've done the evaluation and decided that's the most efficient way to run the business, then some people either need to get on board with it, or maybe they're not the right match.

AUDIENCE: So do you find people saying, though, that-- you've standardized the software I'm allowed to use. Do you find people say, hey, you're actually trying to stifle me. You're saying I can only use these tools. I've actually heard a little bit of this myself, so I'm struggling [INAUDIBLE] a part of that conversation.

IAN BESFORD: So there's always a case for having a standard and having a process. And I guess the whole business is built around having standards and processes. Ultimately, on any particular project, you can choose to do something differently as long as you've got a justification for why you're doing it. So if there's a particular need for somebody to do something really different and they can demonstrate why they should be doing it, then that is absolutely fine. But what they wouldn't get with that is the support that comes with having a managed set of software, a managed set of packages, and the training pathway that aligns with it.

AUDIENCE: I think training is a big component of that because I think a lot of people would just push back because they're just unsure.

IAN BESFORD: Yeah.

AUDIENCE: What does this really mean for me?

IAN BESFORD: Yeah, people are nervous of change, and part of this process is going through and talking them through and getting their buy-in to why we're doing it. And ultimately the reason we do it is to make the business more efficient, but also to give them the opportunity to develop their skills. Yeah.

AUDIENCE: How do [INAUDIBLE] relevant?

IAN BESFORD: Of the enhancement?

AUDIENCE: You said 1.7 within BOM.

IAN BESFORD: Yeah

AUDIENCE: --savings that--

IAN BESFORD: Could have put 1.8 up there.

MATT PEARCE: Yeah, give or take.

AUDIENCE: How do you come to-- OK, 1.7. [INAUDIBLE]

IAN BESFORD: Yeah, so--

AUDIENCE: How do you quantify it into a financially constant [INAUDIBLE]?

IAN BESFORD: So it all-- I guess it all comes from time savings. So--

AUDIENCE: Time saving?

IAN BESFORD: Time, time is money, isn't it? Yeah. So if we have a script, and we test that script and we think it saves five minutes, and we know that script's been run 1,000 times, then it just works on the basis of that. And it is a little bit [INAUDIBLE], I admit that. But it's better than not having anything to justify it.
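The arithmetic Ian sketches, five minutes saved per run times 1,000 runs, works out as below. The charge-out rate is an assumed figure purely for illustration; the talk only gives the time saving and the run count.

```python
# Worked example of the estimate: time saved per run x number of runs,
# converted to money with an assumed (illustrative) charge-out rate.
minutes_per_run = 5       # tested estimate of saving per script run
runs = 1_000              # count reported back by the tracking node
hourly_rate = 60.0        # assumed rate, currency units per hour

hours_saved = minutes_per_run * runs / 60   # roughly 83.3 hours
saving = hours_saved * hourly_rate          # roughly 5,000 currency units
```

The point is less the exact number than having a defensible basis for it, as the discussion above notes.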

AUDIENCE: Yeah. [INAUDIBLE] management-- convincing the management of that number [INAUDIBLE].

IAN BESFORD: Yeah.

AUDIENCE: And how you arrived at the number because it's a little bit of a tricky situation.

IAN BESFORD: It is important, because I think generally people are quite good at doing something and writing a plan, and then monitoring how well we complete the plan and going, excellent, we completed the plan, that's brilliant. But actually, what impact has the plan had on the business? What changes has it made? So we've gone through this process here, but we needed to have some way of measuring that. So that's the way of measuring it.

AUDIENCE: As you [INAUDIBLE] end of this initiative, you're working on it now or is it [INAUDIBLE] forever, [INAUDIBLE]?

IAN BESFORD: I'll let Matt answer that one.

MATT PEARCE: All right. The standard software rollout's happening this month, so that's going to be 120 computers to start with. And then next month it's the rest of building structures, so it's the built environment-- that's about 1,000 computers next month. And then actually, we're installing the new software, and then there's going to be a process of removing the old software. We're giving them a three-month grace period to get upgraded and get on board, and then--

IAN BESFORD: Imagine the emails complaining about [INAUDIBLE].

MATT PEARCE: That's when we're going to get the bite back. But in all seriousness, if it's on a live project and they're using a tool and they need it to finish a job, we're not going to pull it out from under them. So we need an exceptions list. But it is in process. And I'd say we started this process last year, didn't we? And it's taken a year to get to here.

IAN BESFORD: I think the thing is that I don't think we'll ever fully finish it. We're doing it, we'll try it, and if something doesn't work, then we need to take a step back, review why it doesn't work, and look at doing something different. It's not drawing a line and just saying, yeah, we've done it now, we're off to the pub. Which would be nice. But I think we need to keep at it. And things like the standard software list as well, because the tools change all the time. So what is the right tool for us to do the job now might not be the right tool in 12 or 18 months' time.

So part of this is also engaging with our suppliers now that we have a preferred toolset. We have those discussions and say this is the preferred tool, but we need to understand where it's going from a development point of view so we can plan how it's going to change.

AUDIENCE: [INAUDIBLE]

IAN BESFORD: We'll see. The proof is in the pudding, isn't it?

AUDIENCE: And you talked about measuring the uptake in [INAUDIBLE] with a custom node. How do you measure the uptake of all of the other stuff?

MATT PEARCE: So the Revit toolbar's also got a script in it that tracks the number of clicks and the number of times the buttons have been used. That feeds back to an Azure database, which is then read through Power BI, so we can see the uptake of the automation. We've got a digital design team that does the automation and programming for us, and all of their tools have these tracking hooks built into them.
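A minimal version of that click-tracking hook might look like the following. The event shape and the in-memory queue are assumptions standing in for the real upload to the Azure database that Power BI reads; the talk only confirms that clicks are counted and fed back centrally.

```python
# Hypothetical sketch of a toolbar click-tracking event: build a small
# JSON payload per click and queue it for upload to a central store.
# The event fields and queue are illustrative assumptions only.
import json
from datetime import datetime, timezone

EVENT_QUEUE = []  # stand-in for an upload queue / HTTP client

def track_click(tool_name: str, user_id: str) -> dict:
    """Record one button click for later aggregation in a BI dashboard."""
    event = {
        "tool": tool_name,
        "user": user_id,
        "ts": datetime.now(timezone.utc).isoformat(),
    }
    EVENT_QUEUE.append(json.dumps(event))
    return event
```

Aggregating these events by tool name in the BI layer then gives the per-button usage counts described above.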

AUDIENCE: So they can process the [INAUDIBLE] script [INAUDIBLE].

MATT PEARCE: Yeah, that's harder to measure, I would say. There are the tools, but there's also the process of linking the tools, and there's the softer stuff that isn't clicking buttons. It's different ways of communicating.

IAN BESFORD: And I think it relies on feedback as well. We get feedback from people on whether this is useful or isn't useful. And then we have to listen to that and review whether or not we need to do something about it. Yeah. I'd rather get something out there, get them to try it and see whether they think it's efficient and saves them time, and then change it if we need to.

AUDIENCE: Going back to the pushback thing, you're measuring [INAUDIBLE]. I think that's still the number of [INAUDIBLE] you might come to a point [INAUDIBLE] where you might not want to use the other stuff because the ones-- [INAUDIBLE] 100, probably 80 of them will be doing the same job of 100 because all of the [INAUDIBLE] people will need to be [INAUDIBLE].

IAN BESFORD: It's an interesting point, isn't it? I guess we look for the efficiency savings so that we can deliver more work with the same team of people that we've got, and we can target ourselves to do bigger projects and more interesting projects or different types of work. The counterpoint to that is that if the work isn't there and we can deliver it more efficiently, the logical conclusion would be that you'd have to find other roles or work to do, or retrain them. I guess the benefit we have as an organization is that we deliver work in so many different areas that people are quite flexible, and it wouldn't be a case of booting them out the door. It would be a case of finding something that suited their talents and aspirations. Maybe. We'll see.

OK, if there's no other questions, that's great. Thanks very much for coming, and enjoy the rest of AU.

[APPLAUSE]

AgrantSEM
We use AgrantSEM to deploy digital advertising on sites supported by AgrantSEM. Ads are based on both AgrantSEM data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that AgrantSEM has collected from you. We use the data that we provide to AgrantSEM to better customize your digital advertising experience and present you with more relevant ads. AgrantSEM Privacy Policy
Bidtellect
We use Bidtellect to deploy digital advertising on sites supported by Bidtellect. Ads are based on both Bidtellect data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bidtellect has collected from you. We use the data that we provide to Bidtellect to better customize your digital advertising experience and present you with more relevant ads. Bidtellect Privacy Policy
Bing
We use Bing to deploy digital advertising on sites supported by Bing. Ads are based on both Bing data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bing has collected from you. We use the data that we provide to Bing to better customize your digital advertising experience and present you with more relevant ads. Bing Privacy Policy
G2Crowd
We use G2Crowd to deploy digital advertising on sites supported by G2Crowd. Ads are based on both G2Crowd data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that G2Crowd has collected from you. We use the data that we provide to G2Crowd to better customize your digital advertising experience and present you with more relevant ads. G2Crowd Privacy Policy
NMPI Display
We use NMPI Display to deploy digital advertising on sites supported by NMPI Display. Ads are based on both NMPI Display data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that NMPI Display has collected from you. We use the data that we provide to NMPI Display to better customize your digital advertising experience and present you with more relevant ads. NMPI Display Privacy Policy
VK
We use VK to deploy digital advertising on sites supported by VK. Ads are based on both VK data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that VK has collected from you. We use the data that we provide to VK to better customize your digital advertising experience and present you with more relevant ads. VK Privacy Policy
Adobe Target
We use Adobe Target to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Adobe Target Privacy Policy
Google Analytics (Advertising)
We use Google Analytics (Advertising) to deploy digital advertising on sites supported by Google Analytics (Advertising). Ads are based on both Google Analytics (Advertising) data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Google Analytics (Advertising) has collected from you. We use the data that we provide to Google Analytics (Advertising) to better customize your digital advertising experience and present you with more relevant ads. Google Analytics (Advertising) Privacy Policy
Trendkite
We use Trendkite to deploy digital advertising on sites supported by Trendkite. Ads are based on both Trendkite data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Trendkite has collected from you. We use the data that we provide to Trendkite to better customize your digital advertising experience and present you with more relevant ads. Trendkite Privacy Policy
Hotjar
We use Hotjar to deploy digital advertising on sites supported by Hotjar. Ads are based on both Hotjar data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Hotjar has collected from you. We use the data that we provide to Hotjar to better customize your digital advertising experience and present you with more relevant ads. Hotjar Privacy Policy
6 Sense
We use 6 Sense to deploy digital advertising on sites supported by 6 Sense. Ads are based on both 6 Sense data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that 6 Sense has collected from you. We use the data that we provide to 6 Sense to better customize your digital advertising experience and present you with more relevant ads. 6 Sense Privacy Policy
Terminus
We use Terminus to deploy digital advertising on sites supported by Terminus. Ads are based on both Terminus data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Terminus has collected from you. We use the data that we provide to Terminus to better customize your digital advertising experience and present you with more relevant ads. Terminus Privacy Policy
StackAdapt
We use StackAdapt to deploy digital advertising on sites supported by StackAdapt. Ads are based on both StackAdapt data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that StackAdapt has collected from you. We use the data that we provide to StackAdapt to better customize your digital advertising experience and present you with more relevant ads. StackAdapt Privacy Policy
The Trade Desk
We use The Trade Desk to deploy digital advertising on sites supported by The Trade Desk. Ads are based on both The Trade Desk data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that The Trade Desk has collected from you. We use the data that we provide to The Trade Desk to better customize your digital advertising experience and present you with more relevant ads. The Trade Desk Privacy Policy
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

Are you sure you want a less customized experience?

We can access your data only if you select "yes" for the categories on the previous screen. This lets us tailor our marketing so that it's more relevant for you. You can change your settings at any time by visiting our privacy statement

Your experience. Your choice.

We care about your privacy. The data we collect helps us understand how you use our products, what information you might be interested in, and what we can improve to make your engagement with Autodesk more rewarding.

May we collect and use your data to tailor your experience?

Explore the benefits of a customized experience by managing your privacy settings for this site or visit our Privacy Statement to learn more about your options.