AU Class

BIMbeats: Real-Time Data Analysis for Revit, Dynamo, BIM 360, AutoCAD, and More


Description

Many organizations are undergoing digital transformation to prepare for a data-driven future. One measure of the success of your strategy is a data analysis of how your organization uses BIM (Building Information Modeling) tools. Using BIMbeats, you can achieve real-time data capture. BIMbeats uses the Elastic Stack to transform data into insights, ingesting and storing data from multiple sources to provide real-time dashboards in tools including Microsoft Power BI, Tableau, and Kibana. By capturing user activity and processing log/journal files in a digestible format, BIMbeats can identify the super users who can automate mundane tasks, as well as those who may need a little help breaking old habits. It can also check whether your projects are meeting model- and information-based deliverables and support ISO 19650 auditing. Triggers can also be set to take immediate action when company standards or best practices are not being followed.

Key Learnings

  • Learn how to proactively develop learning and development plans through your own company's data insights
  • Discover whether your company and project-based modeling standards are being followed
  • Learn how to measure the time your teams take to complete tasks in order to better resource and cost future projects
  • Check how your company’s digital transformation strategy is tracking through real-time dashboards

Speakers

  • Matt Wash
    Matt has over 25 years' experience in the AEC industry. Combining his skills as an engineer with his experience as a technician, Matt is keen to maximise the benefits of the BIM process and implement Lean Construction principles. He completed his postgraduate certificate in Building Information Modelling and Integrated Design with the University of Salford in 2013. Matt is a regular speaker at Autodesk University and the BILT conferences; he won the top-rated speaker award at BILT Asia 2017 and, most recently, a top-rated class award at AU 2021.
  • Adam Sheather
    Adam Sheather is an associate director at AECOM, focused on developing the systems and people skills needed to deliver Building Information Modeling (BIM) and digital engineering (DE) project deliverables to clients across AECOM’s market sectors. This involves supporting and upskilling internal teams to deliver workflows and decision making that ensure best value for AECOM’s clients, freeing up the talented team to work on design rather than focus on output. He works as part of the BIM Advisory Group, providing technical and strategic advice on contracts, BIM execution plans, project deliverables, support, management, and system tools relating to AECOM’s BIM management projects and BIM-to-FM integration solutions. Sheather’s other role is to manage the scope and product development of new applications, working with AECOM’s Technology as a Service and Geographic Information Systems (GIS) teams. This role identifies new product offerings for internal and external customers, along with the scope, budget, development, coding, and testing support needed to see these products integrated into the business lines.
Transcript

MATT WASH: Hi, this is Matt Wash for Autodesk University 2021 presenting on BIMbeats, real-time data analysis for Revit, Dynamo, BIM360, AutoCAD, and more. So a little bit about me, I've been in the AEC industry for 25 years now. I started out as a structural technician and moved to a structural engineer position with Arup.

I guess I was pretty upset about the way that the engineering and technician world worked and that there was a lot of inefficiency in moving information from the engineers to technicians, which is why I made the transition from a technician to an engineer so that I could do both the analysis and the documentation and remove a lot of those inefficiencies that were present.

So after doing that for around 20 years, I wanted to extend it beyond just structures because I could see there was a lot of inefficiency between engineers and architects. So I joined BVN Architects as a design technology specialist focusing on removing a lot of those inefficiencies using the tools, obviously, Autodesk, Revit, and then Dynamo, and then moving across a whole suite of other tools.

And then in the last seven months, I've moved from BVN to Autonomation who are part of the Bad Monkeys group who most people are aware of. So we do a lot of work on obviously automating processes across the whole AEC industry. And I guess probably the most important thing is outside of work, I love a good craft beer, and that's anywhere probably over the last 25 years. I really miss not being in Vegas doing this presentation having been there for the last two or three years because you guys have got some really good beers out there. So hopefully, one day, I'll be back to present at AU in Vegas.

So a quick class summary, what we're going to go through today, we're looking at organizations, and most organizations are undergoing digital transformation strategies right now. So we want to look at how we can measure the success of that strategy through capturing some of the data that we're getting within our BIM models. So this talk is going to talk around the challenges and the opportunities of capturing company-wide BIM metrics and how we can develop actionable insights for reducing downtime and increasing productivity.

So some learning objectives, we're going to look at how we can develop some training plans through the company data insights, discover if modeling standards are being followed, measure the time that it takes teams to complete tasks to plan for better future projects, and to look at how your digital strategy is tracking on real-time dashboards, and again, those tips for improving productivity and reducing downtime.

So let's go back to 2017-- so this is when I first joined BVN Architects-- and have a look at how we were capturing data on model health within Revit. So a colleague of mine, Andrew Maher, had generated these awesome dashboards that were capturing the major metrics across our projects, but it was pretty static in the way that we were capturing it. And it wasn't that easy to scale this across every single project and capture the data with the frequency that we wanted to capture it. But it was a really good starting point.

So the proof of concept was that we would take the Revit model. We would use Dynamo to extract some of that information into Excel, and then we would use Power BI to read Excel to produce those dashboards that we just saw. So this was a really good MVP. It proved that we could extract this information, and it was really useful information. But it did require quite a significant human effort to do that, and it wasn't possible to capture some of the user activity that we were really keen to understand. We wanted to understand how long it was taking for files to open and what users were doing in the models and do that in real time.

So what we did next was we had an internal software developer, Dan Rummery, and Dan developed this tool called the Revit Batch Processor, which is actually open source and available on GitHub. And what this was able to do was able to process those Dynamo scripts across all of our files and do this overnight so that it wasn't impacting that human interaction with capturing that data and feeding it into Power BI. So the Revit Batch Processor essentially can be used to do anything you do on a single file with the Revit API or a script. You can now do that on many in an automated way. So that was really useful.

But some of the challenges that we had that couldn't be solved with that existing solution was that it was still pretty static. It wasn't in real time. It did require Dynamo and some human intervention, and the scalability issue was still a problem. We wanted to capture this on every single project, and we didn't want to start getting heaps and heaps of Excel files on our network.

And getting the trending metrics about how a file was performing over time and how frequently we'd do that, it was very difficult to ascertain whether we'd do it on a daily basis, a weekly basis, or fortnightly basis. Some projects wanted it more frequently than others, and it just got quite hard to be able to do that.

And one of the other things that we wanted to learn was that in some of the data that we were capturing, we wanted to know when those things were happening at the time it was happening. So for an example, if we wanted to find out when a CAD file was being imported into a Revit model, we weren't able to do that with the solution that we had.

And we had limited metrics on the interactions that the users were having with the model, and we couldn't capture those time-based metrics. And when we were looking at standard schema, the schema that we'd come up with how we were capturing those fields was kind of our own schema, and it wasn't able to be transferred across multiple products if we wanted to do that.

So when we first started looking into this, it was something that the BIM managers were really keen on understanding. They wanted to understand from a training point of view how they could target training better, who were the power users, who were the people that needed a little bit more help, how people were using the tools, how our plugins were being used, and all sorts of things. But it started off very much focused on BIM managers.

But then when we started looking at tools like Dynamo, we wanted to understand what was the investment of time with software developers. Were we going to hire some people to build these tools to automate a lot of the manual processes? And what was the return on investment for that? If we built a whole series of scripts, and they weren't being used, then was it because the script wasn't doing what it needed to do? Was it because we didn't have the right training in place? Or the opposite, were those scripts being used far more often than we ever imagined and was reducing all of that manual time so we could invest more in the software developers?

And from a user point of view, when it came to doing an appraisal each year and understanding where their competency was at and where they needed to upskill, we weren't able at this stage to be able to take that information and understand how to develop training programs for those guys.

And then from an organizational level, we started thinking, well, how can we take this much further and understand what the real digital footprint of the organization was? Just we were focused on Revit to start with, but we thought, well, if we can expand this, we can really get a really good understanding of how the organization is using all of these tools and how does that affect our investment in IT and how does it affect our investment in people.

From a project manager level, we would have architects or senior architects who would be directing their team to do certain tasks, agreeing to certain milestones, but sometimes they didn't really have an idea of how long those tasks were going to take. So they were agreeing to milestone issue dates and then having to work out what resource they needed to put on it, but it was really a bit of a guessing game. So we weren't capturing the time it was taking to do things so that it could inform future projects and better resource those projects.

And from an IT perspective purely, we weren't able to understand when Revit was crashing, why it was crashing. Was it the user? Was it the file? Or was it the infrastructure? So we realized pretty quickly that it was far bigger than just the BIM managers that wanted to understand this information. It was all these different groups could benefit from capturing this data.

So we had to look at what options were out there. And we had a look at the Proving Ground Apps. We had a look at Deep Space Sync. We had a look at Guardian. And we had a look at Unifi Analytics. We were using Unifi to manage our content. And all of these solutions were really good at the things that they targeted to do. And in fact, all of these are actually complementary to the solution that we ended up choosing. But the reason that we chose the solution that we chose was because it was such a broad range of things that it could do. But that's not to say that any one of these tools wouldn't complement the solution that we chose.

So what else is there? So we ended up with BIMbeats. So BIMbeats was able to combine model and user data. We were able to capture duration metrics. It was all in real time, and it was fully scalable. And when I say scalable, it wasn't just scalable from a Revit point of view. We were able to tap into all of these pipelines. So we were able to look at Revit. We were able to look at Dynamo. We were able to use Model Checker, BIM360, BCF files, IFC files, Rhino, Grasshopper, Navis, AutoCAD. The list just went on. So this really opened our eyes to having this real understanding of the digital footprint of the company across all of the products and not just a focus on one product.

So who are BIMbeats? So the guy on the left and the guy in the middle are pretty well known in the industry. So Adam Sheather who is actually also based in Brisbane, which was another reason that we chose BIMbeats, and I've known Adam for a very long time. I was talking to Adam about some of the challenges we were having, and he said, hey, we're developing this tool, and I think it's going to fit really well with what you do.

So Adam is one of the founding members of the Bad Monkeys group along with Konrad Sobon, and these guys are obviously very familiar in the AEC space, particularly within Revit and Dynamo: Konrad is the creator of probably the biggest Dynamo package, Archi-lab, and Adam is the creator of the DynaWorks package. So Adam's company is Autonomation, and that is part of the Bad Monkeys group, and that is who I now work for over here in Brisbane.

So how does BIMbeats work? So let me try and break this down in very, very simplified format. We install an MSI file on every local machine. So every user's machine has an MSI file, which installs an add-in to Revit and similarly across all of the other products, and it sits silently in the background and doesn't impact the user at all. So there isn't an add-in button in Revit. There's no ribbon. It's just an add-in that sits on top of Revit, but it sits silently and is capturing that data in real time.

So that data is then either fed directly into Elasticsearch, or it goes to Logstash. So the Revit journal file, as an example, goes to Logstash. It's then aggregated and processed in Logstash. And then it's taken into Elasticsearch, which is the database. And then Kibana reads that database from a visualization and business intelligence perspective.

Now, when we implemented this solution to start with, we were very familiar with using Power BI, and we were unfamiliar with Kibana. So what BIMbeats allows us to do is through a REST API call, call the Kibana or call the Elastic instance and then publish that to Power BI. So we have both options. We could use the Kibana interface, which was web based, or we could feed that into Power BI.
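To make that Power BI hookup concrete, here is a minimal sketch of the kind of REST call involved: a client (Power BI's web connector, or a script) POSTs a query to an Elasticsearch `_search` endpoint and gets JSON back. The host, index name, and field names below are hypothetical placeholders, not BIMbeats' actual schema.

```python
import json
import urllib.request

def build_search_request(host, index, query, size=100):
    """Build an Elasticsearch _search request (the same call a
    Power BI web connector or a script would make over REST)."""
    url = f"{host}/{index}/_search"
    body = json.dumps({"query": query, "size": size}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

# Hypothetical host, index, and field names for illustration only.
req = build_search_request(
    "http://localhost:9200",
    "bimbeats-revit",
    {"match": {"action_name": "SyncWithCentral"}},
)
```

Sending the request with `urllib.request.urlopen(req)` would return the matching events as JSON, which is exactly the shape of data a BI tool can refresh on demand.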

So this technology is proven across all of these large organizations so we weren't worried about scalability issues. People like Netflix obviously pumping heaps and heaps of data into this technology. So an organizational set of data for their BIM tools was pretty easy to handle.

So what we were able to do was capture, filter, and visualize. So what we're seeing on the left-hand side of the screen here is the Kibana interface, and this is the Discover tab. So this is reading in, depending on the index that you're looking at, all of what's happening across the organization in real time.

So what we're looking at here is we're saying action name equals sync and give me the duration of each of those syncs, and then we can have the username in there, too. So we can see in real time how long those syncs are taking for each user, and we can drill down into whatever metric we want, what file they're in, et cetera, et cetera.
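That Discover filter (action name equals sync, duration, username) maps onto an ordinary Elasticsearch query with an aggregation. A sketch, using hypothetical field names since the real BIMbeats index mapping isn't shown in the talk:

```python
# Hypothetical field names (action_name, user_name, duration_ms);
# the actual BIMbeats index mapping may differ.
sync_duration_query = {
    "query": {"term": {"action_name": "sync"}},   # only sync events
    "size": 0,                                    # aggregations only, no raw hits
    "aggs": {
        "by_user": {
            "terms": {"field": "user_name"},      # one bucket per user
            "aggs": {
                "avg_sync_ms": {"avg": {"field": "duration_ms"}},
                "max_sync_ms": {"max": {"field": "duration_ms"}},
            },
        }
    },
}
```

POSTed to the index's `_search` endpoint, this returns average and worst-case sync durations per user, which is the raw material for the kind of drill-down dashboard described above.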

Now that's the raw data input. On the right-hand side, we can take that data and create visualizations and dashboards. So the example that we have on the right here is our use of custom packages and nodes that we can drill down by user or by file to understand which custom packages are being used on which files.

So earlier I was talking about the alerting system. So what BIMbeats is able to do is to create a webhook into Teams or Slack or Trello or Zendesk if you use Zendesk Help Desk to track anything that you want in real time as an alert. So the example that's on the screen here is when a user has modified a View Template that is a company-wide View Template.

So this was a request that we had that users were going in and modifying templates. And if they needed to modify a template, it was quite likely that that change needed to be fed back into the main template so it was available for everybody. But when you make that change to the View Template here, it's affecting multiple users and can have detrimental effect on other people's documentation.

So we set this up as an alert so when that View Template's changed and applied, you'll see on the bottom right-hand side of the screen, there's an alert that pops up in Slack to say, hey, someone's just gone and changed this View Template. And then when you open up that alert, it will tell you who did it on what file and when.
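Slack's incoming-webhook API takes a simple JSON POST, so an alert like this boils down to formatting a message and sending it. A sketch; the message format and field names here are illustrative, not how BIMbeats actually builds its alerts, and the webhook URL would come from your Slack workspace.

```python
import json
import urllib.request

def view_template_alert(user, file_name, template, when):
    """Format a Slack incoming-webhook payload announcing who changed
    which View Template, in what file, and when."""
    text = (f":warning: View Template '{template}' was modified by "
            f"{user} in {file_name} at {when}")
    return {"text": text}

def post_to_slack(webhook_url, payload):
    """POST the payload to a Slack incoming webhook (fires the alert)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# Example payload (names are made up for illustration).
payload = view_template_alert(
    "jsmith", "Tower-A.rvt", "ARC-Plan-1-100", "2021-09-01 14:02")
```

The same pattern works for Teams, Trello, or Zendesk, since each exposes a webhook endpoint that accepts a JSON body.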

On the right-hand side of the screen here is showing where companies have got existing workflows in place. So they've already got something set up similar to what BVN had with their Power BI dashboards, but now this is reading the live data from Kibana or from Elastic and feeding that directly into Power BI.

So if we start looking at exactly how it works, this would be a user working away in Revit. They create a filled region, or it could be anything, and then as soon as they synchronize that change to the model, you'll see at the moment, it's saying there's zero masking regions in the Kibana interface.

They synchronize the file. They jump back into the Kibana interface. They refresh that view, and the number of masking regions goes from 0 to 1. So this is just an example of obviously any metric that's being captured within Revit that is a metric-based data, so number of instances of any families, number of types across any category.

That's recording that, and it's recording the file size, the warnings, the number of in-place families, all of that type of information in real time. And then, we can obviously sync that live to Power BI. So as soon as it goes back into Power BI, it's just a refresh. It does the API call, and then that information that was in Kibana is also in Power BI.

So one of the biggest challenges that we had at BVN was users saying that it took a long time to open Revit files. There would be times when people would say, hey, it's taking me like half an hour to open this file, and it's like it every day. And then you'd hear someone else from the other side of the studio say, well, I don't have that problem. It only opens in maybe five minutes for me. So we were like, well, OK, there could be a number of reasons why this is happening.

But the main reason we found was that it was when users were opening all of the work sets, it was taking up to 30 minutes, whereas when they were being selective about picking the work sets, generally, it would be maybe 5 or 10 minutes. So what this dashboard allows us to do is look at every single project, every single file, and every single user and understand the time it's taking to open those files, and most importantly, which users are opening all work sets versus those that are being selective over their work sets.

So the majority of cases when people were making this assumption or this claim that it was taking so long to open the model was because they were opening all those work sets. And then when we had a discussion with them around, well, what area are you working on in the model, we found out that the majority of times, they would only need to open up maybe seven or eight work sets in this particular example rather than all 23.

But when we first started looking at this, we found that on this particular project, which was a very large project, pretty much 75% of the users didn't have an awareness around the fact that only opening selected work sets had such a significant impact on the time it took to open that file.

So just having that transparency of understanding every file and every user and that behavior of not picking selected work sets reduced the amount of opening time on one file by four hours a week. So obviously, across every project across the organization, this is a lot of downtime that was saved.

In terms of understanding the usage of our add-ins, this was something we really didn't have a good handle on. So every year when renewals would come up, we'd have to make a decision about whether we felt we should obviously get another year subscription. Or do we increase user licenses? Do we reduce user licenses?

So what we can do with BIMbeats is BIMbeats can track every feature that's used in every add-in. And then we can drill down and start understanding who's using those add-ins. So as an example, BIMlink is a great tool from Ideate, and we knew that there were power users. But we also were able to understand that people that were in roles that should be using BIMlink and should be using that process to reduce the manual effort, we could say, well, OK, these power users are obviously really competent. They understand how to use that.

When we resource a project, maybe there's an opportunity here to put a power user with someone who's less competent or doesn't know how to use the tool so they can work together on the project and learn on the job so that they increase their competency. And obviously, from a software purchasing point of view, we could work out whether or not it was actually worth purchasing some of these tools. Did we have a series of add-ins that we thought were being used that weren't actually being used?

Organizationally, BIMbeats captures every single process that's on the computer. So this extends beyond just the CAD and BIM tools. But in BVN's case, we were using SketchUp, and SketchUp had been in the organization for years, and we had 35 licenses of SketchUp.

Since moving to Revit, the usage of SketchUp was reducing, and more and more people were doing things in Revit that they used to do in SketchUp, but we weren't really recording that anywhere. And there was an assumption that we'd still needed the 35 licenses because we had 35 users.

And we sent a survey out saying, who are the people that still use SketchUp, and it came back with 35 people that said they used SketchUp. But what we found out in reality was that we actually only needed to buy seven SketchUp licenses. So we reduced the licensing from 35 to 7 because concurrently that was all we were ever using.

And similarly with Adobe: we were buying the Creative Cloud Suite, which includes all of the products, but using BIMbeats we were able to find out who was actually using Photoshop and Illustrator and who was actually just using Adobe Reader, which is the free tool.

So this was really useful information for understanding, from an organizational point of view, all of the other tools outside of the CAD and BIM realm as well. And obviously, it tracks Revit, so we could get a good understanding of who was logging in and who was using Revit at what time. And it was quite alarming to find the number of people that were still using MS Paint instead of Photoshop.

So when we went into COVID lockdown, like most of the world, we found that it was very hard to get a good understanding of what people were doing, when they were working, when they weren't working, and this wasn't from a point of view of checking up to make sure that people were doing their eight hours a day. This was more about trying to understand the new patterns, the new ways of working with people working from home. And we wanted to make sure that we were able to look at who was putting in long hours regularly every week.

So this dashboard here came in useful, where all we're tracking is the number of transactions in Revit across a project team to understand who's working long hours, who isn't working long hours or is perhaps on other projects, and, on this particular image, you can see when the BIM manager comes in on the weekend and audits the files and cleans them up. So the third line down there shows that.

But this was really good because, yeah, we didn't have that transparency of how the project teams were working that you would have when you're physically in an office. So this was really good to at least start a conversation with some of those people that were working long hours just to make sure that they were OK.

Recording crashes-- we at BVN had the Zendesk system, and we encouraged everybody to log a ticket every time Revit crashed. And we would have maybe one or two crashes a day logged in the system, or probably not even that, maybe one or two a week. When we implemented BIMbeats, because we're processing the journal file, we could find out how many fatal errors and how many unrecoverable errors were happening across the entire organization.

And we found out that approximately 10% of the unrecoverable and fatal errors were actually being recorded in the ticketing system versus what was actually happening. And generally, that would be the people that scream the loudest, the people that were whinging the most would be the people that would put in a ticket, whereas BIMbeats wasn't biased to the loudest person. BIMbeats was just extracting that information and being able to record exactly how many crashes were happening.

And that enabled us to do a number of things. One, we were able to get that alert set up straight away so that that could go into Zendesk and automatically create the ticket. But we could look at grouping. Was it the file? Was there three or four users having that same crash in the same day on that same file? What hardware was that user running compared to the other user? So we were able to really drill down and understand the root cause of why the crashes were happening rather than just isolated instances that we'd have to retrospectively go and look into.

And the other thing that BIMbeats does is that it captures the metrics of the PC at the time that these things were happening. So we could set up alerts, which I'll show you later on, where we could say, OK, if the memory of the machine exceeds 90% utilization, send an alert to the IT team so that they know that the user is doing something that is obviously pushing the system to the limit. And it might be that they were in too many versions of Revit, and they were trying to do too many things, and they needed educating around best practice, but it could also mean that's just what they need to do to do their job. Do they need more RAM?

From a model health point of view, and this is where I say there's other tools that are complementary, but what we could do was obviously track the number of in-place families on a project, and we can track the file size of those families, as an example. So what we would often hear is that users would say, oh, I've built an in-place family because I'm only using it once. It's a bespoke piece of casework. That's why I've not built it as a loadable family.

But what we're able to do here is obviously say, well, that bespoke piece of casework that you've just created, you've copied it 200 times in your model. So maybe it's better that that is a loadable family, and then we would have dashboards that would be set up that would provide links to training material. So for those users that didn't know how to build families, it would be, OK, we've identified that you've got 100 in-place families on this project. Here's a guide to show you how to build those families better.

And probably a good point to go onto here is that a lot of the information you see on the screen is blurred out, and I appreciate that's hard for you as the audience to understand exactly what's happening here, but the data that we're capturing in BIMbeats is obviously very sensitive, and that data is not shared outside of the organization. And obviously, I can't show some of this data during these slides.

So the instant alerting, this is an example of the very first instant alert we set up, and this was actually going into Trello. And this was when somebody imported a CAD file. So when somebody imports a CAD file rather than linking a CAD file, there are certain times when this is appropriate, but more often than not, it's because the user doesn't understand the impact of doing that. And worse still, we set up another one for when you've exploded a CAD file.

So outside of Revit, we obviously are using BIMbeats for Dynamo as well and understanding the usage of Dynamo, understanding who's using Dynamo the most, to understand who are the people that can train other people. So this is now looking at Autonomation and the Bad Monkeys group.

So no surprise, Konrad is one of our power users. And then we can start drilling down into how those Dynamo scripts are used on projects by the file, by the script name, over any period of time, by the user. So this is really, really useful if you have a set of company scripts that should be used to do certain tasks on all projects: you can check to make sure those scripts are running and make sure that it's not just the same users that are executing those scripts.

Are they just the people that developed those scripts that are using it? And also, are there scripts out there that you're not aware of that people have developed that can be then rolled out company wide so that more people can understand the benefits of those tools? And are there multiple people building the same tool?

So from a deployment and training strategy perspective, we can drill down into the use of every single node in every single package. Now, this is obviously very, very detailed analysis, but it could be very, very useful when trying to build a training program to say, well, OK, I'm in Dynamo. What are the key nodes that I need to use?

So unsurprisingly in here with our team, this filter by Boolean mask is in there, list flatten, list create. So we can get a really good idea of what are those key nodes that we should be using when generating training programs both across the core package and the custom packages. And with custom packages, that will help inform our deployment strategy. What are the ones that we want to deploy across the entire organization? Are there certain packages that are only used by certain individuals doing certain tasks?

So all of this information is really, really good to minimize the impact of having too many packages loaded on everyone's machine, but also to inform those training packages or training strategies. And no surprise here, when we drill into Konrad's use of the custom packages, Konrad uses Archi-lab a lot. Lots of people use Archi-lab a lot, but Konrad obviously does.

So we're able to track how quickly the scripts run as well, minimum, maximum, and average times. So if there are scripts that are being used across the entire organization and they're appearing to run fairly slowly, there could be an opportunity to go in and optimize that script to make it run quicker. And we're capturing whether it actually runs or whether it doesn't. So did it execute successfully or did it not? And what's the difference between when a script is executed from Dynamo versus Dynamo Player?

So in our example, we very rarely use Dynamo Player because we're the ones creating those scripts. But in a larger organization, does it make a difference if you build a script and make it available through Dynamo Player? Is the uptake of that script better if it's deployed that way? So in terms of your strategy for developing tools, understanding how users prefer to use them is really important, too.
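
The min/max/average timing and success-rate tracking just described reduces to a simple aggregation over execution records. A minimal sketch, assuming illustrative field names (`script`, `seconds`, `success`, `host`) rather than the real index mapping:

```python
from statistics import mean

# Illustrative script-execution records; the "host" field distinguishes
# runs launched from Dynamo versus Dynamo Player.
runs = [
    {"script": "RenameViews", "seconds": 2.1, "success": True,  "host": "Dynamo"},
    {"script": "RenameViews", "seconds": 8.4, "success": True,  "host": "DynamoPlayer"},
    {"script": "RenameViews", "seconds": 3.0, "success": False, "host": "Dynamo"},
]

def run_stats(runs, script):
    """Min, max, and average run time plus success rate for one script."""
    times = [r["seconds"] for r in runs if r["script"] == script]
    ok = [r["success"] for r in runs if r["script"] == script]
    return {
        "min": min(times),
        "max": max(times),
        "avg": mean(times),
        "success_rate": sum(ok) / len(ok),
    }
```

Grouping the same records by `host` would give the Dynamo versus Dynamo Player comparison mentioned above.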

So some of the other things BIMbeats can do: if you're familiar with Revit Model Checker (part of the BIM Interoperability Tools), we can take the Excel document that's created as the result of that check, and BIMbeats will automatically process it.

So we generate the Excel file as the report, save it somewhere, and then process it simply by moving that Excel file into a folder on our C drive, the Model Checker processing folder. As soon as we drop the Excel file into that folder, it's automatically processed and ingested into BIMbeats. You can then go into Kibana, and within a minute it's there to analyze.

And again, tracking over time, you can have all of your Revit Model Checker data in there for trending and create dashboards around it. So this is the check I've just run, and I've just said, "show me anything that has pass or fail in there." We can filter by pass or fail, and it will run through that check and itemize those things. And obviously, this could then be put on a visualization that sits on a dashboard to look at trending over time using Revit Model Checker.
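
The drop-folder pattern described above can be sketched as a simple watch loop: scan a folder, hand any new reports to an ingest step, and remember what's already been processed. This is an illustration of the pattern only, with an assumed folder path and ingest callback; BIMbeats does this automatically.

```python
from pathlib import Path

def process_new_reports(watch_dir, seen, ingest):
    """Process any .xlsx reports in watch_dir not handled on a previous
    pass. `seen` is the set of filenames already ingested; `ingest` is
    whatever parses the report and indexes its rows (hypothetical here)."""
    for path in sorted(Path(watch_dir).glob("*.xlsx")):
        if path.name not in seen:
            ingest(path)          # e.g. parse rows and push them to the index
            seen.add(path.name)   # remember, so each report is ingested once
    return seen
```

Calling this on a schedule (or wiring it to a filesystem-event library) reproduces the "drop the file in, and within a minute it's in Kibana" behavior.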

This is something I touched on earlier: measuring the effect hardware has on the performance of your users. On the left-hand side here is the alerts and actions section of Kibana, where we can come in and say, OK, if system memory usage is over 0.9, or 90%, send me an alert. So this is sending an alert to our Teams channel to say Matt's computer, which has 16 GB of RAM, got to 0.91, so 91%.

And then we can start drilling down and getting even more metrics around why that happened, when it happened, and what I was doing at the time. Is it because I was running too many processes at once, or is it because that's what Matt needs to do his job, so we need to upgrade him to 32 GB of RAM? That's a request I've had in with Adam for a while now, so Adam, if you're watching, now I can get my 32 GB of RAM. Thanks.
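
The alert condition itself is just a threshold over a metric stream. A minimal sketch of that rule, with assumed field names (`host`, `ram_gb`, `memory_used_pct`); in practice this lives in Kibana's alerting rules with a Teams connector rather than in your own code:

```python
def check_memory(metrics, threshold=0.9):
    """Return an alert message for each machine over the memory
    threshold -- the same condition as the Kibana rule, in plain Python."""
    alerts = []
    for m in metrics:
        if m["memory_used_pct"] > threshold:
            alerts.append(
                f"{m['host']} ({m['ram_gb']} GB RAM) hit "
                f"{m['memory_used_pct']:.0%} memory usage"
            )
    return alerts
```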

Navisworks: we process Navisworks files too, in real time as you're working in Navis. As soon as you run a clash test within Navis, it's automatically ingested into BIMbeats, and then we can come in here, filter the different clash tests, look at the counts, and feed that directly into Power BI if that was the existing workflow. So here we're just looking at active, new, and resolved clashes, all captured in real time via Kibana, and we can filter it any way we want.
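
The active/new/resolved counts shown on the dashboard are a per-test, per-status tally over the clash results. A minimal sketch, assuming illustrative field names (`test`, `status`) for the ingested clash records:

```python
from collections import Counter

def clash_summary(results):
    """Count clashes per status (active/new/resolved) for each clash
    test -- the shape you'd chart in Kibana or push to Power BI."""
    summary = {}
    for r in results:
        summary.setdefault(r["test"], Counter())[r["status"]] += 1
    return summary
```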

Coming soon: this is a request we've had around auditing models, maybe from an ISO 19650 perspective, where certain parameters are required to be filled out within the model. Now, rather than capturing every single parameter of every single instance of every single element in the model, we can set up templates to process that data.

And this is one of our clients who said, we want to know whether elements are being modeled as they're going to be constructed. So as an example, we can extract the column category with the base and top level, and see whether columns are running level to level or over multiple levels.

So this is something we're working on at the moment. It's pretty raw: we're doing it via an Excel file, selecting the categories and those instances, but BIMbeats is able to process that. And the intent is that once you've got the templates set up for company-specific requirements, we'd automate that remotely so you're not having to do it manually.
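
The level-to-level check mentioned above amounts to comparing each column's base and top against the ordered level list. A minimal sketch under assumed field names (`id`, `base`, `top`), not the actual template format:

```python
def columns_spanning_levels(columns, levels):
    """Flag columns whose base and top are not adjacent levels, i.e.
    modeled through multiple storeys rather than level to level.
    `levels` is the project's level names in ascending order."""
    order = {name: i for i, name in enumerate(levels)}
    return [
        c["id"] for c in columns
        if order[c["top"]] - order[c["base"]] > 1
    ]
```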

And here's another alert that was set up by BVN. They use the ITV tools, and they use the remote task ITV tool server to create a lot of their PDFs and IFC files in the background so it doesn't impact the user. But what they were finding was that occasionally the ITV tool server would stop, because it would open Revit and a dialog box would pop up prompting the user to open, close, cancel, or accept.

So what BVN did was put BIMbeats onto the ITV tool server and set up an alert, so that if any dialog box popped up, it sent them an alert in Teams. They then knew they had to go into the ITV tool server and hit OK or Cancel. Without BIMbeats installed on the ITV tool server, they wouldn't know it had gone down unless they physically logged in. So that was really useful.

So in terms of data ownership and security, I think this is a really important point to touch on. I don't want to go into too much detail, but BIMbeats as a company is not capturing any of that data and can't keep it for themselves. It's the client's data. It can be hosted on your own premises, in the cloud, or a combination of the two.

And from a licensing point of view, only a limited amount of personal data is captured: the IP address, the computer name, and the username. It's fully GDPR compliant, and you own that data. BIMbeats does not collect or have access to any of those databases.

And then, lastly, when we rolled this out at BVN, there was a little bit of hesitation around this being very much Big Brother, and I guess it is, in that it's capturing all of this data. But we wanted to make sure the message was a positive one. This was about increasing competency and reducing downtime. Ultimately, if we can show there are better, smarter ways of doing things, and automate them, it means that as a person working in a company, you can have more time with your family.

So it wasn't about saying this person keeps doing this thing and we need to blame them. It was more about the continuous improvement opportunity: seeing who was really pushing the boundaries, who the super users were, and recognizing those people, because a lot of the time the people really making a difference weren't visible. So the message was definitely a positive one, not one of checking up on people.

So I appreciate we've only got a limited amount of time for the presentation, so we weren't able to cover some of the other tools, which we can hopefully do in the Q&A. We didn't go through AutoCAD, BIM 360, Bluebeam, BCF, IFC, the Tally integration for Revit, Rhino and Grasshopper, or license management with FlexLM.

And the other thing we're just starting to get into with Elastic and Kibana is the AI and machine learning capabilities. There's anomaly detection available in Elastic and Kibana, so it's going to be really interesting, as we get more and more data in there, to see what the machine learning capabilities can do to help us predict future projects.

And the other one is timesheet validation. BVN use Deltek Vision, so they've integrated Deltek Vision as part of BIMbeats too. When timesheets are done at the end of the week, you can look at what people have actually put into their timesheet and use the data from BIMbeats to validate, or at least check, whether those numbers match what the user was actually doing. Obviously that's limited, and it's not a perfect science, but it at least gives a good idea of whether somebody did do the 40 hours, or whether they were doing 50 or 60, or whatever that might be.
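
The validation step just described can be sketched as a tolerance check between entered hours and observed activity. This is an illustration only, with assumed inputs (hours per user from the timesheet system and from captured activity) and an arbitrary 20% tolerance; as noted above, it's a sanity check, not a perfect science.

```python
def timesheet_variance(timesheet_hours, activity_hours, tolerance=0.2):
    """Flag users whose entered hours differ from observed activity
    hours by more than the tolerance fraction of the entered figure.
    Returns {user: (entered, observed)} for the flagged users."""
    flagged = {}
    for user, entered in timesheet_hours.items():
        observed = activity_hours.get(user, 0.0)
        if entered and abs(entered - observed) / entered > tolerance:
            flagged[user] = (entered, observed)
    return flagged
```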

So I'd like to say thank you for attending my class. Here are my contact details. Please get in contact if you've got any questions on anything I went through; I'd be happy to talk to you about it. And if you've got any feedback on things you'd like to see BIMbeats do, please feel free to reach out.
