AU Class

Leveraging Analytics and Data Pipelines with Toric

Description

Join this session to learn how to utilize Toric, the construction industry's trusted no-code platform for data movement and pipelines. Discover how to automate ETL processes, migrate data, and execute full table backups without coding. Using Toric's Autodesk Construction Cloud connector, you can access project data and extract insights in real-time. We'll also delve into practical case studies from Gamuda and Commodore, illustrating how they've optimized data transfer across APIs, applications, and databases with Toric. By the end, you'll be equipped with concrete techniques to enhance your workflows and ensure efficient data integration.

Key Learnings

  • Extract data from Autodesk products (e.g., Autodesk Construction Cloud, Revit, and BuildingConnected).
  • Easily transform data sources and create data pipelines without writing any code.
  • Automate workflows to load data into a destination of choice such as a data lake or warehouse.

Speakers

  • Chad Braun
    Having been familiar with construction from early life, Chad graduated from Colorado State University with a degree in Construction Management. From college, he joined a nationwide specialty contractor, where he eventually worked his way up to a Project Manager / Estimator position. It didn't take long for Chad to recognize the shortcomings of technology in construction, which eventually led him to make the jump to Autodesk, where he spent 5 years as a Technical Solutions Executive helping customers on their construction journey, specializing in multiple legacy tools and helping to optimize new ones. Chad recently made the jump to Toric after observing the very prevalent lack of standardized data and analytics that most of his customers were struggling with.
  • Austin Wolff
    Austin is a Data Engineer with 3 years of experience and is a certified AWS Cloud Practitioner.
Transcript

CHAD BRAUN: Hi. Thanks for joining us for our presentation here today about Leveraging Analytics and Data within Toric. My name is Chad Braun. We'll do some short introductions, of course, after we do some housekeeping.

First of which, of course, is the safe harbor statement. We are going to be showing things that are forward-looking, so please keep in mind that this material is not to be shared outside of today's presentation with third parties.

So what we're going to cover today is, in general, what data pipelining, data transformations, and data strategies we're seeing implemented on the construction technology front. Obviously, construction is an incredibly varied industry, and a very fragmented one when it comes to technology. We're going to talk about use cases for the what, the why, the where, and the how. Data is starting to transform our construction industry as we know it and to allow us to leverage analytics and make data-driven decisions.

So just a little bit about us here. My name is Chad Braun. I'm a solutions engineer with Toric. I have a construction background, so I was a project manager and estimator for a specialty contractor before spending six years at Autodesk helping them build out the Autodesk Construction Cloud.

I've been with Toric for about six months at this point, helping customers like yourself understand what data should be doing for them, especially in regards to the construction-specific sources, be it the project management tools, the ERP tools, or the scheduling tools, all of those data silos, and how to best break down the walls between those disparate technologies. Austin?

AUSTIN WOLFF: Yeah. Hi, guys. My name is Austin Wolff. I'm a data engineer here at Toric, and I have a background in data science and data engineering. My main role here at Toric is to help build data pipelines for our clients. So I'm happy to help. Pass it back to you, Chad.

CHAD BRAUN: Great. Thanks. So on the docket today, we're going to really start at the beginning, the impetus for construction problems. What are we seeing with data? Where is it falling to pieces? Where could we be doing a better job? How did we get to now, especially?

Obviously, construction has a data problem, much like every other industry. But construction is, how do I say, lacking when it comes to technology; I like to say it's probably about five years behind the other industries. It's really a conversation about what we're starting to see happen here, a renaissance of thought when we're thinking about our construction-specific data.

We'll talk about what Toric is, in particular how it relates to Autodesk. Why did it come to be? Why did we choose construction? And then we'll actually jump into a product demonstration.

And we're going to cover a plethora of different tool sets within Toric that are all being implemented to help you improve your tech stack, with Toric acting behind the scenes as a catch-all for that construction data, be it data pipelines, full table backups, utilizing a warehouse, or Toric's built-in visualizations. And then, of course, actually syncing from source to source, if that's an option for your teams, or if that's something that would help enable your teams to make better data-driven decisions. And then last but not least, we'll finish up with some actual customer use cases for how people are utilizing the tool itself.

So when we're talking about the problem, or problems with construction data, as we all know, construction is complicated. Even compared to other verticals in other industries, we build very complex projects. We have very complex teams. We have very complex parties on all of these construction projects.

Obviously, in no particular order: subcontractors, and when you get into the super subcontractors, the general contractors, the architects, the engineers, and that's not even including the consultants. All of those teams have multiple sources of data. All of these teams employ different technologies, even within one firm, especially when we're talking about something like a general contractor employing anywhere from five to 10 technologies on any given project. And of course, the projects themselves are complicated and diverse. We always joke that you could build the same building in the same spot twice and it would be a totally different project.

And then, of course in relation to that, construction being tech averse. This is starting to change. We are seeing a bit of a change in the thought process when it comes to technology. But certainly that's really only taking place over the last 10 years. Only now are we starting to see, really, a new young workforce start to implement or push for more and more technology, understanding the benefit that it poses.

As such, construction, again, has been behind the curve when it comes to the actual data collection, and we're finally starting to see the data collection happen. We have our ERP, we have our project management tools, we have our CRMs, we have our safety suites. But across all of those toolsets, what we often see is that the data hits a proverbial wall: in between each of these data silos, the data falls off. So what Toric is really helping customers with, and what we're starting to see happen, is a proliferation of the data itself, in that collecting the data and doing what we need to do with it is becoming more and more important.

We have the ability to farm the data and gather the data, but we don't necessarily have a place to store it all or make data decisions based off, of course, all of these different kind of collection points, if you will, and all of these different sources. And then, of course, construction is varied. Everything changes all the time. In that, we see all of these different tools. Everybody wants to be a single source of truth when it comes to construction technology.

But to be perfectly frank about it, there's always a new tool. There's always something that's going to evolve beyond that quote unquote "common data environment," or single source of truth. So that's really where Toric comes into play: to act as a catch-all for that construction data as it changes, morphs, and evolves, always being there to make sure that you have the data you need when you need it, regardless of the tool sets you have at your fingertips.

So what is Toric? It's important to start at the beginning. Our founders are Thiago and Dov.

They founded a company called Lagoa. Most importantly for this conversation, specific to Autodesk University, Lagoa was actually acquired by Autodesk, and both Thiago and Dov were really the men behind the Forge platform, which has now become Autodesk Platform Services.

In doing so, they realized that there is an ample need for the ability to break down all of these data silos, the ability to bucket, or put together, all of this construction-specific data, be it in a warehouse or even just for visualization purposes. They saw that explicit need, or frankly the communication breakdown, not only between parties but between construction-specific softwares, which is where Toric really comes into play.

So when we're looking at Toric as a whole, on the left-hand side, you'll see that we have over 75 construction-specific sources. And that's not to exclude those that aren't construction-specific. Things like ERPs, CRMs like Salesforce, and a multitude of other tool sets.

The idea being that we can gather all of this data from all of these specific endpoints and automate an ingestion in real time, then transform and cleanse that data to get it analytics-ready, so that on the opposite end it's prepared, normalized, and cleansed for its eventual journey to a data warehouse, or even writing from application to application. Or we actually have an analytics and BI tool in-house within Toric itself. So there's a lot of benefits to this one-stop shop of data.

Specific to Autodesk, what we're talking about is what you're seeing on your screen. Obviously, with recent acquisitions like PlanGrid, BuildingConnected, and Assemble, Toric does its best to keep up with these acquisitions and make sure that we're equipping your Autodesk-specific teams with all of the data from all of these disparate tool sets, most notably, of course, the Autodesk Construction Cloud, and included in that, BIM 360. But also things like being able to plug into PlanGrid or BuildingConnected for your preconstruction bid leveling, bid scoping, and bid packaging data, in addition to being able to plug in directly to your Revit, your Navisworks, and even your Civil 3D.

So again, right off the bat, hopefully you're starting to see a benefit of being able to plug into even just these Autodesk-specific sources, harvest this data or farm this data, get it all in a bucket, and then send it to its eventual destination. So with that being said, now I'll pass it over to Austin for an actual product demonstration.

AUSTIN WOLFF: All right, guys. Today we're going to build a data pipeline from scratch and get your Autodesk data into whatever destination you want, whether that's a data table, data warehouse, data lake. So the first thing that I want to do is I want to make sure that Toric can connect to your Autodesk account.

So on the left-hand side, I'm going to go ahead and click on connectors here in the Toric software. And you can see here we have a lot of different connectors, different ways to access data from all of your different softwares. And then we also have a lot of databases and data lake connectors as well to push the data once you've gotten it from Autodesk to your final destination such as Aurora, AWS, Azure data lake, Databricks, Snowflake, so on and so forth.

We also have a lot of construction data connectors as well, as you can see on the screen here. Autodesk is the main one that we'll be covering today. But we have quite a few different connectors as well. We have connectors for spreadsheets, marketing and sales, finance, file storage, payments, also workforce planning as well.

But the one we are covering today is getting your data from Autodesk to your final destination. So I'm just going to do a Control-F, look for Autodesk, and click Set Up Connector. As you can see here, I've already set up a connector right here called New ACC Configuration. But all you would have to do is click Plus Create a Connector, give it a name, and then log in to your account. And that's all you have to do to make sure that Toric can start downloading your data from Autodesk.

Next is we are going to create what's called a project folder inside of Toric. Once we've connected to your Autodesk account, we have to have a location to then download that data into. So we're going to be downloading that data into our project folder. So I'll go ahead and click on Projects.

I'm going to create a project called ACC demo. You can name it whatever you want. Next, I'm going to be creating what's called a data flow.

A data flow is essentially just a data pipeline. All it is, is the ability to download your data from ACC, transform it, and then push it into your destination. So ETL: extract, transform, load it to your destination.
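For reference, the ETL pattern Austin describes maps onto a few lines of ordinary Python. This is a minimal sketch, not Toric's implementation; the sample fields and the CSV "load" step are illustrative stand-ins.

```python
import pandas as pd

# Hypothetical three-step ETL, mirroring the extract-transform-load flow above.
def extract() -> pd.DataFrame:
    # In practice this would call the source API (e.g., an ACC forms endpoint).
    return pd.DataFrame([
        {"id": 1, "formTemplateName": "Incident Report", "createdAt": "2023-07-14"},
        {"id": 2, "formTemplateName": "Daily Report", "createdAt": "2023-07-15"},
    ])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df["createdAt"] = pd.to_datetime(df["createdAt"])  # normalize types
    return df

def load(df: pd.DataFrame) -> None:
    # Stand-in for a warehouse or lake write (Snowflake, Azure SQL, etc.).
    df.to_csv("forms_backup.csv", index=False)

load(transform(extract()))
```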

So I'm going to click on New and I'm going to click on See All Connectors. I'm going to look for my Autodesk. Perfect.

I'm going to select my connector. So what I'm doing here is I'm selecting the connector I've already set up. That way, I can start downloading data from Autodesk. Give it a moment to load and to establish that connection with Autodesk.

Now as far as channel goes, these channels are essentially all the different ways we can download data from Autodesk's API. Let's look at forms just for a second. With forms, we can select your projects, and we'll be able to select all of your form templates as well. You can select all form templates that have been updated after a specific date, or you can just download all form templates that you have in your account. So that's something that we can do.

Another channel that I'll specifically be downloading data from is the project channel. And the project channel allows us to download specific data based on what's called an endpoint. So just as an example, what we'll be working with today is forms data. This is all of the forms data that is in each of your Autodesk projects.

Now here with project list, I can select certain projects to import or if I just leave empty, it'll automatically download all of them. So I'm going to be doing that. The other thing that I want to show you guys is incremental import.

So a lot of customers that we've talked to only want to download data that is either new or updated. Let's say that we're doing a daily intake of their data from Autodesk. They don't want to download all of their data every single day. It's inefficient and it also costs money.

So one thing that we can do is incremental import, and this will make sure that every single day when we run the data import, it checks the previous day and only imports data that has been changed or is new since the last time we ran the automation. So it's a pretty cool feature. It saves you time and money. It's more efficient.
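The incremental-import behavior can be sketched as a high-water-mark pattern: remember when you last ran, and only pull rows changed since. The function and field names below are illustrative, not Autodesk's actual API.

```python
from datetime import datetime, timezone

# SAMPLE_ROWS stands in for an API response; "updatedAt" is an assumed field name.
SAMPLE_ROWS = [
    {"id": 1, "updatedAt": datetime(2023, 7, 13, tzinfo=timezone.utc)},
    {"id": 2, "updatedAt": datetime(2023, 7, 15, tzinfo=timezone.utc)},
]

def run_incremental_import(state: dict) -> list[dict]:
    # Default watermark: the beginning of time, so the first run imports everything.
    last_run = state.get("last_run", datetime.min.replace(tzinfo=timezone.utc))
    new_rows = [r for r in SAMPLE_ROWS if r["updatedAt"] > last_run]
    state["last_run"] = datetime.now(timezone.utc)  # advance the watermark
    return new_rows

state = {}
print(run_incremental_import(state))  # first run: all rows
print(run_incremental_import(state))  # next run: only rows changed since
```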

I'm going to go ahead and click on Import Data. So what it's doing right now is it's first establishing that connection to Autodesk. It's getting ready to import the forms data from Autodesk. And then once it's done, it will create a blank data flow for us to start transforming that data.

But I speak with a lot of clients who, when they're pushing their data to a data lake, sometimes want it transformed and sometimes don't. If we're pushing the data to a data warehouse, to any of your tables in a warehouse such as an Azure SQL warehouse, then every single time we are transforming that data, even if it's as simple as just defining the schema we need to import that data into the data table.

As you can see here, it has finished importing that data into our workspace. Now this is going to be a lot of data, if you've never seen it before. So I know it can be like drinking from a fire hose. So not only am I going to attempt to go as slow as possible, but I'm only going to show the most important things when it comes to building your data pipeline. I'm not going to go over everything. That's not the purpose of this demonstration.

First thing I'm going to do is rename this to ACC to warehouse, just so I know what this data flow does. The next thing I'm going to do is x this out to expand my screen. And as you can see here, you can see the data from the forms endpoint for one of your projects.

So we have the assignee column, createdAt. We have custom values that are within an array, which I'll go over in a moment, form template, name, ID, notes, description, so on and so forth. Next I'm going to do is I'm going to go over here, click on this little tab called model root. If I click on that, it'll show me what's called a node. And this node is your source data.

So this is the source data for a project-- well, we're calling it sample project in our sample account, but Seaport Civic Center is the name of our sample project. But if you were to use this, you would start to see a list of all of your different projects that you've been able to import into Toric as well.

So I just click on this. As you can see here, I can take a look at our data. If I click out of it, there's no data to show. But if I click on this, OK, this is the data associated with this source node.

Let's get into transforming the actual data. This is the fun part. On the Overview little panel right here, I'm going to click on this double arrow to shrink it, expand my view.

You don't have to use a graph-based approach when you're transforming your data. I personally like to use a graph-based approach when I'm doing data engineering, mostly because I want to visually see where my data is going from step to step. It just helps me out visually. So this is why I like using the graph approach.

When I say graph approach, I mean every node you'll see is connected with what you'd essentially just call a line. The correct term is a directed acyclic graph, but just call it a graph.
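For the curious, here's a toy directed acyclic graph of transformations executed in dependency order, just to make the term concrete. The node names are invented; Toric's internal representation isn't public.

```python
from graphlib import TopologicalSorter

# Each node is a transformation; edges map a node to the nodes it depends on.
nodes = {
    "source": lambda inputs: [1, 2, 3, 4],
    "filter_even": lambda inputs: [x for x in inputs["source"] if x % 2 == 0],
    "double": lambda inputs: [x * 2 for x in inputs["filter_even"]],
}
edges = {"filter_even": {"source"}, "double": {"filter_even"}}

# Run every node after its dependencies, exactly what "acyclic" buys you.
results = {}
for name in TopologicalSorter(edges).static_order():
    results[name] = nodes[name](results)
print(results["double"])  # [4, 8]
```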

What I'm doing here is I am dragging the data from our source node over. I'm going to create a new node. We have a whole list of nodes here. I'm not going to go through each and every one, just a few of the highlights. But every single node does something different to transform your data.

So breakout allows you to just select certain columns to transform or keep. Filter, we can filter your data, such as give me all rows where createdAt is after 7/13/2023. You can find and replace.

So one example is, OK, find every row where created by is this long string and replace it with something else, group by, so on, and so forth. Again, I'm not going to go through all of them, but I'll go through a few specific ones here that might be relevant to showing you how else you can transform your data.

The first one I'm going to do is data tagging. What data tagging allows you to do is create a new column and fill it with a specific value based on the rows of another column. So let's say, for example, I want to create a new column that has, just as an example, A, B, or C, based on what's in this column, form template name.

In fact, actually, I'm going to call this form template one. You can give it a better name. I'm not going to give it a default value, but I will say let's fill it with the word "incident" when form template name is, and we have a dropdown if our column is in list format, incident report.

So we're going to create this new column called form template one. And if the column-- and if the rows in form template name equal this result, it gets filled with whatever I've selected here, incident. I'm going to go ahead and do that for the rest of this. Call this timesheet, call this incident report.

And we are going to tag the value timesheet where form template name is equal to timesheet. And we'll call this daily report, where form template name is equal to daily report.

So I'm on this node right here. We can see the data for this node. And if I scroll over to the right, you'll see the new column that I've created.

So that's one example of data tagging. I'm sure, even as you're watching this, you can think of other ways to use it. It's very helpful for me when it comes to data engineering.
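In code terms, the data-tagging node behaves roughly like a value-to-value mapping from one column into a new one. Here's a pandas sketch with illustrative column names.

```python
import pandas as pd

# Derive a new tag column from the values in "formTemplateName".
df = pd.DataFrame({"formTemplateName": ["Incident Report", "Timesheet", "Daily Report"]})
tags = {
    "Incident Report": "incident",
    "Timesheet": "timesheet",
    "Daily Report": "daily report",
}
df["formTemplateTag"] = df["formTemplateName"].map(tags)
print(df)
```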

The next thing I'm going to do is show you how to do a join. So I'm going to take my data output and drag it over and look for the join node. And we'll zoom in just a little bit. So with the join, you need two inputs.

Typically what we'll do, for example, is you can either use two source nodes. I think that would be the quickest example. So let's say you needed to merge forms data with any other type of data. You would make sure to import that data into the project and then you would just drag it into port B of the join.

With the joins, we have a lot of options: left outer, right outer, inner. The main ones that clients like to use, that I've seen, are the left outer and the inner join, which is essentially a one-to-one match. I'm going to get rid of this. We're not going to be joining today.
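The join flavors mentioned above correspond directly to pandas merge modes. A small sketch, with made-up columns, for readers who think in code; ports A and B play the roles of the left and right inputs.

```python
import pandas as pd

forms = pd.DataFrame({"projectId": [1, 2], "formCount": [10, 4]})
projects = pd.DataFrame({"projectId": [1, 3], "projectName": ["Seaport", "Harbor"]})

inner = forms.merge(projects, on="projectId", how="inner")  # matched rows only
left = forms.merge(projects, on="projectId", how="left")    # keep every left row
print(inner)
print(left)  # unmatched rows carry NaN for the right-side columns
```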

Another node I want to demonstrate for you is called Edit Columns. So if I type in Edit and click on Edit Columns, you'll see here that there are a lot of different icons next to the names of the columns. These tell us what type of data it is. So this pound symbol tells us that this column, form num, is a number column.

This calendar icon tells us it's a calendar-- sorry, it's a date time. These brackets tell us it's an array. This little dropdown icon tells us it's a list, so on and so forth.

The T stands for text or string. And we can change the types of each of these columns to whatever we want. Although if you have a string and you're trying to convert it to a number, it's probably not going to work out.

I'm going to click on form template name. Right now, it's a list type, so it allows us to select dropdowns. But it's not a string, so I can't perform any string transformations on it. What if I wanted to? What if I want to start doing a regular expression extract on this, or just start to normally clean the string?

I can't do that if it's a list type. So I'm going to give it the string type. And as you can see here, it has changed from list to string. Easy enough, right? Now I want to do some text cleaning on it.
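The type-change behavior, including the string-to-number caveat above, looks like this in pandas; column names are illustrative.

```python
import pandas as pd

df = pd.DataFrame({"formNum": ["1", "2", "3"], "name": ["a", "b", "c"]})
df["formNum"] = pd.to_numeric(df["formNum"])  # works: digits convert cleanly
try:
    pd.to_numeric(df["name"])                 # fails: arbitrary text is not a number
except ValueError as err:
    print("cannot cast:", err)
```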

What if you wanted to use regular expressions? As you can see here, I've dragged this output into a new node. I'm looking for a regex extract.

If you don't know what regular expressions are, they're essentially just a way for you to match patterns within text data. And you can extract, you can clean, you can do whatever you want. But essentially what it is looking for is patterns.

I'm going to select a column to do my extraction and look for form template name. And what if I just wanted to extract the middle word of this sentence, regardless of what it is? I'm going to create a new column called form template.

Actually, you know what? I'm just going to call it regex extract. That way we explicitly know what we're looking for. And now we actually need our regular expression.

How do we extract the middle word from form template name? Well, at this point, you need knowledge of regular expressions before you can do anything. So this is not a regular expression course.

I'm not going to go over it. I personally use a website called regex101.com. That is my opinion; it does not represent the opinions of Autodesk or Toric. But again, this is the website I personally like to use to make regular expressions that match the text that I'm trying to match.
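If you want to try the same middle-word extraction outside the demo, here's one pattern that works in Python's re module. It isn't necessarily the pattern used on screen; the sample values are made up.

```python
import re

# Capture the middle word of a three-word template name.
names = ["Incident Report Form", "Daily Report Form", "Sample Timesheet Form"]
pattern = re.compile(r"^\S+\s+(\S+)\s+\S+$", re.IGNORECASE)
for name in names:
    match = pattern.match(name)
    print(match.group(1) if match else None)  # Report, Report, Timesheet
```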

So again, I'll quickly go through this right now because again, this is not a regular expression tutorial. If you do know how to use regular expressions, this is a helpful tool that you can use. So again, I'm copying a regular expression I built that is meant to only match the middle word of our text in this column. Copy that, go over here to expression, I'm going to paste that.

For my flags, you can use other flags as well. I'm going to use case insensitive. Let's take a look. OK, it didn't match my regular expression right here. That's OK.

One thing that I can also do is take a look and troubleshoot why it didn't work, so on and so forth. But if you don't want to use regular expressions, you don't have to. One thing that you can do, let's say, for example, that you don't want to figure out the pattern for your regex extract-- oh, I see here. I put daily instead of sample. Let's see if that worked.

All right, there you go. So again, with regular expressions, they're a little complex, and you do need to pay attention to detail, as you can see that I needed to do there. But if your extraction is relatively simple, let's say, for example, that I don't want to use regular expressions to extract this data, what else can I do?

Well, as you can see here, each middle word is separated by a space on the left and the space on the right. So if you've ever done splitting of text data, you can do that as well. So let's get into that.

I'm going to delete our regular expression extract, drag our output, click on a split node, and now I can actually split our data-- our text-based data, based on any character that I want. So I'm going to select form template name. Our delimiter is going to be just a space.

And I'm going to remove the original column. And as you can see here, it has split up form template name into each of the words, split based on the space. So I have the first word, the second word, and the third word, all split up by spaces.

And let's say I only want to keep this column. Now what I can do is I can do a new node called columns. I can actually hide columns I don't want. So I don't want this column and I don't want this column. I just wanted that middle word.

The other thing I can do is give it a new name. So I click on this. And let's just call it, just for simplicity, I know you wouldn't use this in real life, but just for the purposes of demonstration, I'll just call it middle word. There we go. And there you go. That's a demonstration of how to extract words from your text.
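Here's the same split-then-keep-the-middle-column sequence in pandas, with illustrative names.

```python
import pandas as pd

df = pd.DataFrame({"formTemplateName": ["Incident Report Form", "Daily Report Form"]})
parts = df["formTemplateName"].str.split(" ", expand=True)  # one column per word
df["middle word"] = parts[1]                 # keep only the middle word
df = df.drop(columns=["formTemplateName"])   # "remove the original column"
print(df)
```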

The next thing I want to do is demonstrate writing to a warehouse. So first I'm going to drag the output of my data and create a new node called write table. Now when it comes to this, we have to set up our tables ahead of time.

How do we do that? Well, I'm going to go into a new tab. I'm going to go to my connectors and let's say you want to connect it to your Azure SQL database. You find Azure SQL and you would set up your connector to your actual database here. You'd do the same for Snowflake, for Databricks, so on and so forth.

And once you set up your connection to your warehouse, you can actually connect to your tables inside of your external warehouse. Toric also has the ability to create internal tables as well, just for you to store data, whether you need to access it in a different data flow, or if you actually want to store your data inside of Toric. So I've created an internal table for this demonstration called ACC Forms. I'm just going to click on that. And as you can see here, I can quickly take a look at the schema for this table.

I'm going to go back into my data flow, to my destination table. As you can see here, ACC Forms is the one that I showed you just now. You can also connect to Databricks, Snowflake, an Azure SQL database, so on and so forth. But let's just say you wanted to write data to your table. You would select it. The schema, the columns for your data table, would appear, and then you need to actually map your data from your node to make it match the schema of your table. So I'm going to look for ID for the ID column. Great.

We know this column means the assignee ID. I'm going to look for that. Going to look for created by. And here I'm going to look for description.

And all I have to do is click on write to test it. Great. Now I know it works. Now, I know I can push my data to this data table.
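The write-table step amounts to mapping your columns onto a pre-created schema and inserting. Below is a sketch using SQLite as a stand-in for Azure SQL, Snowflake, or Databricks; the table and column names are invented.

```python
import sqlite3
import pandas as pd

# A pre-created destination table, as the demo requires.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE acc_forms (id TEXT, assignee_id TEXT, created_by TEXT, description TEXT)"
)

# Map dataflow column names onto the table schema, then insert.
df = pd.DataFrame([{"id": "f1", "assignee": "u7", "createdBy": "u2", "notes": "pour slab"}])
mapped = df.rename(
    columns={"assignee": "assignee_id", "createdBy": "created_by", "notes": "description"}
)
mapped.to_sql("acc_forms", conn, if_exists="append", index=False)
print(conn.execute("SELECT * FROM acc_forms").fetchall())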

Another thing you can do, let's say you're just writing your data to a data lake. Just for demonstration, I'm going to drag this output, and I'm going to select something called Run Export Automation. All an export automation is, is a way for you to export that data to a data lake. It's one more step that you have to do. Instead of just writing to a table, you do have to run the export automation.

What is an export automation? All I'm doing is I'm taking a connection to our data lake, and I'll show you that we have one. So let's say I'm exporting to Azure data lake storage. I have a connection here. Now I need to create an export automation to make sure that data is run through our connector into the data lake.

So if I go to automations and scroll down a little bit, we have one called Azure lake export. As you can see here, I have the name of the automation, the description, the action type, which is exporting data, the application, data lake storage, the connector, the one I just showed you, and our channel, files, which is all we get for Azure data lake.

And this is it. This is all I need to do to make sure that data is exported into Azure data lake. I'm going to go back to our node, our Run Export Automation. I'm going to search for the name of our export automation, called Azure Lake Export. Select that.

For the file name, we can type the file name in here. We can also create what's called a text node. And also give it a filename here, testing.csv. Then I can just connect the file name there. Great.

It's compressed. And when I click Export, it'll also be exported into my data lake. I don't want to do that right now. I don't want to clog my data lake like that. But all you have to do is click on Export.
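In plain code, the export boils down to serializing the flow's output, compressed, and handing it to lake storage. The sketch below writes a gzipped CSV locally; the actual upload call depends on your lake's SDK, so it's left as a comment rather than guessed at.

```python
import pandas as pd

# Serialize the dataflow output as a compressed file, ready for lake storage.
df = pd.DataFrame({"id": [1, 2], "status": ["open", "closed"]})
df.to_csv("testing.csv.gz", index=False, compression="gzip")
# From here, an SDK for your lake (e.g., azure-storage-file-datalake for ADLS)
# would upload "testing.csv.gz" to the filesystem/path your connector points at.
```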

Now let's say you want to do a full table backup. Most of the clients that I work with, that I'm building data pipelines for, want all of their data backed up into their destination of choice. They want to be able to take all their data from Autodesk and then back it up to their own warehouse. How do we do that?

All you have to do is set up your data flow as I've done like this, and click on Export when automating. What we're going to do is automate this data flow. We're going to make it so you're importing data every single day, and every single time a file is imported into this data flow, it is run through it and then exported into your data table.

So what do we do? We have to create a data flow automation. Again, we are just automating this data flow, so that every single time we import data into it from Autodesk, it's automatically run through. You don't have to open it. You can just sit back and watch the data populate in your data warehouse.

So there's one more thing we need to do. I'm going to go to Automations, create an automation. I'm going to call this ACC to warehouse table flow automation. Great.

And I feel like that's pretty precise, so I don't need to give it a longer description; I'm going to use this name for the description. For the trigger type, I'm going to select source updated.

All source updated is, is it's looking for new files. So every single time we import data from Autodesk, it's going to be looking for updated sources, for new files. That's all you really have to think about when it comes to source updated. Think new file, essentially.

The source type filter, we need to know what data to look out for. I'm going to be looking for Autodesk. There we go.

So we're looking for new files from Autodesk. In what project? ACC demo. So all files that are imported into that project folder that we created at the beginning, that's our trigger type. The moment it sees a new Autodesk file in our project folder, it's going to do our action.

Our action is run data flow. We want to run that file through this data flow. I need to select the actual project the data flow's in, ACC demo, and the name of our data flow, ACC to warehouse.

Next we need to tell our data flow automation where the file is going to be input in our data flow. And that is our source node. So all files that are coming from Autodesk, we want to put right here into our source node. That way, each one is run through the data flow and finally exported to our table.

So the port has a long name that is essentially the name of our source node, and then there are just a few default values we have to fill in. And then all we have to do is create our automation and enable it. And that is it.

Every single time we ingest data from ACC now, and we can set it up on a daily timer, that data will get imported, run through the data flow, and exported to your data table. And that is how we can create a data pipeline from scratch and fully back up your data into your destination of choice. And that's it. I'll pass it back to Chad now.
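Conceptually, the finished automation is just this loop: import on a timer, run the flow, export. The functions here are stand-ins for the Toric steps shown above, wired together to show the shape of the whole pipeline.

```python
import time

# Placeholder steps, one per stage of the automated flow described above.
def import_from_autodesk() -> list[dict]:
    return [{"id": 1}]  # stand-in for the incremental import

def run_data_flow(rows: list[dict]) -> list[dict]:
    return rows  # stand-in for the transformation nodes

def export_to_table(rows: list[dict]) -> None:
    print(f"exported {len(rows)} rows")  # stand-in for the write-table step

while True:
    export_to_table(run_data_flow(import_from_autodesk()))
    time.sleep(24 * 60 * 60)  # daily timer: wait a day before the next run
```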

CHAD BRAUN: Just a couple of last notes on the actual demonstration environment, guys. First of which is that when we're talking about data pipelining, that's, in most cases, writing from a source or multiple sources into a warehouse. But because Toric is an agnostic data movement tool, we can actually move data from source to source.

So what that means is what you're looking at here, in a similar capacity to what Austin was just performing with data pipelines, we're actually taking data from, in this case, a data warehouse and writing it back into the Autodesk Construction Cloud. So we use the same nodes in the same way. We're really just transforming the data and getting it prepared or put into a schema that matches the requirements as dictated by the Autodesk Construction Cloud. So in this case for RFIs, if we were to write RFIs from a system to ACC, we would just need the container ID, which is actually just the project unique ID, the status, and of course, the title.
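For readers wondering what that write-back looks like at the API level, here's a hedged sketch of an RFI POST carrying the three fields Chad names: container ID, status, and title. The endpoint URL and payload shape are assumptions for illustration only; check the ACC API documentation for the real contract.

```python
import requests

# Placeholders; the endpoint path below is an assumed shape, not verified.
container_id = "your-project-container-id"
payload = {"title": "Clarify slab edge detail", "status": "open"}

response = requests.post(
    f"https://developer.api.autodesk.com/construction/rfis/v2/containers/{container_id}/rfis",
    headers={"Authorization": "Bearer <access token>"},
    json=payload,
)
print(response.status_code)
```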

Now, this isn't limited to writing from a warehouse. That's just the example that I've got here. You can actually write from a project management tool to ACC. You could write from ACC to an ERP. The only limitation of this tool set is having the actual APIs available from whatever it is we're writing from and whatever we're writing into.

So there's some pretty cool use cases here. We'll actually touch on one in just a moment when we talk about customer stories. But just understand that, again, with the data being agnostic within Toric itself, the transformations are really at your fingertips: writing to a warehouse, writing to a different source, writing to a table, as well as utilizing these transformations for a visualization. And if you were to utilize something like Toric, it should be mentioned that there is actually a visualization or BI tool built out within the system that enables your teams to associate model data with other data sets.

So in this example here, I've got a project phasing type build out for my model. So I can actually click in here and click through my five phases. As I do so, the model will start to build and change accordingly.

This is great not only for owners and developers to watch their building being built in real time, but also for presentation purposes: if you're a general contractor trying to win new business, it enables your clients or your owners to actually interact with the model and understand where they might be at any given time during the actual project progress.

Of course, this isn't limited to project phasing. If you have cost codes and you want to tie those to, or associate those with, families or elements within a model, or if you wanted to take, say, a schedule from P6 or something along those lines, you'd be able to associate actual schedules with the model elements themselves. So Toric really enables teams to start doing 4D, 5D, 6D-type workflows behind the scenes, in those data flows associating data tables with other data tables, and then actually seeing the result in real time for your project teams, who don't necessarily need to know what Toric is or how it works behind the scenes. These dashboards can be passed out to as many people as you'd like.

So if you wanted to pass out 10,000 dashboards, it would be on the house. And in addition to that, we have more or less all of the visualization tools that you could ever need, including things like, if I wanted to do a timeline, or if I wanted to filter by responsible contractor, or if I wanted to click into a particular cost impact. And of course, this is all interactive. If I click into RFIs associated with a particular person, the dashboard updates accordingly with where I'm clicking.

If I click out: submittals, observations, safety, anything that you want. So long as the data exists, we can build out the visualizations for it. We can pipeline it. We can write it from source to source, so long as the APIs accommodate whatever it is you're hoping to achieve.

So with that being said here, we'll end it with a couple of customer use cases, the first of which is going to be Gamuda. So Gamuda is a really good use case in that Gamuda is multifaceted in what they do with Toric. Their use case is also pretty typical for what we're seeing general contractors asking of Toric and accomplishing for their project teams.

So in particular, Gamuda was asking for real-time data. That was their biggest hangup. They had a more classical data build-out, a data pipeline with hard-coded transformations writing into what is eventually Google BigQuery.

They actually built out a data team for a specific data pipeline, an entire team for one data pipeline that they were hoping to achieve. And they struggled ingesting from all of these construction-specific sources. That seems to be the impetus for most of our conversations: construction tools, being construction tools, are very specific tool sets. In most cases they're not available for integrating with a platform you might pick out of a Google search.

So in this case, what they were doing was plugging into P6, ACC, as well as SAP, and then utilizing our data pipelines to make a repeatable, scalable data pipeline that they could use from project to project and of course get that data in real time. In addition to Gamuda, Whiting-Turner was doing something very similar.

Of course, in this use case, the reason I chose this slide is that Whiting-Turner, as you can see, is utilizing both BuildingConnected and PlanGrid. So what they're doing is automating an ingestion from those sources, routing it through Toric, performing that data cleansing, and then writing it to an Azure data lake, and eventually Power BI.

You'll also notice that neither Gamuda nor Whiting-Turner are necessarily utilizing Toric's BI tools. It's never going to hurt our feelings if you'd like to route your data to a lake or warehouse and then utilize a Tableau or a Power BI on the opposite end. But if you did want to utilize our BI or analytics tools, you can look at somebody like Commodore.

Commodore is a mid-size GC out of Boston. They have a data team of one person. This is a really interesting use case, in that she's actually plugging into all of these sources and creating the data pipelines herself. She is writing to, eventually, a data lake, but she's also starting to utilize some of the actual visualization capacity within Toric. You can see on the right-hand side their quantity takeoff from a model, done simply by ingesting that model from Revit into Toric itself.

And then, given the properties in the data table, it's actually allowing her to perform calculations for, in this case, total steel tonnage. She's also producing some safety-type reports, incident reports, and really doing a lot of interesting, configurable visualizations and dashboards that are built off of, behind the scenes, the data pipelines that she's running. She's really connecting the visualizations with the pipelines, which is an interesting use case in that the data comes full circle.

Coming soon, it should also be mentioned: Toric GPT. I think, like everybody else in construction technology, everyone is thinking about AI. Everybody is thinking about a GPT model.

Toric is incredibly well positioned when it comes to what GPT can be for us and what it can be for our clients and our customers. In short, we're building a large language model off of the data that customers are allowing us to access. We have some of the largest data sets in construction at our fingertips that we're training the model on.

And eventually, our intention is to allow folks to go in and ask the bot: how many RFIs are overdue this week? How many RFIs are over two weeks old? And as the model learns, that's when the AI piece is really going to come into play.

If you filter it down to maybe just your data sets, it'll start to average out things like how long an RFI typically takes to get responded to. If it's taking two weeks, 2.5 weeks, it'll start to flag the data that's maybe over 2.5 weeks, and you'll start to be able to make informed decisions based off of trends within your particular projects, or, if you'd like, to see globally what the construction data is looking like, how long it's taking people across the planet to respond to particular RFIs.
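That flagging logic is simple enough to sketch without any AI at all: compute each RFI's days open, average it, and flag anything over 2.5 weeks. The column names below are invented for illustration.

```python
import pandas as pd

rfis = pd.DataFrame({
    "created":  pd.to_datetime(["2023-07-01", "2023-07-03"]),
    "answered": pd.to_datetime(["2023-07-10", "2023-07-25"]),
})
rfis["days_open"] = (rfis["answered"] - rfis["created"]).dt.days
rfis["flag"] = rfis["days_open"] > 2.5 * 7  # over 2.5 weeks
print(rfis["days_open"].mean(), rfis["flag"].tolist())
```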

The eventual intention here is actually to be able to build visualizations from these GPT models, meaning a project manager being able to go in and say, hey, give me my forecast for the next three weeks based off of my estimated cost to complete on this particular, or maybe even a previous, project. Maybe we're getting into the final phases of a high school that we're building. It's a football stadium. We've built one before.

We want to correlate the two, or understand the two together. That's really the intention of Toric GPT, and again, we're well positioned for it in that we have access to some of the largest data sets specific to construction, and that we will remain specific to construction with this particular GPT model.

If you go to ChatGPT now and you ask it about your RFIs, it'll probably respond with something about hot dogs. This is going to actually, of course, be completely specific to construction built off of your data sets if you want it to be, or, of course, just utilizing global data sets to understand more about how your projects could be performing better, how you could increase your margins, whatever it might be that you're hoping to accomplish.

So at the end of the day, why Toric? Why do our customers use us? I'm not going to read this word for word, but at the end of the day: nine times faster getting data through your near real-time, or in some cases real-time, pipelines.

Five times savings with data movement: the ability to do this in no code, to plug directly into those sources, to run it in real time, and to template these data pipelines so you can run them time and time again, eventually maybe even saving headcount when it comes to your data engineering team.

Six times productivity, again with all of those aforementioned points: being able to make your teams more efficient and really optimize your data pipelines and your workflows so that they're not managing the APIs and they're not managing the integrations. Toric is doing that for them, so that they're spending their time where it needs to be spent, either building new data pipelines or making, again, data-related decisions. As well as 12 times the volume, really enabling your team to plug into all of those construction-specific sources, route that data through Toric, clean it, cleanse it, normalize it, and send it to its eventual destination. Thank you.

Marketo
We use Marketo to send you more timely and relevant email content. To do this, we collect data about your online behavior and your interaction with the emails we send. Data collected may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, email open rates, links clicked, and others. We may combine this data with data collected from other sources to offer you improved sales or customer service experiences, as well as more relevant content based on advanced analytics processing. Marketo Privacy Policy
Doubleclick
We use Doubleclick to deploy digital advertising on sites supported by Doubleclick. Ads are based on both Doubleclick data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Doubleclick has collected from you. We use the data that we provide to Doubleclick to better customize your digital advertising experience and present you with more relevant ads. Doubleclick Privacy Policy
HubSpot
We use HubSpot to send you more timely and relevant email content. To do this, we collect data about your online behavior and your interaction with the emails we send. Data collected may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, email open rates, links clicked, and others. HubSpot Privacy Policy
Twitter
We use Twitter to deploy digital advertising on sites supported by Twitter. Ads are based on both Twitter data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Twitter has collected from you. We use the data that we provide to Twitter to better customize your digital advertising experience and present you with more relevant ads. Twitter Privacy Policy
Facebook
We use Facebook to deploy digital advertising on sites supported by Facebook. Ads are based on both Facebook data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Facebook has collected from you. We use the data that we provide to Facebook to better customize your digital advertising experience and present you with more relevant ads. Facebook Privacy Policy
LinkedIn
We use LinkedIn to deploy digital advertising on sites supported by LinkedIn. Ads are based on both LinkedIn data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that LinkedIn has collected from you. We use the data that we provide to LinkedIn to better customize your digital advertising experience and present you with more relevant ads. LinkedIn Privacy Policy
Yahoo! Japan
We use Yahoo! Japan to deploy digital advertising on sites supported by Yahoo! Japan. Ads are based on both Yahoo! Japan data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Yahoo! Japan has collected from you. We use the data that we provide to Yahoo! Japan to better customize your digital advertising experience and present you with more relevant ads. Yahoo! Japan Privacy Policy
Naver
We use Naver to deploy digital advertising on sites supported by Naver. Ads are based on both Naver data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Naver has collected from you. We use the data that we provide to Naver to better customize your digital advertising experience and present you with more relevant ads. Naver Privacy Policy
Quantcast
We use Quantcast to deploy digital advertising on sites supported by Quantcast. Ads are based on both Quantcast data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Quantcast has collected from you. We use the data that we provide to Quantcast to better customize your digital advertising experience and present you with more relevant ads. Quantcast Privacy Policy
Call Tracking
We use Call Tracking to provide customized phone numbers for our campaigns. This gives you faster access to our agents and helps us more accurately evaluate our performance. We may collect data about your behavior on our sites based on the phone number provided. Call Tracking Privacy Policy
Wunderkind
We use Wunderkind to deploy digital advertising on sites supported by Wunderkind. Ads are based on both Wunderkind data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Wunderkind has collected from you. We use the data that we provide to Wunderkind to better customize your digital advertising experience and present you with more relevant ads. Wunderkind Privacy Policy
ADC Media
We use ADC Media to deploy digital advertising on sites supported by ADC Media. Ads are based on both ADC Media data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that ADC Media has collected from you. We use the data that we provide to ADC Media to better customize your digital advertising experience and present you with more relevant ads. ADC Media Privacy Policy
AgrantSEM
We use AgrantSEM to deploy digital advertising on sites supported by AgrantSEM. Ads are based on both AgrantSEM data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that AgrantSEM has collected from you. We use the data that we provide to AgrantSEM to better customize your digital advertising experience and present you with more relevant ads. AgrantSEM Privacy Policy
Bidtellect
We use Bidtellect to deploy digital advertising on sites supported by Bidtellect. Ads are based on both Bidtellect data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bidtellect has collected from you. We use the data that we provide to Bidtellect to better customize your digital advertising experience and present you with more relevant ads. Bidtellect Privacy Policy
Bing
We use Bing to deploy digital advertising on sites supported by Bing. Ads are based on both Bing data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Bing has collected from you. We use the data that we provide to Bing to better customize your digital advertising experience and present you with more relevant ads. Bing Privacy Policy
G2Crowd
We use G2Crowd to deploy digital advertising on sites supported by G2Crowd. Ads are based on both G2Crowd data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that G2Crowd has collected from you. We use the data that we provide to G2Crowd to better customize your digital advertising experience and present you with more relevant ads. G2Crowd Privacy Policy
NMPI Display
We use NMPI Display to deploy digital advertising on sites supported by NMPI Display. Ads are based on both NMPI Display data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that NMPI Display has collected from you. We use the data that we provide to NMPI Display to better customize your digital advertising experience and present you with more relevant ads. NMPI Display Privacy Policy
VK
We use VK to deploy digital advertising on sites supported by VK. Ads are based on both VK data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that VK has collected from you. We use the data that we provide to VK to better customize your digital advertising experience and present you with more relevant ads. VK Privacy Policy
Adobe Target
We use Adobe Target to test new features on our sites and customize your experience of these features. To do this, we collect behavioral data while you’re on our sites. This data may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, your IP address or device ID, your Autodesk ID, and others. You may experience a different version of our sites based on feature testing, or view personalized content based on your visitor attributes. Adobe Target Privacy Policy
Google Analytics (Advertising)
We use Google Analytics (Advertising) to deploy digital advertising on sites supported by Google Analytics (Advertising). Ads are based on both Google Analytics (Advertising) data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Google Analytics (Advertising) has collected from you. We use the data that we provide to Google Analytics (Advertising) to better customize your digital advertising experience and present you with more relevant ads. Google Analytics (Advertising) Privacy Policy
Trendkite
We use Trendkite to deploy digital advertising on sites supported by Trendkite. Ads are based on both Trendkite data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Trendkite has collected from you. We use the data that we provide to Trendkite to better customize your digital advertising experience and present you with more relevant ads. Trendkite Privacy Policy
Hotjar
We use Hotjar to deploy digital advertising on sites supported by Hotjar. Ads are based on both Hotjar data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Hotjar has collected from you. We use the data that we provide to Hotjar to better customize your digital advertising experience and present you with more relevant ads. Hotjar Privacy Policy
6 Sense
We use 6 Sense to deploy digital advertising on sites supported by 6 Sense. Ads are based on both 6 Sense data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that 6 Sense has collected from you. We use the data that we provide to 6 Sense to better customize your digital advertising experience and present you with more relevant ads. 6 Sense Privacy Policy
Terminus
We use Terminus to deploy digital advertising on sites supported by Terminus. Ads are based on both Terminus data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that Terminus has collected from you. We use the data that we provide to Terminus to better customize your digital advertising experience and present you with more relevant ads. Terminus Privacy Policy
StackAdapt
We use StackAdapt to deploy digital advertising on sites supported by StackAdapt. Ads are based on both StackAdapt data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that StackAdapt has collected from you. We use the data that we provide to StackAdapt to better customize your digital advertising experience and present you with more relevant ads. StackAdapt Privacy Policy
The Trade Desk
We use The Trade Desk to deploy digital advertising on sites supported by The Trade Desk. Ads are based on both The Trade Desk data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that The Trade Desk has collected from you. We use the data that we provide to The Trade Desk to better customize your digital advertising experience and present you with more relevant ads. The Trade Desk Privacy Policy
RollWorks
We use RollWorks to deploy digital advertising on sites supported by RollWorks. Ads are based on both RollWorks data and behavioral data that we collect while you’re on our sites. The data we collect may include pages you’ve visited, trials you’ve initiated, videos you’ve played, purchases you’ve made, and your IP address or device ID. This information may be combined with data that RollWorks has collected from you. We use the data that we provide to RollWorks to better customize your digital advertising experience and present you with more relevant ads. RollWorks Privacy Policy

Are you sure you want a less customized experience?

We can access your data only if you select "yes" for the categories on the previous screen. This lets us tailor our marketing so that it's more relevant for you. You can change your settings at any time by visiting our privacy statement

Your experience. Your choice.

We care about your privacy. The data we collect helps us understand how you use our products, what information you might be interested in, and what we can improve to make your engagement with Autodesk more rewarding.

May we collect and use your data to tailor your experience?

Explore the benefits of a customized experience by managing your privacy settings for this site or visit our Privacy Statement to learn more about your options.