Description
Key Learnings
- Learn about various commercial low-code/no-code platforms.
- Learn about various open-source data-flow/visual programming tools.
- Learn how you can access design data in Autodesk Platform Services from these platforms and tools.
Speakers
- Petr Broz: Petr is a developer advocate at Autodesk. After joining the company in 2011 as a software developer, he contributed to a range of web-based platforms and applications such as 123D Online and Tinkercad. In 2018 he transitioned into the developer advocacy team, where he has been helping customers create cutting-edge solutions using Autodesk Platform Services, with a primary focus on visualization and AR/VR.
- Jaime Rosales Duque: Jaime Rosales is a dynamic, accomplished Sr. Developer Advocate with Autodesk, highly regarded for 8+ years of progressive experience in software development, customer engagement, and relationship building for industry leaders. He's part of the team that helps partners and customers create new products and transition to the cloud with the use of Autodesk's platform, Forge. He joined Autodesk in 2011 through the acquisition of Horizontal Systems, the company that developed the cloud-based collaboration system now known as BIM 360 Glue ("the Glue"). He was responsible for developing all the add-ins for BIM 360 Glue, using the APIs of various AEC desktop products. He is currently empowering customers with the use of Autodesk's Forge platform throughout the world, through hosted events such as Forge Accelerators, AEC Hackathons, and VR & AR Hackathons. He has recently been involved in the development of an AWS Quick Start to support Forge applications.
PETR BROZ: Hello. Welcome to Autodesk University 2024. In today's talk, titled "Access Your Design Data from Low-Code/No-Code Platforms," we will explore and give you some ideas about the options you have in building automated workflows around your design data and custom integrations with external services, all that with little or no coding experience.
My name is Petr Broz. I'm a developer advocate at Autodesk, focused on Autodesk Platform Services. I joined Autodesk in 2011 as a software developer and worked on several different web-based products and services. And since 2018, I've been on the Developer Advocacy team, helping customers build amazing solutions using APS. Now, Jaime, do you want to tell us a little bit about yourself as well?
JAIME ROSALES: Yeah, sure thing. And this is funny, because I also joined the company in 2011 through an acquisition; the product was BIM 360 Glue, which in some ways, I think, is still alive. I was part of the core engineering team for that group. And I have one more year ahead of you on the Developer Advocacy team, which I joined in 2014. I'm based out of New York City, a developer advocate as well. And happy to be here today.
PETR BROZ: Awesome. Thank you. All right. As for the agenda for our talk today, we will start by quickly explaining what low-code/no-code is and review some of the existing platforms and solutions that are available in this space. Next, we'll spend a little bit of time reviewing some of the existing options you have for building these automated workflows and integrations around your design data; those are ready-to-use solutions that you don't really need to prepare in any way. But the main part of this talk will be dedicated to the custom connection section, where we give you some examples of how you can prepare and build these connectors and custom flows using external tools, accessing your design data through Autodesk Platform Services.
Now, before we jump into the interesting topic, we need to take a quick stop here and introduce the obligatory legal slide, our safe harbor statement. So during today's talk, you will hear forward-looking statements. But these are in no way guarantees of product or platform roadmaps. So please do not make any purchase or strategic decisions based upon these statements. All right, Jaime, over to you.
JAIME ROSALES: So now let's get started with low-code/no-code. When you hear this term, you're going to start thinking: what do they mean by low-code/no-code? Low-code/no-code lets you build applications using simple visual tools; think of something like drag and drop, or creating data-flow diagrams.
And the main idea is that there is little to no coding at all; everything is done with visual tools. The reason why this is great is that you can test ideas a little bit quicker, debug easily, and get apps built faster. Even if you're not a developer, you can create custom workflows and integrations using the platform, in this case Autodesk Platform Services.
Now, let's look at some examples of low-code/no-code apps available for you to start using today. The first one, and one of our favorites, is Power Automate: automating workflows with low code. This platform helps you automate workflows across apps and services, seamlessly integrating with tools like Microsoft 365, Dynamics, and more.
What can you do with this? You can automate repetitive tasks, trigger notifications through webhooks, and sync data, all without needing any code. And with pre-built templates and a drag-and-drop interface, it's easy to create custom workflows that suit your needs. We will look at it in more detail later in this presentation, in an awesome demo video.
The next one is Workato. This platform makes it easy to automate business workflows and connect various apps, both cloud and on-premises, so IT and business users alike can use it. It comes with pre-built automation recipes, which you can customize. It also supports complex workflows and offers enterprise-grade security, which is perfect for large-scale automations.
The next one is a rapid application development platform called OutSystems. This platform lets you develop and deploy apps quickly with minimal coding. It works for both web and mobile apps, combining visual design tools with full-stack capabilities. It comes with pre-built templates, connectors, and integrations to speed things up. And it's designed for scalability, offering enterprise-level security and performance for even the most complex applications.
One of my favorites: Node-RED. Node-RED is a low-code programming tool that makes it easy to connect to hardware devices. So if you're designing for IoT and want to build all kinds of flexible automations, this is the tool for you. You can build flows using pre-built nodes that handle data, devices, and services. It's perfect for rapid prototyping and creating complex workflows without deep coding knowledge. It is powerful, versatile, and great for both developers and non-developers.
And the next one is Postman Flows. But before I talk a bit about Postman Flows, we have to mention Postman itself. Those of you in the room have probably heard of Postman before. It's an API development toolbox, a powerful tool that lets you build, test, and document APIs. It is widely used by developers and allows you to send API requests and inspect responses, automate testing with scripts, and collaborate on API development in shared workspaces.
But now, Postman offers what is called Postman Flows: a visual way to work with these APIs. Postman Flows makes API testing even easier. You can build API workflows with a simple drag-and-drop interface. It helps you connect requests, pass data, and create complex logic without writing code.
It is perfect for automating tests or setting up complex API interactions in just a few clicks. With Postman Flows, you can go from basic API testing to full API automation, all through the visual interface. And we will show you a bit more about Postman Flows in the next couple of slides. Over to you, Petr Broz.
PETR BROZ: Thank you, Jaime. All right. So let's take a look at some of the existing options you have at your disposal to start exploring these automated workflows and custom integrations, connecting your design data to external services.
The first example we want to mention here is ACC Connect. ACC stands for Autodesk Construction Cloud. This is an Autodesk product built on top of our platform, Autodesk Platform Services; it is really a collection of products to manage your construction projects, from the design phase through architecture, engineering, and construction, all the way to operations.
And ACC Connect lets you very easily integrate your projects and design data in ACC with external services. Let's play a quick marketing video just to get an idea of what ACC Connect is all about.
[VIDEO PLAYBACK]
- With so many software programs used on a project, keeping them up to date can seem like a full-time job. Autodesk Construction Cloud Connect makes it easy for anyone to automate how information is exchanged. Let's take a look at a connection in action.
When issues arise on the job site, the field team records them in PlanGrid, including details such as the location, description, and photos. As soon as an item is created, Connect sees the update and adds a new row to the project schedule. The project manager reviews the list of tasks in the office and adds critical delivery data, like due date and cost impact. These changes are automatically shared back to PlanGrid and updated directly on the task for the assignee.
Connect integrates the advanced technology in Autodesk Construction Cloud with the most popular software applications that you use for document storage, analytics, customer management, and more. Custom integrations become as easy as drag and drop with no coding required. Connect, integration made easy.
[END PLAYBACK]
PETR BROZ: All right. So this is what ACC Connect is from the end user's perspective. Now, from a technical point of view, ACC Connect is a collection of components for the platform that Jaime already mentioned, Workato, that you can use to build automated flows that are either triggered manually or triggered automatically by external events. For example, you can have your Workato flow triggered whenever a file has been added to your project. And you can use these building blocks provided by ACC Connect in your Workato recipe to do things like get information about project data and design data, generate reports, aggregate information from your projects into spreadsheets, and publish this information to other systems.
Let's take a look at what one of these recipes might look like. In this case, we have two folders with PDF documents sitting in our SharePoint, and we want to synchronize this information with our ACC project.
Now, using ACC Connect components inside Workato, we can simply build a recipe that will traverse all the new folders and their content and synchronize this data with a project inside Autodesk Construction Cloud. We run the recipe here, and now you can see that the two folders with PDF documents are available inside ACC as well.
Another example is our Data Exchange Connector. Now, data exchanges are a relatively new concept at Autodesk. These are already available in various Autodesk products, including the already mentioned Autodesk Construction Cloud.
A data exchange is basically a container that allows you to share just a subset of your design data with only the right people, making it accessible in potentially different applications. This is a feature of several of our products. And the Data Exchange Connector can then be used in your Power Automate flows to access the design elements and their properties through a very simple connection. So you can start extracting information from the design elements inside data exchanges and use it in your customized Power Automate flows.
Here, we have another demo to showcase this functionality. So in this case, we have a data exchange available in Autodesk Construction Cloud. And we have an existing Power Automate flow.
Now, if we look at the flow, this flow is triggered manually in this particular instance. We use the Data Exchange Connector to get properties out of the elements included in our data exchange, and then we export this information into a CSV file. Here you can see the building block provided by the Data Exchange Connector that allows us to select the file we want to extract information from. And then, once we have collected the data in a CSV file, we publish the CSV file to SharePoint.
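To illustrate those last two steps, here is a small sketch of the equivalent logic in Python. The property rows are made-up examples of the kind of element/property data the Data Exchange Connector returns; only the CSV-writing part is standard.

```python
# A sketch of collecting element properties into a CSV file, as the Power
# Automate flow does before publishing to SharePoint. The rows below are
# hypothetical examples of extracted design properties.
import csv

properties = [
    {"element": "Wall-001", "property": "Height", "value": "3000 mm"},
    {"element": "Wall-001", "property": "Material", "value": "Concrete"},
]

with open("exchange_properties.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["element", "property", "value"])
    writer.writeheader()     # column headers
    writer.writerows(properties)  # one row per extracted property
```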
All right. So let's see, we have our SharePoint site ready. We can save our workflow and test it out.
So here, we will start a new run of our Power Automate flow. So this is now in progress; we can see in the history here that our workflow is currently running. We can see that the initial setup has been completed, and we're now waiting for the properties from our data exchange. And as soon as these are ready, we will populate this property information into the output CSV file.
Now, you can see our workflow has completed successfully, so our CSV file should be available in SharePoint. We can take a look: going back to our testing folder, we can now see the output CSV file that was just added. And after inspecting the CSV file, we can see all the design properties that were extracted from that specific data exchange in Autodesk Construction Cloud.
All right. And the final example of existing options you have available to start building these custom integrations and workflows is Tandem Connect. Autodesk Tandem is yet another Autodesk product, also built on top of Autodesk Platform Services, used for digital twins and facility operations. And Tandem Connect is an integration platform that allows you to connect your design information inside Tandem with other enterprise systems, building management systems, or IoT data in a very simple, elegant, visual way. And these workflows can also be automated.
Basically, with Tandem Connect, the integration starts with a data pipeline. You have a very nice, simple visual interface for building your data pipelines, where you connect building blocks provided by the Tandem Connect library. These building blocks can be used to access information from different systems, including IBM Maximo and Dynamics 365, or through standard protocols such as OPC or BACnet.
And then once your data pipeline is ready, you can deploy it either to the cloud or to the edge, closer to your devices and your facilities, or you can use a hybrid approach combining these two environments.
Now, once again, let's take a quick look at how Tandem Connect works. In this demo, we start by creating a scheduled data stream. This will basically be a means for us to trigger our flow automatically, let's say every 30 minutes. It will not output any specific JSON data; it is really just there to send a signal, in this case every 30 minutes, to start our data pipeline.
As the next step, we will use a basic HTTP service block to request information from an external weather service API. In this case, we use the URL of an external, open weather service, we provide authentication information, and we're good to go.
As the next step, we will add a little bit of code, in this case just to make sure that the JSON data we get from the external weather service is processed into a shape that we can later consume within Tandem. As you can see, you still have the flexibility to write custom code if needed. In this case, again, we're just making sure that the data is structured in the right way for Tandem.
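Purely as an illustration, that reshaping step might look something like the following. The input field names are hypothetical, patterned after a typical open weather API, and the flat output shape is our own example, not something Tandem Connect prescribes.

```python
# An illustrative version of the "little bit of code" step: flatten a raw
# weather-service response into simple fields for the rest of the pipeline.
def reshape(raw: dict) -> dict:
    return {
        "temperature": raw["main"]["temp"],      # assumed response layout
        "humidity": raw["main"]["humidity"],
        "timestamp": raw["dt"],                  # Unix timestamp of the reading
    }

# Example input and output:
print(reshape({"main": {"temp": 21.5, "humidity": 40}, "dt": 1730000000}))
# {'temperature': 21.5, 'humidity': 40, 'timestamp': 1730000000}
```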
And finally, we're adding our Tandem Connector. And by finishing our data pipeline with the Tandem Connector, we're making sure that the weather information that we're collecting in this pipeline will be available to the Tandem product, the Tandem front end. And we'll show you an example of that in just a second.
All right. And with the pipeline ready, we can save it and deploy it. After that, we can switch over to Tandem itself, the Tandem UI, and start consuming this weather information in our digital twin, for example visualizing it in the form of heat maps or in other ways.
You can see we are already receiving temperature and humidity information from the weather service. And here in Tandem, we already see our external weather connection available, implemented using our custom data pipeline in Tandem Connect. And now, we can start mapping the information coming from the custom pipeline onto our digital twin.
All right. So with that, we can now move to the really interesting part of our talk today. And that is telling you about how you can use our platform, Autodesk Platform Services, to actually prepare these custom connectors so that you or your colleagues in your company can start building these automated flows or custom integrations, connecting your data, your design data and project data to other enterprise systems and services. Jaime, over to you.
JAIME ROSALES: Yes. OK. So let me hit the Play button here. Give me one second.
All right. So as I described before, we're going to be focusing on the use of Postman Flows. And I'm going to be pausing and playing for just a bit. The first call we're focusing on right now is a Send Request block that generates an access token with the platform.
So let me do a quick pause before this happens. As we can see here, our flow starts with an HTTP request that references a collection we have available, in this case the two-legged token request. The two-legged token, as we know, needs a couple of parameters to be sent along: the API host, which is the base URL we're pointing to, the client ID, the client secret, and also some scopes. That's why we added this string on the left side of your screen with a couple of scopes that allow data reading and also bucket reading.
We set up an environment, in this case a test environment, that holds the client ID and client secret we have provided. If you have used Postman in the past, you know this is a simpler way to avoid hard-coding the credentials every single time; you keep a quick reference to them instead. And then, if the call is successful, we get the token as JSON output.
So let's go ahead and do that. So we hit Play. And now, we have an access token.
The body of the response contains the value of the access token and its expiration time, which we're going to use in our subsequent calls, for example to extract data about the buckets, about the objects that live within a bucket, and so on. And since each one of those calls is going to need the value of the token, the simplest approach is to assign it to a variable.
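For reference, that same two-legged token request can be sketched in a few lines of Python. This is a minimal sketch assuming the APS v2 OAuth endpoint and the data/bucket read scopes mentioned above; the environment variable names are our own convention.

```python
# Minimal sketch of the two-legged (client credentials) token request,
# assuming the APS v2 OAuth endpoint. Requires the `requests` package.
import os
import requests

APS_CLIENT_ID = os.environ["APS_CLIENT_ID"]          # your app's client ID
APS_CLIENT_SECRET = os.environ["APS_CLIENT_SECRET"]  # your app's client secret

resp = requests.post(
    "https://developer.api.autodesk.com/authentication/v2/token",
    auth=(APS_CLIENT_ID, APS_CLIENT_SECRET),  # HTTP Basic auth, as in Postman
    data={"grant_type": "client_credentials", "scope": "data:read bucket:read"},
)
resp.raise_for_status()
token = resp.json()["access_token"]  # reused by all subsequent calls
```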
At this point, we reference another Send Request block, which retrieves the bucket details. But as I mentioned before, let's assign that token value to a variable so it's simpler to pass on to the next calls we're going to use in this flow. If this call is successful, we're going to obtain the buckets that live within that account. In this case, we have this bucket referenced here, whose bucket key we're going to use later in the next call, which looks into the objects that live within that specific bucket.
For this, we will need to pass in the value of the bucket key and, again, the variable that we defined before, in this case the token. Once this is successful, we're going to get as a result all the objects that live within that bucket. So let's wait for the flow to finish.
So now, each one of those objects has information about the name of the object, the location where it is stored, and also the size of the object. And in Postman Flows, we also have the capability of evaluating some of those results.
And this is what we're going to do next. We're going to evaluate the values that we obtained from the details of all the objects that live within that bucket, and we're going to analyze specifically the size. And I know we said low code, but also no code; in this case, we're going to do a little bit of code.
And even if you cannot code at this level, there are plenty of helpers now, like Copilot and ChatGPT, that can help you put together a simple function to traverse all the values that live within the object details (let me pause this) to get a list of all those object sizes. And then, because seeing raw sizes is one thing, what about a visual representation? One of the options that we can select as an output, as you see, is a bar chart.
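For reference, here is a sketch of the object-listing call plus the size-collecting step, assuming the OSS v2 endpoint and its usual response shape with an "items" list. The bucket key is a made-up example, and `token` comes from the earlier token sketch.

```python
# Sketch of listing the objects in a bucket and collecting their sizes,
# assuming the OSS v2 Data Management endpoint.
import requests

headers = {"Authorization": f"Bearer {token}"}
bucket_key = "my-sample-bucket"  # hypothetical bucket key from the buckets call

resp = requests.get(
    f"https://developer.api.autodesk.com/oss/v2/buckets/{bucket_key}/objects",
    headers=headers,
)
resp.raise_for_status()

# Traverse the object details and pull out name/size pairs for charting.
sizes = {item["objectKey"]: item["size"] for item in resp.json()["items"]}
for name, size in sizes.items():
    print(f"{name}: {size} bytes")
```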
Now, I was not able to catch that pause before because I wanted to do some sort of drum roll. But you get the idea. Basically, we go through the flow from start to finish, and as a result we get a graphical representation of the sizes of all the objects that we have within our specific bucket. Now, over to you, Mr. Broz.
PETR BROZ: Thank you. All right. Now, the other option we want to explain here is Power Automate. Again, this is a very popular low-code/no-code platform built by Microsoft.
Now, when thinking about Power Automate, you can think of your integration options in two layers. At the lower level, you basically rely on building blocks that are part of the Power Automate platform by default: built-in connectors, actions, and triggers.
These connectors are really collections of actions and triggers. Triggers are building blocks that let you start your flows in reaction to certain events, and actions are the actual processes you want to execute in reaction to those triggers. That's the first example we will take a look at.
Now, oftentimes when you're building Power Automate flows, or automated flows in other systems like Workato, you will want to trigger them based on an external event. And this is something that our platform, Autodesk Platform Services, can help with, because one of our services, the Webhooks API, can be configured to call and start one of your flows in reaction to certain events, for example when a file is added to a project or when a new version of a design is available.
Now, in order to set up webhooks in our platform, you do need a little bit of experience with REST APIs. However, we have a little utility that makes the management of webhooks much easier: an extension for Visual Studio Code that you can find more information about online, or you can reach out to us, the Developer Advocacy team, if this is something you're interested in. And we'll show you in a demo how this Visual Studio Code extension can be used to manage your webhooks.
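To give an idea of what the extension does under the hood, here is a sketch of registering a webhook with the APS Webhooks API for the event of a new design version being added to a folder. The callback URL and folder URN below are hypothetical placeholders, and the exact event name and body shape are assumptions based on the Webhooks API.

```python
# Sketch: register a webhook so that a "dm.version.added" event in a given
# folder calls the Power Automate trigger URL.
import requests

token = "..."  # a token with the appropriate webhook scopes (see earlier sketch)
hook = {
    "callbackUrl": "https://prod-00.westus.logic.azure.com/workflows/...",  # URL from Power Automate (placeholder)
    "scope": {"folder": "urn:adsk.wipprod:fs.folder:co.abc123"},            # folder to watch (placeholder)
}
resp = requests.post(
    "https://developer.api.autodesk.com/webhooks/v1/systems/data/events/dm.version.added/hooks",
    headers={"Authorization": f"Bearer {token}"},
    json=hook,
)
resp.raise_for_status()
```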
All right. Let's take a look at another example. So in this case, we will create a new Power Automate flow. We will skip any special settings, starting with an empty flow. So we start with a trigger.
In this case, we will look for a built-in trigger called "When an HTTP request is received." So this flow will be triggered when somebody calls a certain URL. Here, we specify that anybody can start this flow. And we add an action, which in this case will be very simple: we just want to send an email when a certain event happens.
So we find another built-in block called "Send an email." We specify the email address that the email should be sent to, and we specify the subject and body of the email. For now, we're using very generic content; we'll improve this later. And this is an extremely simple flow.
Now, when we save the flow, the trigger will actually generate the URL that we can call to start this process. So we have our HTTP URL. And we will want to set it up so that somebody calls this URL when a certain event happens in the Autodesk Platform Services ecosystem.
And for this, we go to our VS Code extension. Here, we have a list of data management webhook events that we can create records for. So in this case, what we say is: whenever a new version of a design is added to our project, we want a webhook to fire. We create the webhook, and we want it to call the URL provided by Power Automate to start that flow.
We do that by right-clicking here in Visual Studio Code and selecting Create new webhook. We provide the URL that should be called when this event happens. And one more thing we need to do is provide a scope for our webhook: since we're listening for the event of a new design version being added, we need to specify the ID of the folder where we want to observe these changes.
Now, to get the folder ID, there are different ways. For now, we're just going to grab the folder ID from the address bar here in ACC. Just one tiny thing: the ID is URL-encoded, so we need to URL-decode that string. For now, we just use an external website to turn the original string into its URL-decoded form. So here at the bottom, we see the actual folder ID that we can then use for the new webhook we are creating.
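As an aside, the same URL-decoding can be done locally in one line instead of using an external website; the encoded folder URN below is a hypothetical example.

```python
# URL-decode a folder ID copied from the ACC address bar.
from urllib.parse import unquote

encoded = "urn%3Aadsk.wipprod%3Afs.folder%3Aco.abc123"  # hypothetical example
print(unquote(encoded))  # urn:adsk.wipprod:fs.folder:co.abc123
```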
So with this, we can go back to our Visual Studio Code extension, add the folder ID, and create our webhook. And from now on, whenever a new file is added to this folder, this webhook will make sure to call the URL that we provided. So let's give it a try.
We can save our flow. Switch over to Autodesk Construction Cloud. And upload a new file to our Power Automate folder.
OK, now our Revit file is being uploaded to the Power Automate folder in ACC, and it's being processed for viewing. We can go back to Power Automate and check out the history of flows. We see that one of them has just completed successfully. And if I check my email inbox, I see that there is an email saying "file has been added," sent by the Power Automate flow.
Now, let's say we want to improve this flow a little bit and be more specific when sending that email: we want to let the receiver know the name of the file that was uploaded.
One way to do that is to take a look at this existing record of the flow that just completed, and look at the JSON body that was sent to us by the webhook from Autodesk Platform Services. We can capture the full JSON body, which includes all sorts of information about what triggered this event: the name of the project where this event happened, the ID of the folder, the name of the file, and the creator, the ID of the user that uploaded the file, all sorts of useful pieces of information.
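To give a concrete picture, a simplified, hypothetical example of that webhook body is shown below. Only fields mentioned in this demo are included; the real payload contains more, and the exact field names may differ.

```python
# A simplified, hypothetical example of the kind of JSON body the
# dm.version.added webhook sends. Pasting a sample like this into the HTTP
# trigger lets Power Automate generate a schema for dynamic content.
sample_payload = {
    "hook": {"event": "dm.version.added"},
    "payload": {
        "name": "MyDesign.rvt",        # name of the uploaded file
        "project": "some-project-id",  # project where the event happened
        "folder": "urn:adsk.wipprod:fs.folder:co.abc123",
        "creator": "some-user-id",     # ID of the user who uploaded the file
    },
}
```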
And what we can do is take a copy of this entire JSON, go back to the Edit mode, edit our flow, go back to the HTTP trigger, and provide this JSON example to Power Automate. Simply by providing this example JSON, Power Automate will now try to parse any additional webhook calls coming through our flow, so that later, in our actions in the visual pipeline, we can access and cherry-pick specific information, such as the name of the file. So let's try and do that now.
So in our "Send an email" action, we can modify the subject to include the actual name of the file that the webhook is informing us about. Let's see. And we can do the same in the body of the email. So we can, again, use this lightning icon to include a dynamic variable, which will be cherry-picked from the webhook payload that actually triggered this flow. So let's save this flow.
And let's try uploading one more file to our project. There we go. So another Revit file has been added to our folder here in ACC and is being processed. And I'm going to switch back to Power Automate. We see that another flow has just completed successfully. And when we check our email, we see that there is another email coming to my inbox with the actual name of the file.
All right, now, let's try and take this workflow even further. Let's say that we want to store and log this information about uploaded files somewhere and provide some sort of report on this activity. What we're going to do is head over to Power BI and create something called a streaming dataset.
A streaming dataset is basically a dynamic database that we can push data into at runtime. And later, we can process and display this information in a Power BI report. So we start by specifying the type of data we expect in this streaming dataset: the name of a file, the date of the upload, and the ID of the user who uploaded the file.
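For reference, outside of Power Automate, rows can be pushed into such a dataset with a plain HTTP call. This is a sketch assuming the push URL that Power BI generates for an "API" streaming dataset; the URL and field names below are placeholders matching this demo's schema.

```python
# Sketch: push one row into a Power BI streaming dataset via its push URL.
import requests

PUSH_URL = "https://api.powerbi.com/beta/<tenant>/datasets/<dataset-id>/rows?key=<key>"  # placeholder

rows = [{
    "FileName": "MyDesign.rvt",            # hypothetical field names matching
    "UploadDate": "2024-10-15T12:00:00Z",  # the dataset schema from the demo
    "UserId": "some-user-id",
}]
resp = requests.post(PUSH_URL, json=rows)  # streaming datasets accept a JSON array of rows
resp.raise_for_status()
```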
OK, so we have our streaming dataset ready. It is empty for now, but that will change soon.
Now what we can do is create a simple Power BI report based on this dataset. And you can see that the three fields we specified earlier will be available to us when building this part of our report: the ID of the user who uploaded the file, the date of the upload, and the name of the uploaded design.
And in our report, we're adding a simple table and a couple of charts. We will have a bar chart indicating the number of files uploaded on different dates, and we also add a donut chart representing the number of files uploaded by individual users.
All right. Our Power BI report is ready. Now we just want to populate our streaming dataset with some data. So let's save our report and head back to Power Automate.
Now, we will modify our flow. We will add a parallel action next to the email-sending action. We search for another built-in block called "Add rows to a dataset." And again, we already know how to cherry-pick the information from the webhook that triggered this flow. So we can select our specific streaming dataset in Power BI, and we can then specify what data we want to send as an additional record in this dataset.
So we know we need to specify the name of the file, the date, and the user ID, right? And again, we're using the lightning icon to cherry-pick the name of the file, the upload date, and the user from the JSON payload of the webhook call that started this flow.
In the dynamic content, we search for "creator," which is the user ID of the person that uploaded the new file to ACC. OK? Our flow has been updated. We can go back to ACC and upload yet another file, yet another design.
Let's try this architectural drawing. OK. And after the file has been uploaded, we can refresh our Power BI report. We can now see one record indicating that somebody uploaded a DWG file to our ACC project.
Now, let's try one more file. We'll pick another drawing and upload it to the same folder in ACC. And once again, after refreshing our Power BI report, we can see that there are now two records of files being uploaded to our ACC project.
All right. The second option in building your custom connections for Power Automate, the higher layer on top of the option we just discussed, is building your own custom connector. This is quite easy to do using something called OpenAPI, which I'll explain in just a second.
So basically, custom connectors for Power Automate are, again, collections of actions and triggers. You can specify one or more triggers that can then be used by people building their own Power Automate flows to start the flow in reaction to certain events. Again, that could be events such as a new file being added to my ACC project, as we just saw.
But our platform, Autodesk Platform Services, provides other types of events that you can react to as well. For example, whenever the processing of a design completes (extracting information, metadata, 3D views, 2D drawings), that's another type of event that you could react to using a custom trigger. And actions are, again, building blocks that the users of your custom connector can use to execute certain operations. That could be things like creating a new file or creating a new issue inside an ACC project, for example.
Now, one caveat of developing custom connectors is that in Power Automate these currently support only three-legged authentication. This means that you will be able to perform operations that require a user context (a sort of user impersonation in your access token), but you won't be able to use APIs in our platform that use so-called two-legged OAuth, such as creating new buckets in the Data Management service or uploading files to a bucket. You will only have access to actions that can be executed by a specific user of the system.
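For context, here is a sketch of where that three-legged flow begins: the user is sent to the APS authorize endpoint and, after consenting, is redirected back to a callback URL with an authorization code. The v2 endpoint, the scopes, and the redirect URI below are assumptions for illustration.

```python
# Sketch: construct the three-legged (authorization code) sign-in URL that a
# tool like Power Automate would open for the user.
from urllib.parse import urlencode

params = {
    "response_type": "code",
    "client_id": "<your-client-id>",                                   # placeholder
    "redirect_uri": "https://global.consent.azure-apim.net/redirect",  # hypothetical callback
    "scope": "data:read data:write",
}
authorize_url = (
    "https://developer.api.autodesk.com/authentication/v2/authorize?" + urlencode(params)
)
print(authorize_url)  # the user signs in and consents at this URL
```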
All right. Now, I mentioned OpenAPI specs. What do those mean? When you are developing your custom connector, you have a couple of different options. You can start from scratch and create the actions and triggers manually in a form-based way, as you can see in the screenshot. So you can add a new action, specify what URL endpoint should be called when this action is executed, and specify its parameters.
And similarly for triggers: you can say, OK, there is a trigger, let's say on "file added to a folder," and you can manually, using the form interface here, specify what kind of webhook should be configured when a user of your custom connector adds this trigger to their flow. So if you create a trigger for when a design file is uploaded, and your user adds this trigger to their flow, your configuration here will make sure that, under the hood, Power Automate creates a new webhook in Autodesk Platform Services that will call a certain URL when a new file is uploaded and this event is detected within our system.
So you can either build the custom connector from scratch, or you can use an OpenAPI spec. OpenAPI is really a standard for describing REST API interfaces and web services. It is typically a JSON or YAML file where you describe: these are my endpoints that can be called, these are the payloads that you can send with these requests, these are the expected responses from these calls, these are the supported HTTP methods (GET, PUT, POST), and these are my webhooks as well.
And if you do have your services defined using OpenAPI specs, you can actually import the specification file into Power Automate. And Power Automate will prepare and pre-generate actions and triggers for you based on this spec. And then you can always come back here and tweak individual actions or triggers if needed. But it's a really nice, convenient way of generating most of the logic for your custom connector automatically from a standard API specification that you may already have available at your disposal.
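To make this concrete, a minimal, hypothetical fragment of such a spec might look like the following, shown here as the Python dict the YAML file would parse to. It declares a single CreateIssue action against the ACC Issues endpoint; the actual spec used in the demo describes more actions, triggers, and security settings.

```python
# A minimal, hypothetical OpenAPI fragment for a custom connector.
spec = {
    "swagger": "2.0",  # Power Automate custom connectors commonly expect OpenAPI 2.0
    "info": {"title": "MyApsTest", "version": "1.0.0"},
    "host": "developer.api.autodesk.com",
    "paths": {
        "/construction/issues/v1/projects/{projectId}/issues": {
            "post": {
                "operationId": "CreateIssue",
                "summary": "Create a new issue in an ACC project",
                "parameters": [{
                    "name": "projectId", "in": "path",
                    "required": True, "type": "string",
                }],
                "responses": {"201": {"description": "Issue created"}},
            }
        }
    },
}
```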
All right. Let's take a look at our final demo. So in this case, in Power Automate, we create a new custom connector. In the top right corner, we select New custom connector, and in this case we choose to import from an OpenAPI URL.
So instead of creating the connector from scratch, we point it to a GitHub repo that is publicly available to all of you now. We point Power Automate to a YAML file that describes a subset of our APIs in Autodesk Platform Services, specifically webhooks and ACC issues, which can be used to read or write issues managed in Autodesk Construction Cloud.
By doing this, we've imported our YAML specification. And now, we just need to specify some additional details for authentication. Whenever a user uses our custom connector to start building their flows, they will need to log in with their Autodesk credentials. And the token that this operation generates will then be used by Power Automate when calling the individual actions.
So we specify our application's client ID and client secret; I'm grabbing these from the APS developer portal. The client credentials are sort of like your application's username and password, so this custom connector in Power Automate will then communicate with Autodesk Platform Services on behalf of this specific application and on behalf of the user who has logged into Power Automate.
All right, we create the connector. And we can jump into the third section, which is the Definition part. This is where you will see the pre-generated actions and triggers created automatically from the OpenAPI specification. It's taking a while, so let me just skip ahead.
Now, when we create the connector, one more thing we need to do here for the three-legged OAuth workflow is specify a callback URL. When somebody uses our connector in Power Automate and wants to log in, they will be redirected to the Autodesk website, where they can log in and consent to the application having access to their data. And after clicking the Consent button, they will need to be redirected back somewhere. That somewhere is the URL that Power Automate creates for us right now, after saving the connector.
So we can scroll all the way down, and we see our redirect URL. This is the URL we will add to our developer portal so that our Autodesk Platform Services application is actually allowed to use this redirection. There we go. That is the authentication and security part of our connector.
And now, we can see the definition. Here, we see the actions and triggers that have been automatically created from the OpenAPI spec. In this case, we have two actions, one called GetIssueTypes and another called CreateIssue.
What the CreateIssue action will do under the hood is send a REST call to our platform, Autodesk Platform Services, to create a new issue in your project if you want to. And we also have one trigger autogenerated for us, called OnVersionAdded. This trigger can then be used by users in their custom flows to trigger the flow whenever a new file has been added to an ACC project.
All right. We have our trigger. We can close the connector. If we reload this page, our custom connector should appear here shortly.
All right. We see our MyApsTest connector is available. So let's go back to flows and create a new flow one more time. And this time, instead of using the built-in components, we will use our MyApsTest actions and triggers.
So you can see, as soon as I click Add a trigger, I have a couple of options. But now, I'm filtering for the MyApsTest connector, and I see there is a building block called "When a new version of a design is uploaded." This trigger is coming from my custom connector. OK, I'll click on it, and this will be the starting point of my flow.
Now, since this is a custom connector, I need to log in. You can see that Power Automate actually asks me to log in with my Autodesk credentials here. Excuse me, I accidentally jumped out.
All right, so we sign in. This is the three-legged authentication workflow, where we're logging in with our OAuth credentials. And after being redirected back here to Power Automate, Power Automate now has the access token that it needs to actually call the different actions.
Now, one more time, we need the folder where we want to listen for new files being uploaded. Once again, we use an external website to URL-decode the folder ID, and we save this folder ID in our trigger in Power Automate. So whenever a file is uploaded to this folder, this flow will be automatically triggered.
As the next step, we initialize a new variable called filename. And inside this variable, we store the actual name of the uploaded file, which we once again cherry-pick from the JSON payload of the webhook that started this flow.
As the next action, we will add a condition. Power Automate, just like Workato and other low-code/no-code systems, has control-flow statements: while loops, if conditions, switches, and things like that. So we're adding a condition saying: if the file name starts with the string "final-".
Now, if that condition evaluates to false, we will want to react in a certain way. Let's say we want to check that all the files uploaded to this folder always start with "final-", and if they don't, we want to react accordingly: here, what we'll do is create a new issue.
Now, let's say we want to create an issue. So we're using another custom block from our connector, in this case Create new issue. And after adding the Create new issue block to our flow, on the left side we see that there is a bunch of parameters that we need to specify for this action. First of all, the important one is the project ID: what is the ID of the project where we want to create this new issue?
We cherry-pick the project ID from the webhook as well because, as I mentioned earlier, the JSON payload of the webhook contains lots of important pieces of information, including the ID of the user who uploaded the file, the project where the file was uploaded, the file name, and things like that. So here, we're piggybacking off of that: we grab the project ID from the webhook payload and use it to specify the project where we want the new issue to be created.
All right. Then we can expand the advanced parameters. There are a couple more options you can specify. In this case, we will want to define the title of the issue; we'll just say "incorrect file name."
We also want to specify the description of the issue. So we say that the dynamic variable filename does not start with "final-", which is what we expect. We can also specify the subtype of the issue; and again, there are different ways to choose this.
For now, what we do is head back to ACC, get the list of different types of issues, and grab the ID of whatever issue type we think is most appropriate for this kind of problem that we want to report back to the ACC project. So I'm going to select this general issue, and I can steal the issue subtype ID from the address bar one more time and specify it here. And let's say we also want to set the status of the issue to open.
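Under the hood, this branch of the flow boils down to a conditional REST call. Here is a minimal sketch, assuming the ACC Issues API endpoint and a minimal request body; the token, project ID, and subtype ID are hypothetical placeholders.

```python
# Sketch: if the uploaded file name lacks the "final-" prefix, create an
# issue in the ACC project via the Issues API.
import requests

token = "..."        # three-legged token for the signed-in user (placeholder)
project_id = "..."   # cherry-picked from the webhook payload (placeholder)
filename = "rst_basic_sample_project.rvt"

if not filename.startswith("final-"):
    resp = requests.post(
        f"https://developer.api.autodesk.com/construction/issues/v1/projects/{project_id}/issues",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "title": "Incorrect file name",
            "description": f"{filename} does not start with final-",
            "issueSubtypeId": "...",  # the subtype ID grabbed from ACC (placeholder)
            "status": "open",
        },
    )
    resp.raise_for_status()
```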
All right. And that could be another example of a Power Automate flow. So let's save it and test it out. We go back to our overview page in Power Automate. We can head back to ACC and upload a new file to our Power Automate folder. And this time, we will intentionally not prefix it with "final-", just to trigger the error. So let's say we upload an image. I'm not sure why I chose this one specifically. That's a nice one.
So let's say we go back to typical common content, instead of pictures from vacation, and we upload a Revit design. But it's not prefixed with "final-". So let's see what happens.
Our file is uploaded. And if we take a look at the existing flows, we see that one flow has just completed successfully, and that we actually hit the false branch of our condition because this file was not prefixed with "final-". So now, we can go back to ACC and look at the list of issues. And, voila, we have a new issue automatically created, informing us that the newly uploaded file, rstbasicsampleproject.rvt, is not prefixed with the required "final-" prefix.
And that is a wrap for our final demo today. Before I hand it over to Jaime, I just want to quickly mention that there are other options as well. Today, we focused on Postman Flows and Power Automate because these are very popular among our customers. But there's more.
For example, if you are already familiar with OutSystems, which is yet another low-code platform, or with Node-RED, the low-code/no-code visual programming environment that Jaime also presented, and either of these is more interesting for your use cases, we have blog posts on these topics as well. So definitely check those out. You can learn more about how to build your custom components for OutSystems, for example, so that your customers, users, or colleagues can start building flows in OutSystems connecting to Autodesk Platform Services and to their design data.
And with that, that is a wrap. And over to you, Jaime.
JAIME ROSALES: All right. Well, thank you, Petr. I want to do a quick highlight of some upcoming events where you might be able to join us and learn more about how to use the platform. Coming up in November, from the 18th to the 21st, we're going to have a free online training covering the basics of the platform. This includes a couple of bootcamps going through how to start your first APS application using the viewer; how to build your first application extracting data from ACC hubs, projects, folders, items, and versions; some advanced uses of the viewer with extensions and dashboards; and also a design automation bootcamp that lets you run automated tasks on different engines like Revit, AutoCAD, Inventor, and so on.
And the other thing is that we also host accelerators throughout the year. We have an upcoming one in the US, in our office in Atlanta, from December 9 to 13. We still have a few spots available.
So definitely come join us for the week. Bring an idea in mind to work on a prototype; we can help you build it. It's a good opportunity for you to interact with the developer advocates and learn more about the platform.
So with that, I want to say thank you. Thank you to everyone listening online. Thank you, Petr, for the amazing presentation. Those workflows with Power Automate are definitely quite useful, and I can see lots of possibilities for what you can build with this. And with that, we're going to give it a wrap. Thanks, and see you again.