Description
Key Learnings
- Learn how to build a live connection between Alias and VRED.
- Learn how ShotGrid can connect creative and modeling workflows.
- Discover the customization capabilities of Alias, VRED, and ShotGrid.
- Learn about the art of the possible with virtual 3D modeling workflows.
Speaker
LIONEL GRAF: Hi, my name is Lionel Graf. I'm an Implementation Consultant at Autodesk. My role at Autodesk Consulting is to support our customers in the adoption and customization of our technologies.
A quick Safe Harbor statement first-- what I will share with you today is an example of how Autodesk Consulting can help you push the boundaries of Autodesk products and extend functionalities beyond what is available out of the box. It does not reflect any plan for future development. So let's review what we will be talking about today.
We will first have a look at the customization capabilities of VRED and how we can take advantage of the VRED Python API to tailor VR experiences. The prototype I will be showing you today relies on ShotGrid to streamline collaboration between designers and modelers, so we will explore how we can use ShotGrid to connect creative teams. And finally, I will share with you an attempt to push the boundaries and see how we can use Dynamo to build a live connection between Alias and VRED.
Let's talk first about customizing the user experience. Both VRED and ShotGrid offer a Python API, which can be leveraged to design customized workflows. The first obvious use case is to save time on repetitive tasks with simple automation. But we can also use the API to build specific tools, which can even appear as additional tools in the product UI and become part of your own distribution of the product.
This is exactly what we did for this prototype. Python is an interpreted programming language that is very popular for plugin development. A lot of other DCC tools offer a Python interface, and that's not just a coincidence.
Python is said to be a very expressive language and easy to learn. Compared to programming languages like C++, C#, or Java, Python does not have to be compiled. So you can just write some code in VRED, hit Run, and it immediately executes. This is very convenient for customization and plugin development.
VRED uses its own integrated Python 3 interpreter. This way, you can directly access the VRED API from VRED's Python environment. In the documentation, you can find tutorials and references to VRED API methods, as well as examples, which help a lot in understanding how to get started with the basic concepts.
VRED offers many ways to script with Python. There are multiple interfaces where you can place Python code to affect the behavior of your VRED scene or VRED in general. But not every interface is suited to the same task.
The most basic way to execute Python commands in VRED is the terminal. The terminal acts both as input for Python commands and as output for print messages generated by VRED, other scripts, or your own code. When you open the terminal after starting VRED, you will see a bunch of messages, which are basically initialization messages from VRED.
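As a minimal sketch of what that interaction looks like, here are a few lines as you might type them into the terminal. The terminal behaves like a regular Python prompt, so plain Python works as-is; nothing VRED-specific is assumed here.

```python
# Lines you might type into the VRED terminal. Print output and error
# tracebacks from any running script appear right here in the terminal.

message = "hello from the terminal"
print(message)          # the output appears directly below the prompt

try:
    1 / 0
except ZeroDivisionError as exc:
    # an uncaught exception would be printed to the terminal as a
    # traceback; here we catch it just to show the error text
    print("error:", exc)
```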
Every print message you or any other script generates shows up in the terminal. So it is the only place where you can directly see whether there is an error while running your script, or look at your log messages.
Then, we have the Script Editor. It is comparable to a Python file that is directly attached to the VRED file. Here, you can define globally available variables, functions, and also classes. Any Python code you write here is also available in the variant sets, the terminal, or the web menu. So it is globally available.
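As a sketch of that global scope, the helper below could live in the Script Editor and then be called by name from a variant set or the terminal. It assumes VRED's built-in `findNode`, `showNode`, and `hideNode` functions (from the documented v1 Python API), which only exist inside VRED; the helper's name is mine, not part of any product.

```python
# Script Editor helper sketch. Assumption: findNode, showNode and
# hideNode are provided by VRED's Python environment; they are only
# referenced inside the function body, so this file also loads cleanly
# outside VRED (the calls would fail there, of course).

def set_part_visible(node_name, visible):
    """Show or hide a scenegraph node by name."""
    node = findNode(node_name)   # VRED built-in, resolved at call time
    if visible:
        showNode(node)           # VRED built-in
    else:
        hideNode(node)           # VRED built-in

# Because Script Editor code is globally scoped, a variant set script
# can simply call:
#   set_part_visible("Wheel_FL", False)
```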
Talking about variant sets, they can also be used to trigger scripts. Every variant set has a script tab, where you can write Python directly. Here, we will mostly use scripts that do one particular thing, like calling multiple functions at the same time, triggering animations together with some additional logic, or making scene-specific actions.
The Python script you write in the variant set is locally scoped. That means that variables, functions, and classes are only available in this particular variant set. So it is suited to small local scripts. If you need a Python script that contains hundreds of lines of code, you may want to consider using a plugin, which gets loaded at VRED startup.
It is particularly suitable if you want to offer specialized tools that should be available to all of your colleagues. Script plugins are a special kind of Python script. They are independent from VRED and must be installed in a special plugin directory in order to be used. You can access script plugins by opening the Scripts entry in the menu bar in VRED, where every installed plugin is listed. Script plugins usually provide a user interface that is integrated into VRED and feels like part of VRED itself. That's because PySide widgets use the VRED window styles by default.
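A minimal plugin file might look like the sketch below. Assumptions: plugin `.py` files dropped into VRED's script plugins directory are executed at startup, and PySide2 ships with VRED (the guarded import lets the sketch load outside VRED too). The widget and its name are illustrative only, not the actual taping tool.

```python
# Sketch of a script plugin file (placed in VRED's script plugins
# directory). The try/except lets this file import outside VRED, where
# PySide2 may be missing.

try:
    from PySide2 import QtWidgets
except ImportError:
    QtWidgets = None   # not running inside VRED (or PySide2 missing)

if QtWidgets is not None:
    class MyToolWidget(QtWidgets.QWidget):
        """A plugin window; PySide widgets pick up VRED's window style."""
        def __init__(self, parent=None):
            super().__init__(parent)
            self.setWindowTitle("My Tool")
            layout = QtWidgets.QVBoxLayout(self)
            layout.addWidget(QtWidgets.QPushButton("Run", self))
```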
Our tool targets the VR experience. What we wanted was to implement a new VR functionality and access it while being in VR. VRED offers a customizable VR interface to interact with the scene: the VR menu.
The first level of customization offered for the VR menu is the set of tools available through the script menu. There, you can choose which tools you want displayed in the VR menu. This is an easy way to ensure that the VR interface is easy to understand for occasional VR users, for example by displaying only the necessary tools. And using VRED's Python API, we can also customize the VR menu and tailor the user experience.
As the VR menu is based on Qt, we can even build complex UIs with Qt widgets. For this prototype, we developed a brand new tool for VR. The so-called taping tool proposes a multi-page user interface, where one can access drawing tools and the layer system, the tape sets, to organize drawn curves.
Let's now have a deeper look into the VR taping tool. The tool took inspiration from the automotive clay modeling process. When designing a car, the clay model is a unique way to understand the volume of what is being designed. Having the ability to experience the object at scale and under natural light will probably never be completely replaced by computer-aided tools, because it is a very unique way to collaborate on the design, making sure everyone has a common understanding of the shape and the volume.
However, for various reasons, clay models are sometimes rare in the early design process, and decisions tend to be made digitally. In such cases, the communication between designers and digital modelers only happens through sketches, images, and movies, leaving room for interpretation, which can sometimes lead to more iterations to reach the design intent.
In recent years, VR reached a level of maturity, both from a hardware and software integration point of view, to the point where it is easy and reliable enough to consider it part of the digital process. But we are still missing something. Picking up a roll of tape and sticking it on the clay model to understand a section, emphasize a character line, or define cutting lines is still not possible in VR. So what if we could replicate this process in the digital world?
Let's see what it looks like. To streamline collaboration, we will be using ShotGrid as the backbone of this process. Everything starts with putting together some references. For example, we will bring into Alias the wheelbase, the driver position, and some reference points that will guide the creation. An initial Alias file is published to ShotGrid to base our design on.
As a designer, I can start my creative work in VRED. Thanks to ShotGrid, I easily find the base geometry I will work with, load it into my new scene, and start a VR taping session from there. I can then start ideating and draw some reference curves in the air, using simple tools like free drawing. Or, if I want to be more precise, I can draw curves by placing the control points. Here, I'm drawing some character lines, which I would like the modeler to use as guides during the future 3D modeling process.
At any time, I can modify the control points and adjust my 3D curves in VR. I can also iterate on my work using tape sets. This is a sort of layer system, where curves are grouped together, allowing me to manage visibility and color to make alternatives, for example. Once I'm happy with the results, I can publish my work back to ShotGrid and hand over my design guidelines to a digital modeler.
Now, as a modeler, I can load the designer's input and start working on the 3D geometry from there. I obviously accelerated the modeling process here. But using the designer's tape as guidelines, I can build the 3D shape with confidence. And using the same publish-and-load mechanism, the designer and the modeler can iterate on the design while keeping track of the model's evolution.
Here, for example, I'm placing 3D markups on top of the 3D model, which I will communicate to the modeler to guide future improvements of the design. And as we are using 3D curves as a common language, these can be brought into Alias to guide the modification. From there, more iterations can happen.
So let's look into the tool's features. We wanted to keep it simple and focus on three main tools-- two drawing modes and a tool to modify existing curves.
The first drawing mode is the Freeform Sketch tool. It lets you draw in the air in a continuous stroke. To ensure a nice result, the curve is evaluated and simplified on the fly, giving a nice degree-5 NURBS curve. Drawing in three-dimensional space is pretty difficult and often gives weird results, so simplifying the curve removes the noise and prevents us from getting a shaky curve.
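The talk does not name the simplification algorithm the tool actually uses, so as an illustrative stand-in here is the classic Ramer-Douglas-Peucker scheme: drop every sample that stays within a tolerance of the chord spanning a stroke segment, keeping only the points that shape the curve.

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker simplification of a 3D polyline.
    points: list of (x, y, z) tuples; epsilon: noise tolerance.
    (Illustrative stand-in, not the taping tool's actual algorithm.)"""
    if len(points) < 3:
        return list(points)
    start, end = points[0], points[-1]
    chord = [e - s for s, e in zip(start, end)]
    chord_len = math.sqrt(sum(c * c for c in chord)) or 1e-12  # guard degenerate chord

    def dist(p):
        # perpendicular distance from p to the start-end line (cross product)
        v = [pi - si for si, pi in zip(start, p)]
        cx = chord[1] * v[2] - chord[2] * v[1]
        cy = chord[2] * v[0] - chord[0] * v[2]
        cz = chord[0] * v[1] - chord[1] * v[0]
        return math.sqrt(cx * cx + cy * cy + cz * cz) / chord_len

    # find the in-between sample farthest from the chord
    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [start, end]          # everything in between is noise
    # keep the farthest point and simplify each half recursively
    return rdp(points[:idx + 1], epsilon)[:-1] + rdp(points[idx:], epsilon)
```

A near-straight shaky stroke collapses to its two endpoints, while a genuine corner survives, which is exactly the "remove the noise, keep the intent" behavior described above.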
The second drawing mode is the Draw by Point tool. This one places a point on each click of the controller trigger and joins them with a straight line. In fact, we are placing the NURBS curve control points here. When we press the controller trigger, two control points are placed next to each other, which gives this nice change of direction with a small radius. Pressing and holding the trigger lets us move the second control point apart to make a smoother curve.
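The paired-control-point trick above can be sketched in a few lines of plain Python (the function name and data shapes are mine, for illustration only): each trigger press appends two control points, coincident for a tight corner, or separated when the trigger is held and dragged.

```python
# Sketch of the Draw by Point control-point logic (hypothetical helper,
# not the tool's actual code). Points are (x, y, z) tuples.

def add_click(control_points, click_pos, drag_pos=None):
    """Append the pair of CVs produced by one trigger press."""
    control_points.append(click_pos)              # first CV at the click
    # second CV: dragged apart while holding (smoother corner), or
    # coincident with the first (tight small-radius corner)
    control_points.append(drag_pos if drag_pos else click_pos)
    return control_points

cvs = []
add_click(cvs, (0.0, 0.0, 0.0))                               # quick click
add_click(cvs, (1.0, 0.0, 0.0), drag_pos=(1.2, 0.1, 0.0))     # press and drag
```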
The last drawing tool is the Curve Modifier. When active, this tool lets you select the curve to modify by touching it with the selection sphere and displays the curve control points. Once visible, the control points can be picked the same way and dragged around to modify the curve shape.
For more control over the sketching results, we also implemented a tape set concept. It is presented as a view tab in the VR menu and displays the tape sets in a similar way to layers. Each tape set is a collection of curves, for which we can control the visibility and the color. A tape set can also be duplicated or deleted. For future improvements, we also thought about making it possible to push tape sets directly to ShotGrid for faster collaboration.
Talking about ShotGrid, let's see how it can help streamline the collaboration on taping sessions. In the simple example here, assuming we start with some Alias reference geometry, a modeler assigned to a modeling task publishes the initial package. Our designer, assigned to a taping task, creates a new file for his session and pulls or loads the Alias reference.
When the work is done, the designer publishes his work on his task and notifies the modeler that there is a new update. The modeler can then pull the latest updates, apply changes, and publish the results back to ShotGrid, continuing the iteration cycle. This makes iterations much easier and faster. So using ShotGrid in such an environment streamlines data exchanges and improves collaboration.
But how? Well, first, ShotGrid offers desktop tool integrations. It means that from Alias or VRED, in our case, you can directly access the data that has been published by the team without needing to know where the actual file has been saved or having to ask for it. All you need to do is use the ShotGrid Loader and pull the latest published file you need into your scene.
Also, with the Scene Breakdown, you can ask ShotGrid whether there is a new version of a file referenced in your scene and choose to update it or go back to an earlier version of the file. With the ShotGrid Panel, you can see the project activity or comment on what has been shared with you, so you are always connected with your project collaborators. And using ShotGrid to manage your project pipeline also ensures that you keep track of data evolution and updates. Each time your work is published, a new version gets created, and one can easily navigate through the data history with the tools mentioned before.
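Under the hood, "pull the latest published file" boils down to a query like the sketch below, using `shotgun_api3`, the official ShotGrid Python API. The entity and field names follow ShotGrid's standard schema; the site URL, script name, API key, and asset id are placeholders.

```python
# Sketch of a "latest publish" lookup with the ShotGrid Python API.
# The query construction is pure data, so it is separated out; the
# actual network call is shown below it with placeholder credentials.

def latest_publish_query(asset_id):
    """Build find_one() arguments for the newest PublishedFile on an asset."""
    filters = [["entity", "is", {"type": "Asset", "id": asset_id}]]
    fields = ["path", "version_number", "published_file_type"]
    order = [{"field_name": "version_number", "direction": "desc"}]
    return filters, fields, order

if __name__ == "__main__":
    import shotgun_api3  # pip install shotgun-api3
    sg = shotgun_api3.Shotgun("https://yourstudio.shotgrid.autodesk.com",
                              script_name="taping_tool",   # placeholder
                              api_key="<script key>")      # placeholder
    filters, fields, order = latest_publish_query(1234)
    print(sg.find_one("PublishedFile", filters, fields, order=order))
```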
The good thing about using ShotGrid as the backbone of this process is that everything is connected together-- or should I say everyone. The data published to ShotGrid is controlled and automatically versioned every time something new gets published. This ensures that you can keep track of the data and always have the ability to go back in time and take a new creative direction if you want. People are connected, too. You can exchange comments through notes, which are associated with the asset you are working on.
You can notify teammates that there is a new update or share comments on what has been delivered. Even more important, as someone who needs to oversee the work being done, you can view the activity on a piece of work and understand what is going on. And one more thing-- using the ShotGrid Pipeline Toolkit, we can design custom workflows and extend the functionalities of the content creation tools.
In our case, for example, we added a publish plugin to automate the extraction of the tape curves generated in VRED, so the modeler can bring them directly into Alias. Or we can add menu items to the ShotGrid context menu to launch a taping session directly from the File Open application and thus set the ShotGrid context automatically in VRED, with all the needed scripts and custom tools.
There is one final thing I wanted to share with you. What if we could make it even more integrated and connect it live with Alias? We pushed the prototype to a point where VR taping is reflected almost in real time in Alias. To do so, we leveraged Dynamo. We used the web request node in Dynamo to talk to VRED and query the tape curves' control points. The returned list of points is manipulated in Dynamo and used to rebuild the NURBS curves.
Then, we push the curve geometry back to Alias. Doing that periodically, every half a second, the tape curves are pushed and updated in Alias almost in real time. For now, this only works if the Dynamo tool is active in Alias, so we can't really model at the same time. But we think this is an interesting step toward collaborative modeling.
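The receiving end of that polling loop can be sketched as follows. The payload shape is my assumption, not something the talk specifies: the VRED side is assumed to answer the web request with JSON like `{"curves": [{"name": "tape_1", "points": [x, y, z, x, y, z, ...]}]}`. The parsing step is pure Python; the poll itself is shown only as a comment, since the endpoint and the Alias-side rebuild function are hypothetical.

```python
import json

def parse_tape_curves(payload):
    """Turn the assumed JSON payload into {curve_name: [(x, y, z), ...]},
    the per-curve control points a NURBS rebuild step would consume."""
    data = json.loads(payload)
    curves = {}
    for curve in data["curves"]:
        flat = curve["points"]  # flat [x, y, z, x, y, z, ...] list
        curves[curve["name"]] = [tuple(flat[i:i + 3])
                                 for i in range(0, len(flat), 3)]
    return curves

# The ~0.5 s poll could look like this (VRED_ENDPOINT and
# rebuild_in_alias are hypothetical placeholders):
#   import time, urllib.request
#   while True:
#       body = urllib.request.urlopen(VRED_ENDPOINT).read()
#       rebuild_in_alias(parse_tape_curves(body))
#       time.sleep(0.5)
```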
So what has been shared with you today is an example of how Autodesk Consulting can support you in extending the usage of Autodesk solutions by building custom tools that fit your own particular workflow. If you would like to experience this tool yourself and help us continue developing this prototype, please contact us. Thanks for watching this video. Have a very nice day.