Description
Key Learnings
- Get an overview of extended reality technologies
- Discover which VR and AR features can be used
- Discover how they can be used in the automotive development process
- Learn workflows that will save time by increasing efficiency
Speaker
- Tillmann Dorsch: I am an experienced 3D Artist and Consultant with an MBA in International Management, who joined Autodesk as a Product Owner for Visualization this year.
TILLMANN DORSCH: Hello, and welcome to my presentation about XR-based decision making in automotive. My name is Tillmann Dorsch, and I'm a product owner for visualization in the Autodesk Viewer team. I have a background in 3D visualization, and over the past 10 years I have worked as a 3D artist, visualization expert, and consultant, mostly in automotive, to create visuals, secure virtual processes, and manage car configurators.
First I will give you a brief description of what XR is and explain the different terms: virtual, augmented, and mixed reality. Later I will explain the benefits of XR in the automotive industry, both the general benefits of 3D visualization and the benefits of extended reality compared with conventional 3D visualization.
I will cover different use cases of XR in automotive and will explain how evaluation in VR and MR can be done. I will give an example of virtual collaboration and explain the benefits of cloud-based workflows.
So what is XR? XR is an umbrella term for virtual, augmented, and mixed reality. The X can be considered a variable representing any current or future spatial computing technology. VR is fully immersive. It's all virtual, and there's no connection to the real world. An example can be an application where you wear an HMD-- a head-mounted display-- or sit on a chair while experiencing the virtual reality application.
In addition to industrial solutions, games are also a very popular example of how VR is used. Augmented reality is an overlay on the real world. It can be a smartphone display where a 3D image pops out in front of a two-dimensional car catalog. The most famous example of an AR application is probably Pokemon Go, as you may all know-- a game where you collect virtual Pokemon in real-life surroundings.
Another example of augmented reality is head-up displays that are used for navigation in cars or in the cockpits of planes, or are integrated into a helmet. Another very popular example is augmented reality filters, as you can see on my video now. So there's this funny VR headset-- goggles you can just play around with if you have time. It's fun.
Mixed reality is the combination of the real and the virtual world. It can be a seating buck where you see a virtual car but feel the steering wheel. It can also be your own hands visible in VR in front of your HMD, as some mixed reality headsets offer, or simply a virtual car next to a real car seen through a tablet.
The general benefits of 3D visualization are cost reduction, increased speed, and more flexibility. In a virtual design validation process, you can reduce the number of clay and cubing models by using virtual prototypes instead. Clay models are required to secure the design process but are very expensive.
Cubing models are necessary to secure the engineering process but are even more expensive. Generally, you need fewer prototypes and less hardware if you rely on 3D visualization. There is less need for real prototypes if you visualize your design and engineering with VRED. It's a time saver: you don't have to wait for the hardware to get built. You can decide based on your 3D visualization.
Design, engineer, and visualize: the data is already created with our 3D software anyway, so just leverage it. Visualize all possible product variants in one single file, as well as different proposed design and engineering solutions during the development process.
You can simulate the front and rear lights of the exterior, the ambient lights in the interior, and complex light animations, such as welcome scenarios when unlocking a door, for example. Designs can be compared in different environments, such as landscapes and interiors. They can also be lit in a photo studio scenario to explore marketing potential. 3D visualization can be accessed by anyone via a real-time application or rendered images.
So now I want to give you a quick overview of what VRED offers. Autodesk VRED covers the whole product lifecycle, from early design concepts to engineering, production, factory planning, and marketing activities, as you can see from this picture.
VRED delivers state-of-the-art real-time rendering, as well as full GI photorealistic offline renderings. It offers advanced interaction possibilities via Python scripting, and extended reality and collaboration tools are integrated into the software out of the box.
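To give a rough idea of what that Python interface can look like in practice, here is a minimal sketch that steps through prepared design variants and saves a snapshot of each one for a review. The variant set names and the output path are assumptions for illustration, and the calls follow VRED's classic Python API, so please verify them against the documentation of your VRED version.

```python
# Minimal sketch, intended to run inside VRED's Script Editor.
# The variant set names and output path are hypothetical -- adapt them to your scene.
variant_sets = ["Exterior_ProposalA", "Exterior_ProposalB"]

for vset in variant_sets:
    # Switch the whole scene to this design proposal (classic VRED API call).
    selectVariantSet(vset)
    # Render the current view to an image file for the review deck;
    # check the exact createSnapshot signature in your VRED version.
    createSnapshot("C:/review/" + vset + ".png")
```

In the same way, scripts like this could be used to trigger the variant switches or light animations mentioned earlier during a review session.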
Now I want to talk about the benefits of XR compared to conventional 3D visualization. Thanks to the three-dimensional presentation of the product in XR, more realism is achieved. You can evaluate surfaces and gaps better and get a realistic idea of how the product will really look. As a user, you are more involved and get a better emotional understanding of the product.
Thanks to fully immersive tracking technology, you can conduct realistic ergonomic studies. Compared to a traditional CAVE, for example, XR is less expensive. It takes less space and can be used more flexibly, such as at trade fairs, to gather feedback or present new design ideas. During a virtual collaboration, you have improved decision making. You can combine the real world with the virtual and take advantage of both.
Now I come to the first example, evaluation in VR. The setup is a head-mounted display connected to a high-end PC with a good graphics card, plus simple tracking of the body position and head movements. It can be inside-out tracking, as you know it, for example, from the HP Reverb G2, or outside-in tracking, such as the HTC Vive with lighthouses offers.
VR controllers are optional to track your hand positions and represent your hands in virtual reality. This leads to an even better immersion. Use cases are the evaluation of surfaces and materials. Plus, you can visualize the light design of the car and view it in VR-- something a clay model, for example, doesn't offer.
The second example is about evaluation in MR, starting with MR in the interior. The setup is an HMD, complex tracking of body, head movements, and hands, plus a seating buck. A seating buck is a frame to which the steering wheel and seat are attached. Additional 3D-printed parts of the car, such as the dashboard, center console, and doors, can be mounted to the seating buck.
This provides a haptic representation of the surface in mixed reality so that the user can touch and feel the design, not just see it. Use cases are the evaluation of materials and surfaces, HMI concepts, ambient lighting, and ergonomic studies. For HMI design, you can test the display content, programmed with HTML5 or Qt Designer, directly in your VRED scene.
With hand tracking enabled and a physical representation of your car, such as 3D-printed surfaces of the dashboard and doors, you can explore the whole interior, including the display content. Also, the ambient light and the illuminated icons of buttons can be viewed in mixed reality if there is a night scenario of your design prepared in VRED.
You can also calculate realistic light simulations if you use the new light baking capabilities of VRED, as well as animated welcome scenarios. Test the accessibility of the gearshift, sun shades, and storage trays, all virtually, with realistic movements applied. The setup for evaluation of the exterior in mixed reality looks similar, with the difference that you don't need the seating buck.
The need for a physical representation depends on your use case-- it is less important for the exterior-- and the hand tracking can be done with VR controllers. Use cases are the comparison of facelifts: you can view and evaluate virtual facelift parts on a real existing car before the design changes go into production.
A virtual car can be viewed in a real environment, or you can evaluate virtual materials on a physical representation of the car, such as a clay model. With advanced hand tracking methods enabled by [INAUDIBLE], you can also view [INAUDIBLE] in VR.
Next I want to talk about virtual collaboration. What you need for a virtual collaboration is a high-end PC or laptop, additionally an HMD for joining the collaboration in VR, and of course a stable internet connection. The benefit is easy access for any person: even without being a visualization expert, you can access a virtual collaboration with a prepared data set. Participants can access from anywhere, no matter where they are located.
You can collaborate with your team in the United States, Europe, and Asia and share your ideas. Several experts from different departments can collaborate simultaneously on the same data set. You can take any point of view and explore the product together. This enables a closer collaboration.
So finally, let's talk about cloud-based workflows. Apart from the device you are using, a fast wireless internet connection, such as a 5G mobile network, is the most important aspect of your setup to join the cloud. The main benefit is that there is no need for expensive local hardware anymore. Users can access from multiple devices, such as tablets, mobile phones, HMDs, or laptops.
With untethered devices, there is no need for annoying cables anymore. The content can be streamed directly to a wireless head-mounted display-- no tangle of cables, no complex setups with external triggers and special mountings on the ceiling to deal with the cables, no limit of space. The user doesn't get pulled back by the cable attached to the HMD on their head.
Cloud-based workflows enable broad participation. Users can access from anywhere and are able to participate from home offices. Maybe this is still difficult in some areas-- in Germany, for example, we don't have the best wireless coverage right now-- but I'm pretty sure this will also change in the future. Your vision can be shared with anyone, no matter if you are talking about customers, colleagues, or other stakeholders.
The possibilities of sharing your designs are endless. Rendering power can be scaled on demand: if you want to evaluate your VRED scene with realistic reflections, lighting, and shading, you just need to unlock more [INAUDIBLE], and you can evaluate your design in real-time raytracing. Or if you have a board presentation, you can render an offline animation overnight and present it the next day.
So now I come to my conclusion. To sum it up, XR is a broad field that involves different technologies and personas. Extended reality connects different departments and people across the whole product lifecycle. VRED as a software has been a pioneer in product visualization for over 20 years and will continue to develop state-of-the-art solutions for extended reality.
At Autodesk, we are working closely together with our customers and partners to develop the best tools for XR based decision making in automotive. Thank you very much for your attention, and I'm looking forward to answering your questions in the Q&A session afterwards.