AU Class

Moldflow Injection Molding Simulation: Research and Development update


Description

This presentation will introduce recent and current research topics related to the Autodesk Moldflow injection molding simulation solvers, including the prediction of final part shape after molding, the behavior of the polymer melt during cavity filling, packing, and cooling, and the prediction of visual defects which may result in the molded part. In particular, improvements in the warp prediction accuracy for 3D model geometries through the use of measured shrinkage calibration data will be discussed, as well as the elimination of model size restrictions for 3D warp analysis. Improvements in the accuracy of Dual-Domain corner-effect warpage prediction will be demonstrated, along with options for examining plastic part deformation after assembly.

Key Learnings

  • Access new and improved capabilities in the Moldflow Injection Molding simulations
  • Anticipate material characterization requirements
  • Visualize prediction of additional visual defects
  • Analyze larger, more complex model geometries

Speaker

  • Franco Costa
    Dr. Franco Costa is Research Director for Moldflow injection molding simulation at Autodesk, Inc. With 29 years of experience with Moldflow simulation software, he has contributed to the technologies of 3D flow simulation, thermal analysis, crystallization analysis, structural analysis, final net-part shape prediction, and multiphysics for the plastic injection molding simulation industry. Franco has moved through roles as a research engineer, development team leader, and manager, and he directs research projects for the Autodesk Moldflow group as well as acting as an internal reviewer and technology architect. Franco has presented at academic conferences in the field of polymer processing. He also acts as a referee for international journals and often presents overviews of Moldflow simulation research and technology directions at Moldflow user meetings. Franco is based in the Autodesk Research and Development Center in Melbourne, Australia.
      Transcript

      TILLMANN DORSCH: Hello. And welcome to my presentation about XR-based decision-making in automotive. My name is Tillmann Dorsch. And I'm a product owner for visualization in the Autodesk Revit team. I have a background in 3D visualization. And I have been working over the past 10 years as a 3D artist, visualization expert, and consultant, mostly in automotive, creating visuals, securing virtual processes, and managing car configurators.

      First I will give you a brief description about what XR is and explain the different terms of virtual, augmented, and mixed reality. Later I will explain the benefits of XR in the automotive industry, both the general benefits of 3D visualization and the benefits of extended reality compared with conventional 3D visualization.

      I will cover different use cases of XR in automotive, and will explain how evaluation in VR and MR can be done. I will give an example of virtual collaboration and explain the benefits of cloud-based workflows. So what is XR? XR is an umbrella term for virtual, augmented, and mixed reality. The X can be considered a variable representing any current or future spatial computing technology.

      VR is fully immersive. It's all virtual. And there's no connection to the real world. An example can be an application where you wear an HMD, a head-mounted display. Or you are sitting on a chair while experiencing the virtual reality application. In addition to industrial solutions, games are also a very popular example of how VR is used.

      Augmented reality is an overlay on the real world. It can be a smartphone display where a 3D image pops out in front of a two-dimensional card catalog. The most famous example of an AR application is probably Pokemon Go, as you may all know, where you collect virtual Pokemon in real life.

      Another example of augmented reality is head-up displays that are used for navigation in cars, in cockpits of planes, or are integrated into a helmet. Another very popular example of augmented reality is filters, as you can see on my video now. So there are these funny AR goggles. You can just play around with them if you have time. It's fun.

      Mixed reality is the combination of the real and the virtual world. It can be a seating buck, where you sit in a virtual car but feel the real steering wheel. It can also be your own hands visible in VR in front of your HMD, such as the XR-3 device offers. Or simply a virtual car next to a real car seen through a tablet.

      The general benefits of 3D visualization are cost reduction, increased speed, and more flexibility. In the virtual design validation process, you can reduce the number of clay and cubing models by using virtual prototypes instead. Clay models are required to secure the design process, but are very expensive. Cubing models are necessary to secure the engineering process, but are even more expensive.

      Generally, you need fewer prototypes and less hardware if you rely on 3D visualization. There is less need for real prototypes if you visualize your design and engineering with VRED. It's a time saver. You don't have to wait for the hardware to get built. You can decide based on your 3D visualization.

      Design, engineer, and visualize. The data is already created with 3D software anyway and is already there. Just leverage it. Visualize all possible product variants in one single file, as well as different proposed design and engineering solutions during the development process. You can simulate the front and rear lights of the exterior, the ambient lights in the interior, and complex light animations, such as welcome scenarios when unlocking a door, for example.

      Designs can be compared in different environments, such as landscapes and interiors. They can also be lit in a photo studio scenario to explore marketing potential. 3D visualization can be accessed by anyone via a real-time application or rendered images.

      So now I want to give you a quick overview of VRED. Autodesk VRED covers the whole product lifecycle from early design concepts to engineering, production, factory planning, and marketing activities, as you can see from this picture.

      VRED delivers state-of-the-art real-time rendering, as well as full-GI photorealistic offline renderings. It offers advanced interaction possibilities via Python and enables extended reality and collaboration tools out-of-the-box, integrated in the software.
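      The kind of script-driven interaction mentioned here, such as switching a car's configuration live during a review, can be sketched as a small standalone example. Note that the class and function names below are hypothetical stand-ins for illustration, not the actual VRED Python API.

```python
# Standalone sketch of script-driven variant switching, the kind of
# automation a Python-scriptable visualization tool enables.
# All names below are hypothetical, not the real VRED API.

class VariantSet:
    """A named group of mutually exclusive scene states."""

    def __init__(self, name, states):
        self.name = name
        self.states = list(states)
        self.active = 0  # index of the currently selected state

    def select(self, state):
        if state not in self.states:
            raise ValueError(f"{self.name}: unknown state {state!r}")
        self.active = self.states.index(state)

    @property
    def current(self):
        return self.states[self.active]


# Toggle through exterior paint variants as a presenter might do live.
paint = VariantSet("exterior_paint", ["red", "silver", "blue"])
paint.select("silver")
print(paint.current)  # silver
```

      In practice, such a script would be bound to a menu entry or a VR controller button so reviewers can flip variants without leaving the headset.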

      Now I want to talk about the benefits of XR compared to conventional 3D visualization. Thanks to the three-dimensional presentation of the product in XR, more realism is achieved. You can evaluate surfaces and gaps better and get a realistic idea of how the product will really look. As a user, you are more involved and get a better emotional understanding of the product. Thanks to fully immersive tracking technology, you can conduct realistic ergonomic studies.

      Compared to a traditional CAVE, for example, XR is less expensive. It takes less space. And it can be used more flexibly, such as at trade fairs to gather feedback or present new design ideas. During a virtual collaboration, you have improved decision-making. You can combine the real world with the virtual and take advantage of both.

      Now I come to the first example, evaluation in VR. The setup is a head-mounted display connected to a high-end PC with a good graphics card, plus simple tracking of the body position and head movements. It can be inside-out tracking, as you know it, for example, from the HP Reverb G2, or outside-in tracking, such as the HTC Vive with Lighthouses offers.

      Optionally, VR controllers track your hand position and present your hands in virtual reality. This leads to an even better immersion. Use cases are the evaluation of surfaces and materials. Plus you can visualize the light design of the car and view it in VR, something a clay model, for example, doesn't offer.

      The second example is about evaluation in MR. Starting with MR in the interior, the setup is an HMD; complex tracking of body, head movements, and hands; plus a seating buck. A seating buck is a frame to which the steering wheel and a seat are attached.

      Additionally, 3D-printed parts of the car, such as dashboards and center consoles, can be mounted to the seating buck. This provides a haptic representation of the surface in mixed reality so that the user can touch the design, not just see it.

      Use cases are the evaluation of materials and surfaces, HMI concepts, ambient lighting, and ergonomic studies. For HMI design, you can test a click dummy programmed in HTML5 or Qt Designer directly in your VRED scene, with hand tracking enabled and the physical representation of your car, such as 3D-printed surfaces of the dashboard and doors. You can explore the whole interior, including display content.

      Also, the ambient light and illuminated icons of buttons can be viewed in mixed reality, if a night scenario of your design is prepared in VRED. You can also calculate realistic light simulation if you use the new light baking capabilities of VRED, as well as animated welcome scenarios. Test the accessibility of the gearshift, sunshades, and storage trays, all virtually with realistic materials applied.

      The setup of an evaluation in mixed reality on the exterior looks similar, with the difference that you don't need the seating buck. The need for a physical representation depends on your use case. Haptics are less important on the exterior. And the hand tracking can be done by VR controllers.

      Use cases are the comparison of the real with the virtual. You can view and evaluate virtual facelift parts on a real existing car before the design changes go to production. A virtual car can be viewed in a real environment. Or you can evaluate virtual materials on a physical representation of the car, such as a clay model. With advanced hand tracking methods enabled by [INAUDIBLE] and [INAUDIBLE], you can also view real hands in VR.

      Next I want to talk about virtual collaboration. What you need for a virtual collaboration is a high-end PC or laptop. Additionally, an HMD for joining the collaboration in VR. And of course, a stable internet connection. The benefits are easy access for any persona. Even non-visualization experts can access a virtual collaboration with a prepared data set.

      Participants can access from anywhere, no matter where they are located. You can collaborate with your team in the United States, Europe, and Asia, and share your ideas. Several experts from different departments can collaborate simultaneously on the same data set. You can take another point of view and explore the product together. This enables a closer collaboration.

      So finally, let's talk about cloud-based workflows. Apart from your device, a fast wireless internet connection, such as a 5G mobile network, is the most important aspect of your setup to join the cloud. The main benefit is that there is no need for expensive local hardware anymore.

      Users can access from multiple devices, such as tablets, mobiles, HMDs, or laptops. With untethered devices, there is no need for annoying cables anymore. The content can be streamed directly to a wireless head-mounted display. No [INAUDIBLE] or cables. No complex setups with external trackers and a special mounting on the ceiling to deal with the cables. No limit of space, so users don't get held back by the cable attached to their HMD.

      Cloud-based workflows enable broad participation. Users can access from anywhere and are enabled to participate from home offices. Maybe it is difficult in some areas, such as Germany. We don't have the best wireless coverage right now. But I'm pretty sure this will also change in the future. Your vision can be shared with anyone, no matter if we are talking about customers, colleagues, or other stakeholders.

      The possibilities of sharing your designs are endless when the power can be scaled on demand. If you want to evaluate your VRED scene with realistic reflections, lighting, and shading, you just need to unlock more render nodes. And you can evaluate your design with real-time GPU ray tracing. Or if you have a board presentation, you can run an offline animation overnight and present it the next day.
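      The overnight-animation idea works because frames are independent and can be dealt out across on-demand render nodes. The sketch below shows one simple round-robin split; the scheduler itself is hypothetical, since a cloud rendering service would normally handle this for you.

```python
# Sketch: splitting an offline animation across N on-demand render
# nodes. Frames are assigned round-robin so every node gets a
# near-equal share. Hypothetical scheduler, for illustration only.

def assign_frames(first, last, nodes):
    """Map node index -> list of frame numbers that node renders."""
    jobs = {n: [] for n in range(nodes)}
    for i, frame in enumerate(range(first, last + 1)):
        jobs[i % nodes].append(frame)
    return jobs

# 10 seconds of animation at 24 fps, spread over 4 render nodes.
jobs = assign_frames(1, 240, nodes=4)
print(len(jobs[0]))  # 60
```

      Doubling the node count roughly halves the wall-clock render time, which is exactly the on-demand scaling argument made above.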

      So now I come to my conclusion. To sum it up, XR is a broad field that involves different technologies and personas. Extended reality connects different departments and people across the whole product lifecycle. VRED as a software has been a pioneer in product visualization for over 20 years and will continue to develop state-of-the-art solutions for extended reality.

      At Autodesk, we are working closely together with our customers and partners to develop the best tools for XR-based decision-making in automotive. Thank you very much for your attention. And I'm looking forward to answering your questions in the Q&A session afterwards.