Description
Customers in the AEC and Film & Television industries are transforming traditional workflows to accelerate the production of linear and non-linear, interactive content. From using virtual production technology on The Mandalorian, to building a digital twin for Wellington City for public engagement, we’ll explore how real-time technology opens the door to a new world of creative opportunities. This session will dive into how Datasmith is connecting the dots with Autodesk solutions like Forge, Revit, 3ds Max, Navisworks, and Maya. You’ll find out how Twinmotion and Unreal Engine are used to collaborate and engage with stakeholders and peers using Pixel Streaming and other methods. It isn’t all about AEC, as the session will also cover how real-time technology is changing the way films and TV shows are made. And while there won’t be enough time to get into all of the technical details, we’ll point you to free resources to learn how to achieve the same results.
Key Learnings
- Find out how Twinmotion and Unreal Engine are used to collaborate and engage with stakeholders and peers
- Hear about research we're doing around digital twins that can really change your game
- See how real-time technology is changing the way films and TV shows are made
- Learn how Datasmith connects the dots with Autodesk solutions
Speakers
- Jordan Thistlewood: I started my career in Theatrical Design and Production but soon fell in love with VFX and animation. I was lucky enough to get my start working as a lighting artist on Rolie Polie Olie for Nelvana using PowerAnimator. Over the years I took on many roles in production before moving to software production in 2012. It is in working with the teams that make the tools artists use that I found the ability to share the needs of artists. The most exciting thing about the future is that together we have the opportunity to change how stories are told in visual form, connect people, and empower a whole new generation of filmmakers.
- David Weir-McCall: With a background in both architecture and technology firms, David works as the Industry Marketing Manager at Epic Games. In this role, he is responsible for overseeing Epic Games' ecosystem marketing strategy and driving marketing activities in the Real Estate and Architecture vertical. David has gained extensive experience working with architectural engineering firms and design consultants over the past decade. His primary focus in the architecture industry has been on promoting the adoption of new design tools into architectural workflows and supporting the development of digital technologies within the industry. At Epic Games, he continues to advocate for the use of the Epic ecosystem in architecture and real estate and strives to push the boundaries of how these tools can enhance design and bring clients and end-users closer to the design process.
DAVID WEIR-MCCALL: Hello there, and welcome to today's session on how Epic Games customers are transforming how they work using real-time technologies. My name is David Weir-McCall. I'm the AEC business development manager at Epic Games. And I'm joined today by Jordan Thistlewood. Hey, Jordan.
JORDAN THISTLEWOOD: Hey, David. So I'm Jordan Thistlewood, product management director for virtual production. And we'll talk a bit about what that actually means later on.
DAVID WEIR-MCCALL: Fantastic. So today, we're really going to be talking about both the architecture, engineering, and construction side, but also customers on the virtual production side as well. And this is because the Unreal Engine and Epic Games tools are used broadly across a number of different industries-- everything from games, to automotive, broadcasting, live events, training, and simulation. But we're going to focus just on the architecture and virtual production sides for today to keep it simple.
Now, for those of you who maybe are less aware of who we are or what we do, our main goal in this space is enabling content creation for creators. And we're building this ecosystem and this family of tools to really help aid that creation-- everything from the Unreal Engine, which is obviously used to create games but is also being used elsewhere, through to acquiring and working with Quixel, who have an amazing Megascans library of 3D assets and materials, and RealityCapture, who specialize in photogrammetry from scans and lidar, all the way through to the distribution part with Sketchfab. So the entire ecosystem is there to enable you, the creator, to get your content created and distributed as easily as possible.
Now, today we're not going to go over all of them, just because we're a little bit tight on time. But we are going to focus a little bit more on how Twinmotion and the Unreal Engine are being used in these two spaces-- both in the AEC pipeline or BIM life cycle, which we're all very familiar with, and on the virtual production side as well. And with both of these, it's always about trying to merge the digital experience with the physical experience-- trying to bring these two worlds closer together, and to push that overlap beyond static 2D images or videos toward a bigger overlap that helps you, the creator, explore more of what we like to call what-ifs per hour, to explore those designs and those outcomes.
So this is what real-time rendering technology provides you. It's there to be a fast, iterative design companion that works next to your design tools. So we're going to start with Twinmotion. Twinmotion is our real-time architectural visualization tool-- "in a few clicks" is how we generally describe it. It's there to work alongside your proprietary architectural tools, like Revit, and it's a way of populating and creating high-visual-fidelity scenes instantaneously. It gives you access to a huge library of smart assets-- vegetation that grows and changes with the weather, cars that move, people that walk around-- so you can take your architectural designs and populate them as quickly and easily as possible, to then share with [AUDIO OUT] customers at the end.
Now, with Twinmotion, we generally focus on three core areas, which we believe are probably the most important to the AEC industry: data aggregation-- how Twinmotion links to your proprietary design tools; collaboration-- how we share those models and visualizations with end users or clients; and lastly, a connected ecosystem-- making sure your model doesn't just live there, but can go on and do many more things. So these are the three areas we're going to touch on very quickly.
Starting with the data aggregation part. We have this amazing tool called Datasmith, built into the Unreal Engine, that allows you to take your models from your proprietary tools, like Revit and 3ds Max, through into the Unreal Engine and Twinmotion, so you can then start to build on top of them. And this is a very intelligent data importation process: it transfers metadata across, and it converts your Revit assets into Unreal Engine assets, so you limit the amount of rework you have to do when you hit the real-time visualization side.
And what's really great about the data aggregation side is that it brings in not just one program, one piece of software-- it brings in many. So you can have, say, buildings from Revit, landscape from 3ds Max, and urban elements from AutoCAD, all the way through to Inventor and Navisworks. And you can aggregate them all into a single source so you can create an experience around it. So we work really well with the large Autodesk ecosystem of tools inside both Twinmotion and Unreal Engine.
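As a rough mental model of what this aggregation step does-- purely illustrative, not the real Datasmith API-- each authoring tool contributes tagged assets that are merged into one scene while their source and metadata travel along:

```python
# Illustrative sketch only: models the *idea* of Datasmith-style aggregation
# (merge assets exported from several authoring tools into one scene, keeping
# each asset's metadata and source application attached). All names here are
# hypothetical; this is not Epic's actual API.

def aggregate(*exports):
    """Merge per-application export lists into a single scene dictionary."""
    scene = {}
    for export in exports:
        for asset in export["assets"]:
            # Keep the source tool and original metadata with each asset,
            # so nothing is lost on the way into the real-time scene.
            scene[asset["name"]] = {
                "source": export["tool"],
                "metadata": asset.get("metadata", {}),
            }
    return scene

revit = {"tool": "Revit", "assets": [
    {"name": "Tower_A", "metadata": {"category": "Building", "levels": 42}}]}
max_export = {"tool": "3ds Max", "assets": [
    {"name": "Landscape", "metadata": {"area_m2": 12000}}]}
autocad = {"tool": "AutoCAD", "assets": [{"name": "Roads"}]}

scene = aggregate(revit, max_export, autocad)
```

The point is simply that nothing is flattened away on import: every asset in the merged scene still knows which tool it came from and what its BIM metadata said.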
And this is what Zaha Hadid Architects and a number of other architectural firms are doing. They take the different design tools they're using, bring their models into the engine, and create these highly detailed, high-visual-fidelity renderings very, very quickly. And it doesn't matter how big your project is-- we're very good at handling large sizes-- but generally, you can populate a scene with people and daylight simulations within a few clicks. And they use this not only on one project but on a variety of different ones, from the Honduras project, which we'll see a little bit later on, to smaller residential and urban projects as well.
And what's great about Datasmith and these data importation tools is that they're phenomenally good at handling size, scale, and fidelity. This is an example from CAD Center, who have brought the entirety of Osaka city into Twinmotion. So this isn't just a single building-- this is an entire city-scale model brought into Twinmotion, which allows them to create very high quality video animations very, very quickly.
And so, data aggregation is a huge part of what we like to do within Twinmotion. And working with the Autodesk ecosystem is really important to us, which is why later on-- I think actually this month-- we're starting our beta program looking at how we ingest and work with Autodesk Forge inside Twinmotion and the Unreal Engine. So if you're interested in joining the beta, please just email us at forge@epicgames.com, and you'll be able to sign up and join us on this journey.
So the next part we want to cover is the collaboration aspect-- how we can collaborate and share our work with others when we're using Twinmotion. And we have this amazing tool within Twinmotion called Twinmotion Cloud Presenter, which allows you to take your Twinmotion model and publish it online using a technology called Pixel Streaming. Pixel Streaming retains the size, scale, and fidelity of your model, but runs it off a cloud GPU.
And it means that on any device-- you don't need a gaming laptop; it can be a tablet, it can be a 12-year-old laptop-- you're able to experience and look at that 3D model in all its detail. And so, the ability to share these models is paramount to why we build them up visually, and to how you can best get an idea across to the end client.
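A toy sketch of the Pixel Streaming idea may help-- this is conceptual only, not Epic's implementation (which streams encoded video over WebRTC): the GPU-heavy rendering stays on the server, and the client only exchanges lightweight input events and finished frames, which is why a tablet or an old laptop is enough:

```python
# Conceptual sketch of the Pixel Streaming split. The server stands in for
# the cloud GPU running the full 3D scene; the client stands in for any thin
# device. Class and method names are hypothetical.

class RenderServer:
    """Holds the full-fidelity scene and does all the rendering."""
    def __init__(self, scene):
        self.scene = scene
        self.camera = [0.0, 0.0]

    def handle_input(self, dx, dy):
        # Input events from the client move the camera server-side.
        self.camera[0] += dx
        self.camera[1] += dy

    def render_frame(self):
        # A real server would encode a video frame here; a small string
        # stands in for the encoded image.
        return f"{self.scene} @ cam={self.camera}"

class ThinClient:
    """Any device with a screen and a network connection."""
    def __init__(self, server):
        self.server = server
        self.last_frame = None

    def drag(self, dx, dy):
        self.server.handle_input(dx, dy)              # send input up
        self.last_frame = self.server.render_frame()  # receive pixels back

client = ThinClient(RenderServer("Twinmotion model"))
client.drag(1.0, 0.5)
```

Note that the client never touches the scene data itself-- model size and fidelity live entirely on the server, so the viewing device's hardware no longer matters.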
And this is what we see happening with a number of different architectural firms. BIG recently developed an AI city for Terminus Group in China, where they aggregated all the different sources inside Twinmotion and built this video animation showcasing what this future building and city would look like. So the tools for collaboration don't just stop at a picture or a video-- they can go all the way through to a cloud sharing application service.
And lastly, the big thing about Twinmotion is its connected ecosystem-- how you work from your BIM applications all the way through, and what's been described as a model afterlife, where you can create quick architectural visualizations in Twinmotion, but that model can then live on, go on to do something else, and become more advanced inside the Unreal Engine itself. So we have this pipeline that allows you to work from your BIM applications, like Revit, into Twinmotion, and then from Twinmotion into the Unreal Engine. That not only carries your model through, but also helps connect teams-- architects and engineers working better with visualization and design technologists.
So there's a whole ecosystem behind this, which we spoke about at the beginning of the session. KPF are just one firm who utilize our Quixel asset library-- those high-fidelity, high-quality assets we spoke about-- inside their Twinmotion models to create quick visualizations for the customer or client at the endpoint, and then make quick design iterations that link back to their proprietary design tools. And then they take that model into the Unreal Engine so it can live on and be something else, whether it's a more collaborative experience or a digital twin.
And this leads me nicely onto the Unreal Engine side. Now, the Unreal Engine is the game engine platform used by the games industry to create an amazing host of games, from Fortnite to Infinity Blade to Gears of War. But it's also being used elsewhere, across industries, and we've generally come to talk about it as this advanced 3D real-time creation platform. So in the AEC space, it's where you take your model and add that extra layer of creation or control-- whether that's a collaboration tool, where you can welcome multiple people into the same space at the same time from across the world, all the way through to integrating IoT platforms into the engine to create digital twins.
In fact, there's a whole host of different areas where we see people using the Unreal Engine-- everything from virtual selling within real estate, to collaboration, to open worlds, to perfect pixels and training, all the way through to digital twins. So I want to take the remaining time to run through a couple of examples, starting with virtual collaboration-- the ability to work with people or share projects when we're not able to be together, which is very prevalent in this day and age.
And this is exactly what Squint/Opera built with a tool called SpaceForm. SpaceForm uses the Unreal Engine to allow people to mingle and be together in a virtual space, but also to dissect and go into the model and view a variety of different things-- different data sets, different design options-- while you, me, and Jordan can all be in that space at the same time, exploring the exact same outcome. And this idea of collaboration is what game engines are built for-- bringing people together for collaboration and communication purposes.
The next one is slightly newer, which is digital fabrication. We're seeing more and more of this, where people use the Unreal Engine to create a front-facing user interface that lets users interact with their designs and work with a number of different prefabricated panels to design their future homes. And this is exactly what Zaha Hadid Architects built in Honduras. They built this prefabricated building configurator, where the end user, through a web application, can go in and customize their own home, which then links with a digital manufacturing backend-- so they can get the price, get those components fabricated, and have them sent to site and constructed.
So we're seeing a lot in the digital manufacturing area. But the other side, which again plays a huge part in what the Unreal Engine is built for, is this idea of perfect pixels-- actually creating these very high-end, high-visual-fidelity images. And this is what Jacobs use for showcasing things like the new dual carriageway at the Chelmsford link, where you can go in and view what the infrastructure and transportation route would look like, to give context to their designs.
And we're very excited about what Unreal Engine 5 is going to be able to produce as we incorporate things like Lumen and Nanite. Lumen gives you real-time global illumination, so you can create these high-fidelity, high-quality scenes and modify them in real time without ever having to re-bake your lighting. So we're very excited about the different visual fidelity aspects coming with Unreal Engine 5.
And lastly-- just to finish up-- we want to talk a little bit about digital twins. Whenever we talk about digital twins, we talk about the idea of having a 3D model of a physical asset, but with some form of live or continuous data coming into it-- so it's going to have this IoT link. And the reason we see people utilizing the Unreal Engine in this space is the engine's ability to contextualize data-- to have control over how that data is ingested and viewed by the end user. So in things like CFD or wind analysis, you get complete control over how the end user experiences and views it.
Of the different components of that, we've already seen the data ingestion part, with tools like Datasmith that link into Navisworks, Revit, and 3ds Max. So we already have the 3D digital version of the physical asset. What we now need to add is the IoT integration, to make sure we can take in that live data regardless of where it's coming from-- AWS, Azure, or your own personal data hub. And then the Unreal Engine creates this user interface wrapped around it, so you can customize how the end user views that digital twin.
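Those three ingredients-- the 3D model, the live IoT feed, and the user-interface wrap-- can be sketched in miniature like this. All names are hypothetical, and the `ingest` call stands in for polling AWS, Azure, or a city data hub:

```python
# Minimal sketch of the three digital-twin ingredients: static geometry
# (from BIM/GIS), a live IoT feed, and a UI layer that pairs the two.
# Purely illustrative; not any vendor's actual API.

class DigitalTwin:
    def __init__(self, model_assets):
        self.model_assets = set(model_assets)  # the 3D side, already imported
        self.live_data = {}                    # latest IoT reading per asset

    def ingest(self, readings):
        # Accept readings only for assets that exist in the 3D model, so the
        # UI never shows data with no geometry to attach it to.
        for asset, value in readings.items():
            if asset in self.model_assets:
                self.live_data[asset] = value

    def overlay(self, asset):
        # The "user interface wrap": pair geometry with its latest value.
        return (asset, self.live_data.get(asset, "no data"))

twin = DigitalTwin(["CarPark_North", "BusStop_12"])
twin.ingest({"CarPark_North": "37 spaces free", "Unknown_Sensor": 99})
```

In a real engine integration the `overlay` step would drive widgets or material parameters in the 3D view; the structure-- geometry plus live data plus a presentation layer you control-- is the part that carries over.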
And this is exactly what Buildmedia have done with their digital twin of Wellington City, where they've taken the 3D, GIS, and BIM models from the city, mixed them with IoT data from the Wellington City Council, and created a front-facing public digital twin that end users can explore to see these data sets in context. And that can be everything from bus routes, to what parking is available, to when flights are coming in and out of the city. It's a public-facing engagement tool that mixes those three elements-- the 3D model, the IoT, and the user interface-- all in one.
And if you're interested and excited about exploring digital twins further, I'm excited to say there is a great tool in production by WSP, Microsoft, and ourselves, which is going to link the Unreal Engine to the Microsoft Azure Digital Twins platform-- allowing you to take the complexity and sophistication of Azure's IoT platform and mix it with the control and visualization abilities of the Unreal Engine. You'll hear and see more about that later in the year. And so, I'd love to take this moment to hand over to Jordan, who is going to talk about the Unreal Engine in virtual production. Over to you, Jordan.
JORDAN THISTLEWOOD: Thank you, David. That was really exciting. And what I'd like to share with everybody is, just keep in mind, that as we go through the section about virtual production, that everything that David talked about in terms of Epic tools, in terms of data aggregation, the visualization, the collaboration, both in the virtual world and within the editor sessions, things like that-- all these techniques are available to you in the virtual production workflows.
And this is the exciting thing: the tools used in architecture, engineering, and construction are the same tools used in film and TV production and in live entertainment. There might be more of an emphasis on certain workflows and certain tools depending on the industry. But effectively, you have this wonderful giant melting pot of influence that creates better and better features and allows for more workflows.
And so, as the fidelity of architectural visualization increases, that's being informed by the demands of high-end film and TV-- and so on, and so forth, with data scalability and all those things flowing back into film and TV from AEC. So there's a lot of cross-pollination of these ideas. And the common thread is that there's a group of tools between Autodesk and Epic Games that brings all this together and forms these workflows.
So let's take a quick peek. When I said earlier that I'd explain what I meant by virtual production, here's really what we consider it to be: the use of real-time tools in animation, visual effects, broadcast, live entertainment, and in-camera visual effects. And when we talk about in-camera visual effects, we're referring to someone using a wall of LEDs, in any kind of configuration, and shooting against it to create digital backdrops-- extending their physical sets into a virtual world. So it's the exciting latest trend in visual effects and filmmaking, but it's also an area that has been investigated by many people in the AEC business as well.
But today, really what I want to talk about is the similarities in animation visual effects and the in-camera visual effects side of things. So if we look at a very abstract or distilled view of live action production, you can see that Unreal covers a wide range of the possible workflows, allowing for basically cyclical collaborative development of ideas shared through a linear flow from the start of a project to the end of the project. We look at animation production-- different take on the same idea.
But again, Unreal Engine spans a number of collaborative areas which previously would have been segmented into separate departmental steps in a production pipeline. That segmentation creates its own friction: you make a choice, discover something new, cycle back to a previous step with a different group of people who may be using different tools, and then cycle forward again. Working in a real-time tool like Unreal Engine allows for more collaborative, cyclical development, where multiple parties can work on the same project at the same time-- all collaborating and contributing their work.
What will stand out in the workflows we'll talk about, though, is that the tools predominantly used are Unreal Engine, Maya, and MotionBuilder. And MotionBuilder comes in from the motion capture side-- either for tracking cameras and feeding that information into Unreal, or, more often than not, for actual performance capture. This is people wearing specialized suits with either tracking markers or active tracking within the suit itself; their motion is then recorded, processed through MotionBuilder, and sent into Unreal Engine for use. And this is the interesting trend: through this, a lot of animation production finds itself becoming more like what we know as traditional filmmaking. So it's an exciting world we're in right now.
So let's take a look at animation production. And we've now had CG animation production around for long enough, that we've kind of come to expect there's a certain level of style, of aesthetics, that blends between illustrative and photorealistic, pulling out the right choices in what people want to accentuate based on the look, the creative, and the story that they're trying to tell. And here, we have an example of a great project that has very much of a blend of a painterly style, but with a stop motion aesthetic. But again, created in full CG, executed within Unreal Engine.
But we're not limited to just those near photorealistic styles. You have the ability to then expand into more 2D stylized looks. Again, but still all in real time. And this is the really exciting part of this. So be it architectural visualization or animation production, the story you want to tell, be it about a building, large scale project, or an animation story, you have the ability through these tools to tell the story in a way that will communicate exactly what you want to share.
Now, we'll take a look at one of our projects. We collaborated with Weta Digital in New Zealand. They produced this Meerkat short. So we'll take a look at some of the areas where there's overlap in collaboration. First off, I wanted to share with you, what does it actually mean for an artist working in a tool set like this. And the idea of the perfect pixels that David shared, whereby you end up being able to consume and make choices on final pixels. You can dial up and down if you run into performance constraints. But effectively, you do have access to all the final pixels.
And here, we have an example of the Meerkat groom being worked on inside of Maya, using all the great tools there, then fed into Unreal, where we're able to dial in the final physical look of the Meerkat himself for this animated project. And in the end, you see it in context. You have the ability to work from a very focused look at the groom in Maya, through to the in-context final pixels inside of Unreal.
And then here is another aspect, where through the Maya Live Link system, an animator can work in both Maya, for high-quality animation workflows, and inside of Unreal. Here, you can see that the Meerkat has a different shape and volume compared to the final fur. So the animator is using the best of both tools-- high-quality animation tools inside of Maya, with the final pixels inside of Unreal-- to make sure he's choosing the right expressions and the right poses.
This is also accentuated by the fact that we continue to do a lot of work using open standards. So here we have the ability to use Alembic caches to import this grooming information from Maya to then drive the animation inside of Unreal-- quite important. We can also use all the great layout tools and workflows inside of Unreal to shape the environment, to dress it, and to share assets.
But using the open standard USD, we can then bring that into Maya itself. And the final result is the ability to create little shorts like this. This complete short is available on our YouTube channel, and the assets themselves are available in our marketplace. So if you want to explore for yourself what the parallels are between the workflows you use in AEC for visualization and video, and those of the [INAUDIBLE] or film and TV space, it's all available for you.
So let's take a look at something that I think is really interesting, which is what happens when you take CG tools that are used in animation, you start setting the target at photorealism, and you start incorporating visual effects techniques, all again, using the same tools that are used for architectural visualization. And to that end, what I want to do is quickly show you this little short film made by the Quixel team in 2019.
[VIDEO PLAYBACK]
[MUSIC PLAYING]
- Adaptation. The ability to learn from past experience. The use of knowledge to alter their environment. These virtues defined our creators, and drove them to the brink of destruction. But we cannot exist without them. We must save her. What if our creators exist within us? Humanity has always had the potential to recognize its flaws and choose a better way. Can we save humanity? Was bringing her here the right choice?
[END PLAYBACK]
JORDAN THISTLEWOOD: What a great little film. Every time I see it, I'm still shocked that it came out around two years ago-- so it was in production before that-- given the level of realism and how much you can buy into it. And again, this was all created full CG. I've got a quick behind-the-scenes video so we can look at some of the techniques the Quixel team used to do this.
So they went on a trip where they did photogrammetry scanning of the environment to build up the resources-- and this informs what is so cool about the Quixel Megascans library. There, you just saw them using virtual [INAUDIBLE] techniques, where they're treating this like a film shoot to get the effect they want. You can see how you can treat things with a camera and film-like effects, and how you have full control over your lighting and your environment.
And this is, again, all on assets that feed a photorealistic result. And this is all in your hands. And you're able to do this all within this Engine, that you can feed with these great pipelines that David spoke about. And apply the same techniques to your pitch films, to the marketing films, to everything that you need to create that has a similar overlap in purpose.
So what I'd like to do next is talk about visual effects and in-camera visual effects. Again, this builds on what happens when you take CG techniques and start crossing them with live action filmmaking. And to that end, my colleagues will explain best what is so exciting-- the amazing potential of this technology. And then we'll have a quick wrap-up at the end.
But before we do that, I wanted to call out one thing worth mentioning: Twinmotion, because of its ease of use, its approachability, and its pipeline into Unreal, is finding a strong home with production designers working on these live action films. They are using it to build out the total set, then working with the production teams to decide what's actually going to be physically built and what will be digital. And that can then be handed on to teams that use Unreal Engine to feed these LED walls.
[VIDEO PLAYBACK]
[MUSIC PLAYING]
- This movement started in 2016 as a way for the non-games community around Unreal Engine to get together and showcase all their great work. Usually, our Build events are done in front of a live audience, maybe up to 700 people. What's different this year is we're doing the event entirely digitally-- doing it here from the London lab, recording it all, and then broadcasting it on a platform where people can join and watch the event streamed live.
- Most of what I do as a creative director focuses on the purpose, the job in hand. Which in this case, was taking these guys who are used to presenting to 1,000 people on a stage, giving them something compelling to present to tens of thousands of people through a computer screen.
- So it's been a really good week at the Epic lab here in London. With the way the studio is set up, we've amazingly managed to create different angles, different moods, different movements, all using a dolly and some track that we actually haven't moved for the four days.
- We've got a curved main LED wall, made up of individual 500-millimeter LED modules, totaling in this case about four and a half million pixels. And with our camera tracking solutions, we can project effectively the perfect virtual rendition of our world space onto that wall, to get exactly the [INAUDIBLE], the camera angles, and the perspective we would have had we been there in the real world.
- One of the greatest things about it is that everyone can see straight at the camera what you're shooting. It helps the actors, because they can see what environment they're in. It helps the cinematographer massively, because you can see physical lights in the environment in real time, which is amazing, really.
- One thing that was pretty amazing for us was to be able to have that first frame, but then add some very specific lights on top of the car and get some really nice reflections. For us, being able to adjust it in the Engine, and get that next 10%, was really amazing.
- Being at the event is terrific. It was great to see everyone, for a start. But it's also great to actually be working against a live environment, and feel like you're in a world, rather than that all being put in later on.
- I think the big difference here is that we really can put people inside spaces. These aren't just backdrops-- we're putting them inside worlds; we're positioning them and moving everything around. There are ways we can work together before the shoot-- build it up, look inside the Engine, see where it's going to be. But then, when we're on set, we can make those adjustments, and we have that flexibility to really tune it up.
- You flip the whole world around, and you take your post-production, and you shift that. And it all just becomes production. And it means, by the time you're here, and you're doing an event, or you're on set of the film or television show using in-camera VFX, the world is being built.
- I'm most excited about the audience seeing this event and not feeling that they're at another virtual event. And I think that's going to carry over into the content, into the narratives, into the stories, and into them. And they will just sit and absorb and disappear off into the information. And that's actually the most important bit.
- I'm super excited about the future, both of Unreal and what we're doing at the Innovation Lab. People now know what it can achieve and are knocking on our door. And they want to achieve lots of things with it. And then, we're able to respond and look at creating benchmark examples and ways of proving certain types of new technique up in Unreal, that deliver and create new and interesting and creative experiences for people.
[MUSIC PLAYING]
[END PLAYBACK]
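The pixel count quoted in the video is easy to sanity-check. Assuming a 2.84 mm pixel pitch-- a common choice for in-camera VFX panels; the actual hardware isn't named in the talk-- each 500 mm module carries about 176 x 176 pixels, so a wall of roughly 4.5 million pixels works out to around 145 modules:

```python
# Back-of-envelope check of the LED wall described in the video. The 2.84 mm
# pixel pitch is an assumption (typical for ICVFX LED panels), not a figure
# from the talk.

module_size_mm = 500
pixel_pitch_mm = 2.84
total_pixels_quoted = 4_500_000

pixels_per_side = round(module_size_mm / pixel_pitch_mm)  # about 176
pixels_per_module = pixels_per_side ** 2                  # about 30,976
modules_needed = round(total_pixels_quoted / pixels_per_module)
```

With a finer pitch the same wall would need fewer modules for the same pixel count, which is why pitch, not just wall size, drives the budget on these stages.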
JORDAN THISTLEWOOD: So how about that. I think it's pretty amazing when you think about the opportunity you have as architects: your designs could be manifested on a wall. You could put actors on a treadmill, shoot them walking through your building, and present that to your clients and to the world-- and that building will still never have existed. But as far as they're concerned, it's a photorealistic manifestation, and it's there. And these techniques-- from what you use normally in your day to what the film and TV industry uses all the time-- all coalesce into one pool of tools for common purposes. Well, with that, I'd like to invite David back, and he can tell you all about what we've got in front of us right now on the screen.
DAVID WEIR-MCCALL: Thanks, Jordan. So yeah, a great video to lead on to this slide. If you're interested in the technology and the virtual production sets used in the automotive Build event, we actually have a Build: Architecture event coming up on November the 2nd. So if you're interested in learning more about Unreal Engine, Twinmotion, or digital twins, please feel free to come and join us at the event. There's a little link and a QR code there.
And the last thing, just before we sign off: the Unreal Engine is absolutely free. It's there for creators to download and use, and its full source code is available. Twinmotion has a free perpetual trial license; if you want to use it commercially, there's a small fee. But in general, if you want to learn more and get started, just head over to UnrealEngine.com. I think that's everything from us today. So we're going to head over to some Q&A and look forward to taking questions about both AEC and virtual production.
JORDAN THISTLEWOOD: We'll fast forward in time and see you in a second.
DAVID WEIR-MCCALL: See you in a second.