Description
Key Learnings
- Understand the future of VR
- Know more about VR ecosystem
- Discuss various VR applications
- Discuss VR workloads and challenges
Speakers
- Debra Goss-Seeger: I am a Product Owner for Autodesk LIVE, with a background in engineering.
- David Menard: David is a Software Engineer by training, with a Master's in Virtual Reality and an MBA from HEC Montreal. After spending a few years working in the video games industry, he joined Autodesk to kick-start the effort around the ambitious project that is LIVE and Stingray, focusing on Virtual Reality. His deep experience in Virtual Reality and real-time rendering technologies has served him well as a Product Owner, enabling LIVE to become the One-Click solution to VR that it is today.
- Scott Ruppert: Scott Ruppert is the Portfolio and Solutions Planning Manager for Lenovo Workstations. In this role he is defining Lenovo's strategy in key areas of emerging technology, specifically AR and VR.
- Scott DeWoody: Scott DeWoody has always had an affinity for art and technology. After seeing the animation being done through computers, he knew he could combine the two. In 2007 he graduated with a bachelor's degree in media arts and animation. He focused on lighting and rendering techniques using 3ds Max software, V-Ray for 3ds Max, and Adobe Photoshop software. A day does not go by when he is not using one of these applications. Image quality and workflow are the top priorities in his work. He is constantly studying color theory, composition, and new ways to produce the most effective results possible. He has worked at M. Arthur Gensler Jr. & Associates (Gensler) for the past 8 years as a visualization artist and manager. He has worked for numerous clients, including NVIDIA Corporation, ExxonMobil, and so many more. Currently, he is exploring the new possibilities of architecture in the interactive space with gaming platforms, augmented reality, and virtual reality.
- Adam Jull: Adam started IMSCAD over 7 years ago. The company has deployed hundreds of Autodesk customers on both Citrix and VMware. Over this time IMSCAD has worked closely with vendors such as Intel, NVIDIA, AMD, and Citrix to help develop solutions for moving graphical design into the cloud.
DEBRA GOSS-SEEGER: OK. You can hear me. I can hear myself, so I'm assuming you can hear me. Good morning. Thank you so much for coming today.
This is going to be our thought leadership panel on VR. Is it hype or is it reality? We have some of our top experts in the industry here to talk to you today and provide their thoughts on VR. My name is Debra Goss-Seeger, and I work for Intel. I'm in the Data Center Group, and I am the product marketing manager for workstations. What that means is I talk to our OEMs and we develop strategies to bring you workstations, specifically Xeon-based workstations, which are the ultimate workstations. Hopefully everybody is familiar. Nod of heads. If you're not, I might have to interrogate you or something.
So on our panel today, we have, and when I say your name, you can just wave your hand. We have Adam Jull, who's the founder and CEO of IMSCAD, and he brings a today-and-future perspective on how VR will impact graphical applications in virtual environments. We have David Menard, who's with Autodesk Live. He's the Autodesk Live product owner, and he brings a software and industry perspective. We have Gary Radburn, director of global workstations and virtualization at Dell, and he brings an OEM perspective. We have Scott DeWoody. He's the creative media manager for Gensler Architects, and he brings a unique customer perspective. Last but not least, we have Scott Ruppert, Portfolio and Solutions Manager for Lenovo workstations, and he brings the OEM perspective.
We're going to start with a brief introduction, brief, of our panel members. They'll do their personal introduction. And then we have some questions that I'm going to pose to them. These are some of the top questions that we've heard that you guys are asking about. And then hopefully if we have enough time, we'll have some Q&A from the audience, OK?
SCOTT DEWOODY: OK, go? So, can you guys hear me? Yep? All right. So my name's Scott DeWoody. I'm the firmwide creative media manager at Gensler, which means I oversee all of our rendering, visualization, and R&D into VR, augmented reality, and mixed reality.
GARY RADBURN: Gary Radburn. I'm director of commercial VR inside of Dell. What that means is I'm setting the strategy, listening to the customers, working out what offerings we can do out there, and how we can help you to be successful with your VR implementations moving forward.
SCOTT RUPPERT: I'm Scott Ruppert with Lenovo. I'm the Portfolio and Solutions Manager for our professional workstation line. So I focus on future technologies where that intersects our professional workstations and spend a lot of time gathering customer feedback that kind of defines where we're going with our products.
DAVID MENARD: I'm David Menard. I'm product owner for Autodesk Live, which is kind of our architectural visualization solution to bring your Revit models to VR very fast. So I oversee production and development of the software and interact with customers a lot.
ADAM JULL: I'm Adam Jull, CEO of IMSCAD. We are a global virtualization specialist, particularly for Autodesk products, but we work very much around the computer graphics and visualization space.
DEBRA GOSS-SEEGER: OK. If they cut it back on. OK, our first question. What do you think are the most compelling enterprise usages for virtual reality?
DAVID MENARD: So where I come from, whenever we say virtual reality, it resonates with gaming a lot. But the truth is that most innovation in VR will actually come from the enterprise sector. For us, obviously, architecture is our first main use case for VR. It's easy. It's accessible. But honestly, every single AEC application, manufacturing, everything that's design, has some potential to have VR integrated into that workflow. And even past production, we can see VR being applied in sales, even retail; really, any enterprise has potential.
ADAM JULL: I spoke to one of my customers yesterday, you know, a fairly large architectural firm who are looking heavily into VR, particularly for clash detection, so they can go into the model and see exactly how their design really looks in virtual reality, which is a really exciting development for them, of course.
GARY RADBURN: I can throw some numbers out there and things like that, but it's too early in the morning for that. There's a five-years-out study which looks at where everybody thinks we are today, to the point that it's all about gaming. That really resonates. It's easy to consume. You can do it on a phone. And it's very accessible today. But the market's really going to shift into the commercial space, and media and entertainment is going to form a smaller slice of the pie compared to things like health care and the commercial aspects. Just to give an example, you're probably aware, if you read the press, that the LA Auto Show is on at the moment. And we just did a big launch with Jaguar at the weekend for their new electric SUV. And we did all of that completely in virtual reality. All right? So it's the first time that VR has been used in a commercial launch like that. For the journalists coming in, it really resonated with the idea of a futuristic car on a futuristic delivery platform. And if you go out there and do a search on the Jaguar launch, you can read about that and see how VR was utilized in a new way, showing you those commercial aspects.
DEBRA GOSS-SEEGER: And our next question. Where do you see the industry going regarding graphics and remote usages with the introduction of virtual reality?
ADAM JULL: You know, from my perspective, we see a lot of customers [INAUDIBLE] this tool. I think that's right. Obviously, it's graphics. Sometimes, actually, that can cause its own issues, especially if you're not in the local environment. You can ask me, oh, will it ever run in the cloud? It's like, well, you know, not yet. It's going to be a long time till that's possible, because of, basically, the internet connections and everything else. But there is still an exciting opportunity in the preparation piece and what people can get done, so yeah. It's huge.
DEBRA GOSS-SEEGER: Next question. What is Autodesk doing with enterprise VR content creation and what are the best VR content creation apps?
DAVID MENARD: So right now, whenever you create content for VR, usually you want to go to a real-time solution, right? So there's Stingray, Unity, Unreal. All those game engines are the core tech, but then, you know, VR is mostly about content, just like anything else. We have the tech now. It's available. You can use it. But where the content comes from is really the most compelling thing. So you can obviously use 3ds Max and Maya for your classic content creation pipeline, but beyond that, we really want to push the tools that we're all using in production. Things like Revit. Things like Fusion, Inventor, even SketchUp. All of those tools will eventually be a source of content for VR.
DEBRA GOSS-SEEGER: Scott DeWoody, adding on to what David just said, what Autodesk VR applications have you evaluated, or plan to evaluate?
SCOTT DEWOODY: Currently, we use Revit as our main platform, since we do a lot of architecture and interior design. And so our users use A360 quite a bit, doing renderings right inside of Revit, because you can push those to the cloud and it's a simple flag of saying, I would like you to be a VR image, and then once it's done rendering, you get to download that and throw it into a Gear VR or a Google Cardboard and take it to the client. And so we do that a lot now. Now with the recent release of Autodesk Live, we're evaluating that pipeline to see if our projects can easily go to Stingray, if we can, you know, throw the [INAUDIBLE] on real quickly and get more immersed, because like we said, when most people think VR, it's more the immersive, more interactive game engine workload than the 360s. But the 360s get the job done really, really well.
DEBRA GOSS-SEEGER: OK. Virtual, augmented, or merged reality: which do you see as being the most successful? You've both got a smile there.
DAVID MENARD: Yeah. Honestly, I kind of see it more as a continuum. There's not going to be a specific gap between the two, eventually. I think your VR platforms will be able to do mixed reality, and your augmented reality platforms will be able to do VR as well. So it's kind of hard to define which one will actually win, and which one people will actually use to refer to all of these, but in the end, it's probably going to be a mixed reality workflow in most cases.
SCOTT DEWOODY: Yeah, I'll comment on that too. I don't think you can compare them or make the technologies compete with each other. They all have very nice, specific use cases. So, I always get, is it AR, is it VR? It's really, what are you trying to do? Virtual reality is when you want to completely remove someone and put them somewhere else that you can't normally go. Like if you want to put someone on Mars, you're going to use virtual reality. But if you want to bring the models and your design into your reality, into this physical space, then you want to use augmented reality. And mixed reality is more like augmented plus, where the models are now interacting with the actual environment more than just showcasing something on a screen. And so I don't think one's ever going to overtake the others, because you are going to need all three of them to a degree, depending on what you are doing.
SCOTT RUPPERT: Yeah, I agree. I think especially as devices become more available, we're going to see both grow tremendously. And it's really, Scott, what you said: it's what you want to do with it, right? Are you designing something completely from scratch, or are you walking through one of your new building designs where there's nothing there, so it's completely virtual? Or are you walking around a construction site where you want to see some of the existing infrastructure and then what you're going to lay over top of that? So I see great use cases for both.
GARY RADBURN: That's exactly it. It really depends on what you're trying to get out of it at the end. They all have their niches and their uses. Again, it sounds trite, but you know, this whole VR, AR, MR thing is a journey. It's not a destination. So people are looking at it and going, OK, what have we got today? Can I do that? Oh no, it's not up to the task. There's not enough graphics power. There's not enough resolution. There's not this, that. All of that is showing what's possible today, and things are only going to get better as we move forward. We're seeing it in all sorts of use cases now out there in the industry. When you start doing MR, for instance, I had this notion of oil rig workers: they're bussed out to an oil rig, and basically the people who survive get the job. Now we can actually put them into a mixed reality type environment. We can give them the outline of the oil rig. We can put valves there so that when a disaster happens, they know where the valve is, they know how hard it is to turn it, but they're doing all of that inside an MR type space. And then my ultimate goal, if I had the power to do all of that, would be a sliding scale on your headset. So I can set it completely immersive for VR, and then I can set it all the way up to completely translucent, so I can do a mixed reality type thing, depending on what I want to do and what the task is at any given time. So at the moment, it would be great to have augmented reality where I'm looking at you guys out there and I can see your Facebook profiles next to you. I can see whether people are awake or asleep, what your interests are, to make it more pertinent to you. So that would be a great tool for me sitting on this side of this microphone.
SCOTT DEWOODY: You know, it's funny that you said that, because that kind of came up yesterday at the talk I was at, and I think Google nailed it with Google Glass a couple years ago. I think they just did it a few years too early. Because that's what they were aiming to do: they wanted stuff popping up, and you could see all kinds of data in reality. And if they were to reintroduce that device now, it might be time to revisit a device like that again.
DEBRA GOSS-SEEGER: OK, since we have two of our top OEMs on our panel, this question is specifically for you guys. What are your VR product offerings and/or use cases that you're targeting?
GARY RADBURN: I'm not going to go into product offerings, because the hardware behind it is an enabler. It all depends on the content, the use case, and what you're trying to do. But we've done a couple of unique things at Dell to try and help you make decisions based on what you're trying to do. So we've got something as simple as a VR Ready web page. First off: what power of equipment do you need? What level of graphics? What level of CPU? What level of storage, et cetera, do you need so you're not going to bottleneck the system? And so we can help you make some decisions around that to give you an idea of which direction to go in, so you're not lost in the swamp.
The second thing we've got is VR centers of excellence, dotted around the world. We've got six currently and we're going to be growing those. We invite customers in, as a free-of-charge service, to actually come and play with the equipment, so you can try different levels of CPU, et cetera, to see: what do I really need? How do I not overbuy? How do I not underbuy? Because they're both as bad as one another in a commercial sense. So we can take some of that guesswork out and work with you to actually be able to do that. Plus, of course, we've got the range of hardware to go along with that. But we're trying to make the journey a lot easier for customers.
SCOTT RUPPERT: Yeah. Exactly. It's all about the solution, right? We're one of the tools. We're one piece of the puzzle. I think it's all about working with partners, you know, like Intel, the graphics vendors, all the different headset manufacturers. Making sure everything is tested, compatible, no interoperability issues, VR Ready, as we all like to say. And then helping guide you towards those solutions. One of the things I talk to folks about is we're very good at understanding specific apps and Autodesk certifications, right? So if you use AutoCAD or you use Revit, there's a specific type of system that I would recommend, a specific workstation that I know for that workflow. But as I better understand, are you doing some 3D visualization? Do you want to take that into a virtual environment? That changes the recommendation that we would make. That changes the system guidance. So like Gary said, web pages that make that guidance easier, or interpersonal, you know, direct communication is really the best, right? When we can just sit down and talk about what you want to do, and then what solutions we have to get you there.
DEBRA GOSS-SEEGER: OK. Next question. What is the multicore value prop for VR workloads?
GARY RADBURN: The great thing about the workflow is that VR is more of a destination platform, rather than being an entity of its own. Applications today are pretty much the same as the applications were when we were delivering things that weren't VR. So if your application-- Scott's already mentioned the ISV certifications. It's a key point. That's why you use workstations rather than generic hardware. All of those rules still apply. Just because we're doing VR now doesn't mean we throw the rule book out and create a new one. You're still using the same applications to create content. It's just that those assets are now being taken into a VR environment, and so consequently, if you are doing CPU rendering or whatever, you're going to need more cores. It's going to be multi-threaded. It really still depends on the application. And we've seen some great work from Autodesk in the Future of Making Things section in the demo hall. If you haven't checked it out, then definitely check that out, because they're making moves now for creation inside of VR. You can't just take the applications that have grown up monolithically over the years, put a VR front end onto them, and then say, OK, now go and do it. They've got to be created specifically for some of the challenges you have inside of VR, and how you visualize things, and how you interface with things. And I think Autodesk has made some great strides in helping people embrace VR for creation. I think we're some ways off the entire industry going that way. So in terms of the question and multicore workloads: still exactly the same challenges as you have with traditional workflows outside of VR.
SCOTT RUPPERT: All I would add, as a point of emphasis, is to really look at it as one piece of the workflow, understand the entire workflow, and then base hardware decisions around that.
DEBRA GOSS-SEEGER: So, I saved the big question for last. What are the challenges you see with VR adoption, and how does it change the new PC platform? Start with Adam?
ADAM JULL: There is a cost, perhaps. You know, I mean, I am not an OEM, right? So I'm not going to give it a free rein.
PRESENTER: Can you use his microphone and not that microphone? You guys are on the same frequency. Actually, we need to take this away from you. Can you turn on your mic? Is it on? Sorry to interrupt.
ADAM JULL: Yeah, I mean, obviously without putting a damper on it, there is a cost to pay to do this sort of stuff, and it is quite new, obviously, but there are certainly some real benefits to it. Of course, I'm not putting you off. But it's really the balance of what benefits you're getting from a business point of view, and for me, that's really the big decision you have to make, because it's all very exciting at the moment. A new world. And yeah, that would be my take on it.
DAVID MENARD: I want to add to that, actually, because cost right now is a big barrier to entry, but as costs go down and hardware becomes more accessible, the main thing I see that prevents VR adoption is that people still have questions about, how do I do it? And what's possible? People always ask, can I do this in Stingray? Can I do this in Revit? Can I do this in Max? And the answer is honestly always yes. You can probably do pretty much anything you want with the tools that are available right now. It's just a matter of diffusing that information and the workflows to be able to get there. Once that gets ingrained into their workflows, most people will probably pick it up pretty easily.
SCOTT RUPPERT: I think from the hardware guys' perspective right now, one of the biggest challenges is just the broad array of choices, right? Which software piece? Which enabling engine? Which hardware solution? Which headset? All the pieces that come together. And I think over time we'll see some of that settle down. Costs will come down. So that will be more accessible. And for now, we'll continue to strive, you know, I would probably say, both of us, to guide, you know, our customers and help make those choices. Understanding what it is. It's about, you know, inputs and outputs, right? What do you have today and what would you want to get out of it?
GARY RADBURN: From my perspective, I think it's the start of greater things to come. I mean, I get questions like, is this the next 3D television? VR is being hyped up now and everybody's getting behind it; it's the shiny ball. This is completely different. It's the content that really drives the adoption, the use cases. And we're working with people doing things that, quite frankly, astound me, right? They're using VR in certain contexts. And to give you an example of that, if we look at things like health care, we're working with partners who are doing treatment of PTSD for veterans coming back from war environments, and rehabilitation, and things of that ilk. So it really has the power to change things. And then there's also a case study in the press where there were paraplegics-- they had been paraplegic for like 10 years. They put them inside of VR rigs, and they put them inside of a chassis, convincing them that they can walk. They look down. They see the legs. And they're starting to get neurons, which have been dormant for those 10 years, starting to refire. Not full mobility by any stretch, but just showing the power of this solution inside a medical arena is huge.
And so you take that and put it into the commercial space, where we've already given a couple of case studies. That's really going to start to drive the adoption. The headsets are going to get better. The democratization of VR means content is accessible to all. Yeah, if you've got a phone and a $99 headset, you can experience VR. You can experience 360 video, you know, immersive video. And then it makes you want to get to the next step. And the next step. Essentially we've taken consumer headsets and put them into a commercial environment. What's going to happen now is, as it gets adopted, there are hopefully going to be more headsets delivered, which are going to have higher resolution. They're going to have wider field of view. They're going to have better pixel density, so that you've got a much crisper image. So going up to 4K, 8K, and whatever else to get more realistic. And of course that's going to drive the power envelope, so the hardware needs to keep pace with the way things are going. And if you look at where VR was a couple of years back compared to where it is now, the whole development curve: haptics are coming on the back of it, so you've got force feedback. I touch something in VR and I can feel if it's squishy or if it's hard. Again, the way the technology is going is going to require more and more power, and you know, I'm just hoping we can keep pace with it all.
SCOTT DEWOODY: Yeah, I don't know what else I can add to that. I mean, everyone has exactly the same issues that I've seen. I have a lot of users trying to figure out how to do what, and really, all this software you guys are using can do VR now. You think, well, can I do this in VR? Can I get it from here to there? The answer is pretty much yes. There might be an extra hoop or two to jump through, but for the most part, with the interoperability of FBX from Revit to Max, Maya, any of these content creation tools, you can take that content into Stingray or any other game engine and get up and running. So I just encourage you: get a Google Cardboard headset, take your phone, and start going. Like the A360 renderings, like I said, those are easy. If you're doing just 360 renderings, you're not doing anything more in these render engines now than you would already be doing for a still image. It's just saying, OK, now I want you to render in VR, and it'll render 360 versus just a single camera view, like you would take with a photo. So that's the easiest jumping-off point. If you haven't done any of that yet, go do it. I mean, A360 does it. V-Ray does it. Corona does it. Iray does it. Mental Ray does it. All the engines do it. So if you're using any of these applications, it really is just saying render in VR, and then throw it on your phone and you're good to go.
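To make the "360 versus a single camera view" point concrete, here is a minimal Python sketch of the idea behind an equirectangular 360 render: instead of one fixed pinhole camera, every pixel of the panorama gets its own ray direction. The function name and the Y-up convention are illustrative assumptions, not any particular renderer's API.

```python
import math

def equirect_pixel_to_direction(u, v, width, height):
    """Map pixel (u, v) of an equirectangular panorama to a unit view direction.

    u runs left to right (longitude -pi..pi), v runs top to bottom
    (latitude +pi/2..-pi/2). Y is treated as 'up' by convention here.
    """
    lon = (u + 0.5) / width * 2.0 * math.pi - math.pi   # -pi .. +pi
    lat = math.pi / 2.0 - (v + 0.5) / height * math.pi  # +pi/2 .. -pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center pixel of a 4096x2048 panorama looks almost exactly down +Z;
# a 360 render traces one such ray per pixel instead of a single camera frustum.
print(equirect_pixel_to_direction(2048, 1024, 4096, 2048))
```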
DAVID MENARD: I actually want to have another go at this one. We've talked a lot about adoption for enterprises like us, but there's the other way around, where adoption in the consumer market is also a very big issue right now, and it's just a social barrier, right? The first time you had your phone out on the street, people were kind of looking at you like you were crazy. Well, if you wear a headset on the street right now, you're going to look really, really crazy. So there's definitely that aspect that has to be broken down. And you know, Snapchat came out with Spectacles last week, and they're not VR, they're not AR, but you've seen people willing to wear these bulky devices on their face in public, and I think that's actually a step in the right direction. Even though I wouldn't use them myself, it's good to see. It's really good to see.
DEBRA GOSS-SEEGER: Who in the audience is actually using VR in their workplace? That's really good. I'll put in a plug for our booths downstairs. We have Dell, Lenovo, and Intel with some really good, interactive VR displays. You definitely want to go and check those out.
GARY RADBURN: Can we just turn it around? For some of the people who put their hand up, anybody who's not shy, could you share how you're using VR inside your business?
AUDIENCE: I have a question. Real-time game engines have such a high bar to clear for suspension of disbelief, right? Lighting and textures, and the renderers are still catching up, while, you know, offline render solutions have made it more real. Are we going to see some sort of a clash between these two worlds, the real-time game engines and the VR visualization pipeline, versus the traditional renderers being able to render utter realism?
DAVID MENARD: Definitely there's going to be some kind of clash, because when you're in VR, the main value is all about interaction, right? And when you want to go for interaction, you have to go for a game engine. You want to be able to manipulate the objects in VR. Obviously, despite it being a real-time solution, there's a lot of pre-computing going into those scenes. Just the light baking is pretty much the same workflow: you send it to the cloud, render for hours and hours, it sends back the textures, and then you read those in real time. So although there is a clash there, there's still a big use case for offline rendering. Obviously, if you want to push photoreal, for now you'll still have to go render in the cloud with those tools too.
SCOTT DEWOODY: Don't be afraid, though, not to go photoreal in a game engine. We use VR across the entire design process. In early schematic design phases, we don't do materials or a lot of lights and things like that. So we keep things light; we call them light models. Everything is just essentially white. You may have a darker gray to add some contrast to read the space better, but that sells just as well as photorealism does. And it's actually really beneficial to the designer, when they're working, before it even gets to the client, to go, oh, I didn't think about looking at the space like this. Let me change this now. Let me change that now. And you don't need the super photorealism for that. And we've done some interesting mockups using the HTC Vive, having people walk around, and it's nothing but white. And that has sold it. So.
DAVID MENARD: That's actually really interesting. When we first talked to clients with Live, that was one of the first things they always told us: I don't want textures, because it kind of distracts from the architecture and the idea I want to push. Just give me white. Give me white with a couple of, you know, so it doesn't look washed out. Just a rendering style. But you don't have to race for photoreal at all.
SCOTT DEWOODY: Don't be afraid not to go photoreal if you don't want to or don't need to. It doesn't matter.
GARY RADBURN: There's a different value benefit as well. Photoreal matters when you're looking at something on a monitor or whatever else. And then when you get immersed, you're getting a different experience. There are different senses involved, and so your needs and requirements might be different in that space. I mean, again, the Jaguar example that just came up: the interior designers apparently are completely separate from the exterior designers. You wanted to get towards photoreal so you could see the stitching in the seats and whatever for the interior, right? But it depends on what you're trying to do in any given space. So when we were going through the visualization process of a new car, they wanted to know how much headroom they had in the rear of the vehicle, so that they could design features in there and not make people too cramped in the back. You don't need photoreal for that. I just need to be able to see it. I need to be able to model it and say, oh, I can't do this and this. Then when you get to your final, final, final: let's now do it photorealistic, we've got the time to do that. It depends where you are in the cycle as to what your needs and wants are.
DEBRA GOSS-SEEGER: Of the folks that are actually using VR, any more questions? We have a couple more from our real life users.
AUDIENCE: I have a question for David. That was a very useful workflow, where you actually have Stingray and Revit integrated. My question is, you can do this even right now, right, using Revit and a Unity plug-in? How does the process simplify what exists and can be done now? What is the differentiation?
DAVID MENARD: So right now, whenever you want to go into VR, like you said, you can use Unity, you can use Stingray or Unreal, and the workflow is very similar. You export to FBX. You bring your geometry into the game engine. It comes with all of the geometry, so a lot of triangles. It doesn't come with your materials. It doesn't come with your lights. There's a lot of manual work that has to be done, and it's very tedious, and it's very repetitive. Live is trying to eliminate that type of work to leave more space for the creative process. So we try and optimize the scene for you, so you don't have to deal with your polycount. We bring in materials. We place the lights. We actually separate your mesh so that you can edit each object individually, instead of having one big object in your scene. That way it's easier to optimize afterwards. It's easier to test. Easier to bring into Max and iterate on small pieces of your scene, and put the focus in the right place.
AUDIENCE: So it's like production inside.
DAVID MENARD: Definitely. It's really to prevent you from having to do those repetitive tasks, because we don't want to make any kind of creative decisions for you. We want to take away the work that we know you're going to have to do, that you don't want to do.
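As a rough illustration of the repetitive cleanup David describes doing by hand after a plain FBX import (and which Live tries to automate), here is a hedged Python sketch: splitting one merged import into individually editable objects, assigning stand-in materials by category, and budgeting triangles for real time. The data structures, category names, and material names are invented for illustration; this is not the Revit, FBX, or Stingray API.

```python
from dataclasses import dataclass

@dataclass
class ImportedElement:
    name: str            # e.g. "Door_1042"
    category: str        # e.g. "Doors", "Walls", "Furniture"
    triangle_count: int

@dataclass
class SceneObject:
    element: ImportedElement
    material: str
    target_triangles: int

# Stand-in materials per Revit-style category (placeholder values).
DEFAULT_MATERIALS = {"Walls": "matte_white", "Doors": "wood_generic",
                     "Furniture": "fabric_gray"}

def prepare_scene(elements, triangle_budget):
    """Split into individually editable objects and scale each one's polygon
    count proportionally so the whole scene fits a real-time budget."""
    total = sum(e.triangle_count for e in elements) or 1
    scene = []
    for e in elements:
        share = int(triangle_budget * e.triangle_count / total)
        scene.append(SceneObject(
            element=e,
            material=DEFAULT_MATERIALS.get(e.category, "matte_white"),
            target_triangles=min(e.triangle_count, share)))
    return scene

elements = [ImportedElement("Wall_001", "Walls", 120_000),
            ImportedElement("Door_1042", "Doors", 45_000),
            ImportedElement("Chair_07", "Furniture", 300_000)]
for obj in prepare_scene(elements, triangle_budget=200_000):
    print(obj.element.name, obj.material, obj.target_triangles)
```

The point is only that each of these steps is mechanical, which is why a tool can do them for you and leave the creative decisions alone.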
AUDIENCE: I have one more question, about collaboration. I had a chance to understand a little bit about collaboration on a large scale. That was really powerful, [INAUDIBLE] being actually immersed in the scene, with the object in the scene, and actually having multiple people there. You know, is it possible to actually have that at full scale, like walking people through?
DAVID MENARD: Yeah definitely. Definitely possible. We talked about this. It's always yes, right? The answer is always yes.
SCOTT RUPPERT: Yes, yes.
DAVID MENARD: I think a lot of these guys will have a lot to say on this as well.
SCOTT RUPPERT: I know that's one of them. I mean, there's a lot more to come, but that's one of the great benefits of the technology too: exactly what you describe. Being able to be immersed in that environment together, multiple people, multiple users. And you don't actually all even have to be in the same physical location, right? I mean, I think we were talking about this this morning. You've got a great example of it.
GARY RADBURN: Yes. Thanks for the lead, I'll give you the money later.
SCOTT RUPPERT: There you go.
GARY RADBURN: So I might be mentioning it again. I might mention it a couple of times, but I was really stoked this weekend at this launch, and just to let you in on some of the inner workings of it: we actually had 66 people inside a collaborative VR environment. So as part of the launch, we had various pods of six users around the place; we had one of them in London, and the rest of them were in LA. There was a scene inside of there. I mean, I work in this day in, day out, and I was blown away, because all of a sudden it faded to black and you see everybody else's headsets and everybody else's controllers around the world. So all 66 headsets and people interacting, waving at each other across the way. There was actually live audio going through, then video was beamed in. You have your own individual experience with the car, which you could rotate in your separate experience, but still in this collaborative environment. Absolutely huge. So, to the point, the answer is yes.
SCOTT RUPPERT: Yeah, the sky's the limit. And I will say, like you mentioned, there are several really good interactive demos downstairs, inside the exhibit hall and outside in the hall around it, that are definitely worth your time to check out.
DAVID MENARD: And actually, from a more technical standpoint, there's nothing different to do in VR to make collaboration happen than what's already been done in gaming. You know, multiplayer games have existed for a while now, and there's really no, or barely any, overhead to actually bring this to VR.
SCOTT DEWOODY: Yeah, just to follow up on that. Anything you've seen a video game do, you can do. These engines are built to make these things. And so, even looking at Unreal: the latest Batman game was done with Unreal and all that. So the tools are now available to you, and the code is there. And it is multiplayer. At Gensler, we even set up collaborative VR, did a multiplayer server, and got people in, and it works pretty easily. So.
AUDIENCE: Thank you. My question is, as the market expands and the opportunity to move into VR and AR grows, is Autodesk looking at actually optimizing the geometry inside of Revit so it streamlines the process of going from Revit into any one of the other gaming engines, so you're already exporting out optimized geometry? Because any of you that have worked in VR know that the geometry that comes out of Revit is just nasty.
[LAUGHING]
AUDIENCE: And that's why there is such a critical need for that in the same process so that we can go more effectively from Revit, into the game engine, into VR. So are there any plans inside of Autodesk to optimize Revit geometry?
DAVID MENARD: That was kind of the premise of Live at the beginning, because the first time we exported to FBX, we saw that. It was huge. Even a small house had tens of millions of polygons, and we needed a way to optimize that. So we can brute force it. You can use automated tools to decimate the geometry. The other way around it, obviously, is that since Revit is parametric, it's basically the equivalent of procedural. The API actually gives you really great control over the kind of detail you want to export. And to push it even further, that's when you want to start leveraging gaming technology to actually be able to manage that complexity. Stuff like LODs: when you're closer, there's higher resolution; when you're further away, it's lower resolution. You have to leverage those workflows and start familiarizing yourselves with what games are doing, because they've been tackling performance issues for a long time.
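As a small illustration of the distance-based LOD idea David mentions, here is a minimal Python sketch: closer objects get the full-resolution mesh, distant ones a more decimated one. The thresholds and the integer LOD index are illustrative assumptions, not Stingray's actual LOD system.

```python
def select_lod(distance_m, lod_distances=(5.0, 15.0, 40.0)):
    """Return an LOD index: 0 = full detail, len(lod_distances) = lowest detail."""
    for lod, threshold in enumerate(lod_distances):
        if distance_m <= threshold:
            return lod
    return len(lod_distances)

# Example: a chair viewed from 2 m uses LOD 0 (full mesh); the same chair across
# an atrium at 60 m drops to LOD 3 (the most decimated mesh).
for d in (2.0, 12.0, 60.0):
    print(d, "m ->", "LOD", select_lod(d))
```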
DEBRA GOSS-SEEGER: Any more questions? Back here?
AUDIENCE: Hi. So we heard a lot about computer-generated content. Leica and Autodesk did an amazing product launch last night with a new scanner. Do you have any comments on reality capture devices as a contribution to the art, and what sort of further innovation needs to take place to make those more ubiquitous? Because it was kind of billed as an aggressively priced product, but it's still $16,000, so it's not that cheap.
SCOTT DEWOODY: I haven't done much work with reality capture, though it is interesting, because we do some product design of our own at Gensler, and we're also repping Herman Miller furniture. I mean, we spec all kinds of things for design, and so if we have physical objects and we need a model of them, then yes, reality capture using photogrammetry or something along those lines is ideal. But we haven't quite found how that optimally fits into our workflow, or if we truly need it in design, because usually we're creating something that doesn't exist. So taking something like this and putting it in kind of feels like the why-do-a-rendering-if-I-can-take-a-photo-of-it mentality, so I don't really know. It's interesting. It's something I'm passionate about because I'm just a CG artist and I like that kind of tech, but looking at it from an actual architectural workflow, I'm not entirely sure.
DAVID MENARD: Well, it relates a lot to what you mentioned before, where you have to find the problem before trying to map a specific tech to it. For us, reality capture has its use cases, but at least for now, in my context, there might be use cases for it, for example, to put your building in context. That could be a great use case for it. But until we see people really wanting to go that way, there's no point in trying to explore it too much. But you know, there are a lot of solutions that let you do it for VR as well. Like ReMake, ReCap, you know, Project Memento is a great way to do it. It decimates your mesh to the point that you can actually bring it into VR.
AUDIENCE: And there are also different capture approaches, and again, it really comes down to that use model. All the way down from capturing through photos and just saying, I'm going to do that, now put that into my engine, and it's going to map a 3D object. I can do that. It's obvious. But you have RealSense cameras where you can now do depth perception [INAUDIBLE]. It's just about how much data do you want to bring into it? How much time do you have? And how real do you really need it? It almost comes back to the photorealistic settings and things like that. How good is good enough? And how much time do you have for these different solutions?
SCOTT DEWOODY: So a good example of that is NVIDIA's new headquarters, which we designed in Santa Clara. We worked with them, and they flew drones over the construction site every day and scanned it, put it into point clouds. And then they threw those into VR, and we were doing construction site reviews in VR via point clouds early on. And they then created a time lapse. I don't know if the demo is here or not; I can't comment on that. But they did a scan every day, and it was cool because, with the time lapse, you can see from existing conditions, to the site being demolished, to the hole being dug, and everything being constructed around it. It's pretty cool.
AUDIENCE: [INAUDIBLE]
AUDIENCE: So a couple of weeks ago, I read an article that was talking about users, and goggles, and how much time you actually spend wearing them. And when you take them off, sort of how it can bring you out of balance and stuff. And in particular, I think the article had mentioned like Navy pilots who were using them for training. Is that something that you guys have come up against and is that something that you think will be an issue specifically maybe for people who are working at your firms and doing production at all?
GARY RADBURN: I think there have got to be more studies put into it. I mean, as a for instance, we've got a lot of educational customers that want to use it, so they really see the benefit of teaching people about new places. Taking them to Mars, taking them to foreign locations, putting them inside a headset and having a different learning experience. One of the things is that if a picture paints a thousand words, VR paints a million.
All right, so in cases like that, yeah, we can really help people there. But then what's the effect? So again, there are recommendations that under the age of 13, you should limit your time in there or not use it at all. It's a case of, let's be sensible about what we're doing while we're in there, and continue to look at it, and make sure that we're giving recommendations so that we're not going to create problems in the future. Because it is a brave new world, and it's one of these things that's inside the imagination of so many people. You can tell them anything you want; they're still going to go and do it. You have gaming classifications that say, you can't play this game unless you're 18 or over. All right? And yet, there are still children of 11 years old that are playing that game.
So you can give guidelines and recommendations; whether they are enforced or not, again, we'll see as we go forward. But we're certainly very mindful of the effects, because as I mentioned earlier, with the PTSD work and the medical ramifications, if you can use it for good in that sense, then it can be used for bad in a different sense. So again, we've got to work with the appropriate organizations to say, yeah, we're doing the right thing, and we're making the right innovations moving forward.
DAVID MENARD: So time dilation is a real thing in VR, by the way. It's been studied. It's proven that time warps when you're in VR. And it comes down to the design and your intention again. For us, we intentionally design things so that you don't want to spend more than 10 minutes in VR, because most people after 10, 15 minutes get dizzy and start feeling nauseous, despite your best efforts, right? We've explored stuff like having the sun slowly set, so that when it sets, you're like, oh, I've got to get out now. These little triggers, right? That's also why Daydream put a clock in their VR experiences, so that you always have that hint of, OK, 15 minutes have passed, maybe you should take a break.
SCOTT RUPPERT: Yeah, I can speak to that. I got Google's Tilt Brush the other day. Has anyone played with that? Yeah, so it's painting in VR, essentially, and I spent an hour and a half in it and didn't even realize it. And everyone's like, are you going to go back to work, Scott, or are you just going to sit there and play all day? And I'm like, oh no, I'll get back to work now. And it's true: I thought I was in there for only about 20 minutes, but I was in there for an hour and a half.
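A toy Python sketch of the in-experience time cue David describes (Daydream's clock, or a sun that slowly sets): the application tracks elapsed session time and surfaces gentle prompts, since people routinely lose track of time in VR. The timings, cue names, and function are illustrative assumptions, not any shipping SDK.

```python
import time

def session_cues(start_time, now, warn_after_s=10 * 60, break_after_s=15 * 60):
    """Return the cues the experience should show at this moment."""
    elapsed = now - start_time
    cues = []
    if elapsed >= warn_after_s:
        cues.append("begin_sunset")          # dim the sky as a soft hint
    if elapsed >= break_after_s:
        cues.append("show_break_reminder")   # explicit 'take a break' card
    return cues

start = time.monotonic()
# Simulate a check 16 minutes into the session: both cues fire.
print(session_cues(start, start + 16 * 60))
```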
AUDIENCE: So, last week I was at a conference here, and the keynote was shared between three people. One was an architect. He got up and made a pretty provocative statement. He said that because of mixed reality, there's a possibility that flat screens and computer monitors will be extinct within five years, because everything will be an immersive experience. I'm wondering if the panelists share that view.
DAVID MENARD: Yeah, definitely. In my mind, one of the first kind of ironic things that I see in the future is you put your HoloLens on, and instead of a screen, you just have kind of a plastic thing where you want to put a screen. I think we're definitely going to go there, once it gets comfortable to wear for a full day. There's no reason to have a screen anymore, right? You can have as many screens as you want. So it is a bold statement. I think it holds true.
SCOTT DEWOODY: It's definitely going to-- I might look at it as, you know how print is now with the internet, right? You can still go get a newspaper if you want, but the internet has taken over; digital content has really taken over. And so it'll probably be the same thing. We'll probably still have displays. It may be a marked-out area, so that when you put your augmented reality on, the display just shows up there. But there's a good chance TV screens and monitors will not be our primary device in the future.
SCOTT RUPPERT: Five years seems aggressive, but.
SCOTT DEWOODY: Yeah, it does.
GARY RADBURN: Speaking as the number one monitor manufacturer, I do have a slightly different view.
SCOTT RUPPERT: I do have some Dell monitors.
GARY RADBURN: It's exactly right, what Scott was saying. There's room for everything, just because there's a brave new world out there and it's coming, and people will want to consume in a different way. There's always been that option of, you know, I'm a traditionalist. I can read a Kindle or I can read a book. I prefer the smell of a book, the feel of a book, turning the pages of a book, right? Rather than, oh yeah, I can read it on the screen. I do screens all day, every day, and I still go back to reading a book. In much the same way, I think everything is going to have its place. It's going to depend on the usage model, the comfort factor, the accessibility. Do you need something on your face all the time, or do you want to take that weight off your head, with everything that goes along with it? So, room for everything. And five years, to Scott's point, is a very aggressive timeline. I wouldn't see that happening, and I also wouldn't see monitors disappearing completely, right? I would say a mix of that.
DAVID MENARD: That's probably a bit more of a realistic view. Honestly, has anyone heard of Magic Leap? Yeah, that's it. Some of the partners of Magic Leap have eliminated screens entirely in their offices. So some people have gone there. I don't know how efficient that is, or how productive it is yet, but some people have made that leap.
SCOTT DEWOODY: I can tell you, using the HoloLens right now and trying to do email in it, or web surfing, that kind of interaction is not built for augmented reality. I think it's a really, really awesome device, but the user interface needs to be looked at, because it's still very desktoppy, where you have to click here, click here, click here. It needs to be, let me grab this, because everyone's first intention when you put on a HoloLens is to actually grab the hologram and move it. And that's not how it works.
DEBRA GOSS-SEEGER: Well they do it on TV!
SCOTT DEWOODY: They did, yeah! Those promo videos are great!
DAVID MENARD: In reality, you're like this, and you click and--
SCOTT DEWOODY: Yeah, it's a cool device, it really is. It's an awesome development piece to see where we're going with augmented reality, but they've definitely got a long road ahead. They're doing it, though. So that's going to be exciting.
DEBRA GOSS-SEEGER: I'm a Star Trek fan, so I just love it when you all talk like that. In my lifetime, I'm hoping.
AUDIENCE: Object recognition. I think you guys talked a little bit about it. Like now, OK, Autodesk Live and some other things have more information about the model, but many of the VR engines that are out there just let you bump into doors; you can't get to the handle to open the doors, or when you are walking up the steps, you don't know that-- it's not the same thing. So how difficult or easy is it going to be in the near term to get that kind of more or less real experience in there?
DAVID MENARD: We were talking earlier about the workflows and what the advantage of using Live or Stingray is, and that's exactly it. Because we own Revit, we know what every single object is, and we can start adding interactions to it. We already do it with doors, for example: we recognize doors, we open them for you, we animate them. We're going to start doing this with pretty much everything we can. Light switches; we already use furniture, for example, to control navigation a bit more. And we want to have that data accessible in Stingray and in pretty much every single tool you have, so that you can use it and leverage it yourself in the way you want to.
AUDIENCE: Maybe that's a cool marketing thing for Live as well. Because there are so many other engines that I saw, and I can tell you, it's not fun. For the first time, if you're going in, it makes sense. But if you really want to make use of it, that's a very cool thing about Live.
GARY RADBURN: Just another aspect to that, just to fit, like, [INAUDIBLE], but VR arcades and things are now starting to spring up. And that's where you're actually taking your design and putting it into real life, so that if you bump into a wall in VR, you will actually hit a wall. All right? So the room is mapped out the same as it is inside your headset, with positional tracking. And they use it for firefighters and things like that, so rather than throw somebody into a burning building and say, get out of that one, you can actually put on your VR headset, you've got a map of the house and the room, and they go through that whole scene, flames and all. You can put heat sources down in strategic places within the model, so that when they get close to one, they can feel the heat change, even though they're still immersed inside of VR. OK, so taking it from another approach, we can really mix reality inside of VR with external influences.
DAVID MENARD: And to bridge that gap, actually: right now, when you use a game engine, you have simulated physics, and it can go crazy. You've seen those videos of stuff flying around. There's actually a demo next to the show floor downstairs using Fusion data inside Stingray. It's really interesting, because it comes down to creating real objects, putting them in the virtual space, and having them interact properly, as they would, as you designed them for real life. So that is a great use case for it, right? And as the platform and the market expand, you're going to start to have the need for, I don't know how to put this, real virtual objects? You get the idea, right? There's a lamp on my desk. I need to be able to grab it and have it interact properly, not with simulated physics.
AUDIENCE: Yeah, but how difficult is it? Because I think [INAUDIBLE]
DAVID MENARD: Yeah, different levels of interactivity obviously have different difficulty levels. Simple stuff like identifying chairs that are already tagged in Revit: super easy. Adding interaction is another step. And adding real hinges and, you know, collision detection is another step after that, and we're always going to work towards that. Towards making it easier, obviously.
AUDIENCE: So how dependent is the industry for virtual reality and augmented reality, as a whole, on the gaming industry's applications? God forbid they should fail and it goes out of style very fast; how affected are all of you?
DAVID MENARD: I think at this point, gaming could fail in VR and it would still be around because of the commercial, enterprise use for it. For architects, the second I put them in VR for the first time, it's a holy expletive, and then, buy me 10 of these headsets and do this on every project. That's not going to go away, because it is what designers do. They're making, we're making spaces, and we want to be in them and feel like we're in them like never before, and it's an amazing communication tool for clients, because they just get it when they see it. So for architecture and design at least, it's not going anywhere. It's here to stay.
GARY RADBURN: [INAUDIBLE]
DEBRA GOSS-SEEGER: Now that we're all awake, right?
GARY RADBURN: The coffee's kicked in. We're doing numbers. [INAUDIBLE] In that same five-years-out study, they reckon media and entertainment can be about 11% of the market [INAUDIBLE]. Today it's a huge amount of the market, I'm going to make up a number, but they reckon it's going to go to 11%. Commercial is going to form about 35%, health care about 18%, education about 9%. So you can see that the media and entertainment side of things becomes a much smaller part of that pie, which goes to Scott's point here. People want it in commercial. People are going to drive it in commercial, and quite frankly, commercial is going to be the largest market for those engine manufacturers. So even if gaming were to fail, and people say, yeah, I'm moving on to the next thing, because people are fickle like that in the entertainment market, in the commercial space we really see the value and the benefits to drive it forward.
DEBRA GOSS-SEEGER: Any last questions?
AUDIENCE: We've actually, through my company, seen a major advantage in VR by using it for external visualization instead of internal. We're developing huge entertainment complexes, where we'll take our artistic sculptures and put them into our VR environment, so when people are standing in a restaurant looking out the window, they see their surrounding atmosphere. It's not looking at the facility from the inside; it's looking at the outside from the inside. One of the other things that we just pushed recently, my company is very well known for their Halloween experience, we used an immersive VR where, when you walked up to a table, you were given the directive to put the controllers down, and then through a visual interface they told you to go and grab something off the table, and there was a physical element. You want to watch people jump? Put them in a virtual environment and have them reach out and grab something and it's really there. So we're using both sides of VR, beyond just the goggles.
DAVID MENARD: That's fantastic.
DEBRA GOSS-SEEGER: So I'm going to give a couple of minutes for you guys to come up here and [INAUDIBLE] ask any questions. But I want to thank our panel, our illustrious thought leadership panel-- thank you guys for coming. And I'll put in a shameless plug for Intel's, Dell's, and Lenovo's trade booths downstairs. Again, we have some really good interactive VR demos, so make sure you go down there and check those out. So unless we have any final comments, that's it. Our panel members are up here, so come on up here and [INAUDIBLE].
[APPLAUSE]