Description
Key Learnings
- New innovations in graphics and VR technologies
- Latest advancements in HP Multi Jet Fusion 3D printing technology
- Graphics and performance tuning for Autodesk software
- New workstation form factors and technologies
Speakers
- Sean Young: Sean is responsible for global AEC and MFG segments, and strategic alliances with Autodesk and other CAD software vendors. Previously, Sean was at Autodesk, where he was a product manager for Viz, 3ds Max, and Showcase. Sean holds an MBA from Queen’s University in Canada, and he resides in Fort Collins, Colorado.
LOUIS GAIOT: So thank you very much for joining us. We have a dual-company presentation here from NVIDIA and HP.
And from the NVIDIA team, we have David Weinstein, who is the director of VR technologies at NVIDIA. And we also have Andrew Rink, marketing strategist for NVIDIA, as well. My name is Louis Gaiot. I am a product manager for immersive solutions on professional platforms, our workstation platforms, both the mobile and desktop side.
We're going to cover technology as it exists at this point in time and what the near future looks like for technology. We'll look at some use cases for VR, as well, and some solutions that both our companies are bringing to market in terms of addressing some of these opportunities that we see. So enjoy. Our first speaker is David Weinstein.
DAVID WEINSTEIN: Thanks, Louis. All right, I'm just going to make sure that this works. Thank you. All right, great. Sorry we got off to a bit of a late start. I'll try to go a little bit faster to get us back on time here. So as Louis said, I'm the director of Pro VR at NVIDIA. So I focus on all of our professional use cases, not so much on the gaming side.
But I like to start off by introducing this slide. And what I like to point out about this is that, a few months ago, when I very first started using it, this was science fiction. The idea that you could put on a head-mounted display and see all sorts of fantastic things that weren't really there, that was sci-fi. And, increasingly, every day, we're seeing more and more examples-- demos, cool content that's coming out, things that are starting to look more and more like this.
But the point that I'll be trying to make over the next dozen or so slides is that this isn't magic. What makes this work is, actually, some very cool, very advanced technology. And let's look under the hood and demystify a little bit what goes into making VR work.
So before I jump into that meat, though, I just want to make sure that we're all on the same page in terms of terminology. So we get this question a lot when we're out talking about virtual reality and augmented reality and what's this mixed-reality thing.
So virtual reality is any time that you're using a head-mounted display that completely replaces the real world with what's showing up on this little screen that's right in front of your head. That is virtual reality. All the content is synthetic. You're immersed in this rendered, this make-believe world.
Augmented reality is best exemplified, maybe, by this picture here, where I'm using the see-through camera in order to see the world that's in front of me. And then I'm adding some supplemental information on top of it. So this is sort of like a heads-up display. This is information content that's being added into the real world.
And then the lines blur a little bit at mixed reality. So at mixed reality, we're putting together synthetic content seamlessly into our real-world environment. And this can be presented within a head-mounted display or more of an augmented reality see-through screen. But the idea is that these two components are seamlessly being blended together. And that's our definition for mixed reality.
So there's two phases of virtual reality. And I'm going to spend the rest of my time focused on the VR side of the world. On the one hand, there's what we call personal entertainment VR. And this is the stuff that's been making headlines for the last 12 months or so. This is games and movies and all sorts of really cool, entertaining things that are coming that you will get to participate in as a consumer. And we love this stuff.
But what I'm going to be talking about today is on the pro side. So this is the stuff that you do at work every day. This is the stuff that I'm thinking about, all the professional applications. And, in the interest of time, I'm not going to go through all of these. But hopefully some of you are starting to think about virtual reality in your space. And you're starting to see early indicators of where and how it's going to be used.
So I get the question a lot: I'm a professional, I go to work every day, I have a computer in front of me, am I going to be using virtual reality? And the answer is yes, ultimately, you probably will. But if your job is to look at Excel spreadsheets all day long, it's going to be a little while before VR really has anything for you. But if you're on the design side, if you're building things that are human scale, then VR is immediately going to be useful to you.
And there's really three main reasons for that. On the one hand, things in VR are at scale. So when you're looking at stuff on your monitor, everything is shrunk down. And it doesn't really matter whether your monitor is a 24-inch monitor or a little tablet screen; it's just taking up a third or a quarter or whatever fraction of your display.
But when you put on that head-mounted display, everything is scaled to the real world. A car appears car-sized in front of you. A building appears building-sized.
And that scale is important, as you know, on the design side. In order to get the ergonomics right-- in order for a surgical suite to feel the right size, so that a surgeon can sit down and go, yeah this kind of feels about right, I can reach my instruments-- it's important to have everything at scale.
The second one is really more of a promise of where we're heading with VR. And I'll talk about this in the coming slides. It's about natural interaction. So these game controllers that we're currently using for our VR experiences are very cool, great technology in them. But it's not a natural way to interact with our world. We want to use our hands, our fingers. We want to be able to naturally interact with the world. And that's coming. I'll talk a little bit about it.
And the third one is on the collaboration front. So if part of your job is that you are collaborating with other people-- and maybe they're in your same office with you or maybe they are across the country or across the world-- probably the greatest promise of VR is that we're going to be able to collaborate together in a way that's just not feasible today. Teleconferencing is great, but it doesn't feel like you're in the same room with someone.
OK, so some of you in the audience have probably been using VR for a very, very long time. And it probably looked a lot like the thing on the left. We have these CAVEs, these immersive display walls. This was what everybody meant by virtual reality up until about two years ago.
And then something happened. And, all of a sudden, virtual reality is available to everybody. For $1,000 you can go and be in virtual reality on your own personal HMD. And what I'm going to be talking about over the next few slides is what happened. It's not that somebody just woke up one day and said, hey, let's make VR affordable and work for everybody.
So let's talk about the technology. So I'm thinking of it in terms of this roadmap. There's the technology that's available today. You can order this on Amazon. It will show up at your doorstep. There's the technology that's coming soon. This is stuff that people are working on in R&D labs. You're seeing some early examples of it in places like THE VOID. Stuff that's coming.
And then there's the direction of where we're all heading. And that's how I'm going to spend the balance of this talk: going through this technology roadmap, where we're at and where we're heading.
So there's three-- I'm going to argue that there's three key technologies that came together in order to make the VR experiences that we have today. And the first of them has to do with displays.
So, again, VR isn't brand new, but it used to cost $100,000 to have that HMD on your head. And most of us couldn't afford it. But if you were at NASA and you built it yourself from scratch, you got one of those. And there's Ivan Sutherland on the far right, early photo from, I think, 1970, 1969, something like that, with the first head-mounted display, very heavy. It was called the Sword of Damocles, 'cause if that fell, that would kill you.
So it's gotten lighter. And it's also, obviously, gotten cheaper and more available. So, in the interest of time, I won't go through these. You all know this story. But the important thing to point out is that the enabling technology that made it happen is this: having a two or three megapixel display that you can put right here is what made it possible for VR displays to happen. So we leveraged the cell phone revolution, the smartphones, in order to make VR displays.
One of the main challenges associated with VR has to do with rendering, has to do with how quickly can I generate content. And, again, those of us who tried VR 10 or even five years ago, it was not a great experience. You'd do it for about three minutes. You'd get slightly seasick, and you would take off the HMD and promise yourself that you wouldn't try again for five years.
That's gotten better. And it wasn't by accident. We, basically, needed the GPUs, the things that NVIDIA makes, to be able to crank through enough polygons and enough pixels per second in order to generate a good VR experience.
And I won't go through all the math on here, except to point out that, relative to a traditional desktop display, we need to be pushing through about seven times as many pixels per second. Some of that is resolution-- now we have two displays. And the other part is that we have to hit 90 frames per second in order for VR to be a good experience.
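To make that arithmetic concrete, here is a rough back-of-the-envelope sketch. The numbers are illustrative assumptions (a 1080p desktop scene at 30 frames per second as the baseline, and a Vive-class headset with two 1080x1200 eye buffers, each rendered about 1.4x larger per axis to compensate for lens distortion); the exact multiple depends on the baseline you pick.

```python
# Illustrative pixel-throughput comparison (assumed numbers, not specs).
desktop_px_per_sec = 1920 * 1080 * 30  # 1080p desktop scene at 30 fps

# Two eye buffers, each supersampled ~1.4x per axis before lens
# distortion, refreshed at 90 Hz.
eye_w, eye_h, supersample, fps = 1080, 1200, 1.4, 90
vr_px_per_sec = 2 * (eye_w * supersample) * (eye_h * supersample) * fps

print(f"desktop: {desktop_px_per_sec / 1e6:.0f} Mpixels/s")
print(f"VR:      {vr_px_per_sec / 1e6:.0f} Mpixels/s")
print(f"ratio:   {vr_px_per_sec / desktop_px_per_sec:.1f}x")  # ~7x
```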
The other thing that you have to do for VR to be a good experience is that you have to have very, very low latency. And this one is actually the thing that causes seasickness. If I start to move my head and then the world swims behind me 40 or 50 milliseconds later, people don't know exactly what's wrong and why things don't quite feel right, but they start to get seasick. You start to get motion sickness from it.
So it turns out that we have about 20 milliseconds. And some people will argue that this is 15 or 10 milliseconds. But it's certainly not more than 20 milliseconds from the time that you start to move your head to the time that we generate-- that there's a new frame that you're looking at.
And that's not just the GPU doing its rendering. That's also this display being updated, and the motion trackers giving that updated information to the GPU. So it's a lot of stuff that has to happen in 20 milliseconds. But the GPUs are finally able to do that.
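As a sketch of how tight that budget is, here is a toy motion-to-photon accounting. The stage timings are invented for illustration, not measured figures; the point is simply that tracking, rendering, and display scan-out all have to fit inside the same ~20 ms window.

```python
# Hypothetical motion-to-photon budget check (all timings assumed).
BUDGET_MS = 20.0

stages = {
    "head tracking + pose transmit": 2.0,
    "CPU frame setup":               3.0,
    "GPU render, both eyes":         9.0,  # 90 Hz leaves only ~11.1 ms per frame
    "display scan-out":              5.0,
}

total = sum(stages.values())
for name, ms in stages.items():
    print(f"{name:31s} {ms:4.1f} ms")
print(f"{'total':31s} {total:4.1f} ms "
      f"({'OK' if total <= BUDGET_MS else 'over budget'})")
```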
And this is NVIDIA's latest GPU: our Pascal lineup on the pro side, the Quadro side-- the P5000 and P6000. This really started with our previous generation, Maxwell. Those were our very first VR-ready GPUs. So those are two of the pieces: the first one was the displays, and the second one was the GPUs that had to be driving the displays fast enough.
And the third part is tracking. So, as I move, something needs to notice that I'm moving and then update the display, so that I'm fooled into thinking that I'm actually seeing a world out there as I'm moving around. So it turns out this is really hard to do. But some clever people came up with a very good solution.
This is the solution inside of the-- so Valve is the partner with HTC. So they made the HTC Vive together. This is Valve's Lighthouse solution. So when you go have a VR experience, if it's with the HTC Vive, there's two lighthouses that you set up in the corners of the room, and those things are what's enabling the tracking.
And I have a short video here that kind of shows you how this works. This is one of the lighthouses busted open. And the key piece of technology in here-- maybe I have a laser pointer-- is this light here. And there's actually strobe lights out here. So maybe this video just plays. Oh, it does, great.
So 60 times a second, this is what's happening. So this is dramatically slowed down. These strobes flash, and then a vertical line is scanned through the environment. They flash again, and then a horizontal line goes up through the environment.
So these lighthouses are pure emitters. They're not listening. All they're doing is talking-- talking, talking, talking, like my children. And what's happening is that the HMD is looking for, is listening for, this light. So this is all happening in the infrared. And based on when it sees that horizontal sweep and the vertical sweep, it can figure out where it is in the environment.
So you need two of them, so you can triangulate the position. That's the magic of how your controllers are tracked and how your head-mounted display is tracked. And it's incredibly robust. There's also some accelerometers in there. But it just works. And it works with incredible accuracy.
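Here is a simplified 2D sketch of the idea, not Valve's actual implementation: the time between the sync flash and the sweep hitting a photodiode gives a bearing angle from each base station, and two bearings from known positions pin down the sensor.

```python
import math

SWEEP_HZ = 60.0  # assumed: one full sweep cycle per station, 60x a second

def bearing(t_sync: float, t_hit: float) -> float:
    """Convert sweep timing into the angle at which the laser hit us."""
    return 2.0 * math.pi * SWEEP_HZ * (t_hit - t_sync)

def triangulate(p1, a1, p2, a2):
    """Intersect a ray from p1 at angle a1 with a ray from p2 at angle a2."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # zero only if rays are parallel
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Station 1 at one corner: a sweep hit ~2.94 ms after sync is ~63.4 degrees.
a1 = bearing(0.0, math.atan2(2, 1) / (2 * math.pi * SWEEP_HZ))
# Station 2 in the opposite corner of a 4 m x 4 m room sees the same sensor.
a2 = math.atan2(2 - 4, 1 - 4)
print(triangulate((0, 0), a1, (4, 4), a2))  # ~(1.0, 2.0)
```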
That's how we do what we call outside-in tracking. So that's when I'm in a room that has some trackers around it.
LOUIS GAIOT: Quick time check.
DAVID WEINSTEIN: OK, I'll go faster. The other way to do VR tracking is called inside-out tracking. And we, as humans, actually do inside-out tracking. So with inside-out tracking, I'm standing here, I have cameras that are looking out, and when I move, it looks like the world moved. And so, based on that, I know that, obviously, I have moved.
So inside-out tracking is what comes with Project Tango. So this is Google's project. They've put this into smartphones and tablets. And the acronym that everybody uses for this is SLAM, which is simultaneous localization and mapping.
So it's sending out depth-sensing light into the space. Based on time of flight, it figures out how far away things are. And if you move closer to those things, it knows that you've moved in that direction.
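The physics behind that is one line: light goes out and comes back, so the distance is the speed of light times half the round-trip time. A minimal sketch, with an illustrative reading:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Time-of-flight ranging: the light travels out and back, so halve it."""
    return C * round_trip_seconds / 2.0

# A return ~13.3 nanoseconds later means the surface is about 2 m away.
print(f"{tof_distance(13.3e-9):.2f} m")
```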
So in all likelihood, this is the solution that we will all be using for tracking five years from now. It's not practical to set up lighthouses everywhere that we're going to want to have virtual reality. But lighthouses work great for now. And for dedicated spaces, we'll probably continue to have them.
But as we want to have VR and AR experiences moving around the world, you need something like this that you can just take with you, that's built into your display device. It's just an outward facing camera instead. So that's where we're at today.
So when you have your VR experience down on the exhibition floor, which I recommend that all of you sign up for and go do some of those, you're getting the benefits of fantastic display technology, awesome rendering technology, and this really, really fast tracking that can keep up with where you're moving.
What's coming is some of these things, incorporating more elements of realism, bringing haptic feedback, and also bringing audio into our virtual experiences. So let's go really quickly through those now.
So this is actually a rendered scene. It looks real but it's rendered with NVIDIA's Iray software. Recently we taught Iray, which is our physically based renderer, how to save things out in a VR format. And Andrew's going to talk a little bit more about that.
But the point here is that when I go out and I have a VR experience, increasingly, I want it to be realistic. And I think that many of you do as well. And many of your customers do as well. And so there are technologies, there are solutions that are coming online, that are going to make this increasingly interactive.
But the other part of that is that while this is beautiful visually, you also want a comparable audio experience. I don't want to have approximated audio in my environment. And by approximated audio, I mean something like directional audio, where I know audio is coming from that direction 'cause when I turn my head, it sounds louder in my left ear. I want something that does physically accurate audio in the same way that I want my visuals to look physically accurate.
And it turns out you can use the same framework, the same underlying engine, to drive audio as you do to drive photons of light. And so NVIDIA has built a solution. It's called our VRWorks Audio, which does full-audio propagation.
So you have a sound source in a room, it bounces the audio off of all the objects, off of all the materials, and you get a physically based audio experience. So if you're standing in an elevator, versus in a large music hall, versus in a hallway, acoustically, you can tell the difference. So that's the audio part that's on its way.
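To illustrate the idea (this is a textbook toy, not the VRWorks Audio API): treat each wall reflection as a mirrored copy of the sound source, so the room geometry itself determines when, and how loudly, each echo arrives. Room sizes and the absorption value below are assumed.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
ABSORPTION = 0.3        # assumed fraction of energy each wall soaks up

def arrivals(src, listener, room_w, room_h):
    """Direct path plus four first-order wall reflections (image sources)."""
    sx, sy = src
    images = [
        (sx, sy),                          # direct sound
        (-sx, sy), (2 * room_w - sx, sy),  # left and right walls
        (sx, -sy), (sx, 2 * room_h - sy),  # front and back walls
    ]
    out = []
    for i, (ix, iy) in enumerate(images):
        dist = math.hypot(ix - listener[0], iy - listener[1])
        delay_ms = 1000.0 * dist / SPEED_OF_SOUND
        energy = (1.0 - ABSORPTION if i else 1.0) / dist ** 2
        out.append((round(delay_ms, 1), round(energy, 4)))
    return sorted(out)

# An elevator-sized room vs. a hall: the echoes spread out in time,
# which is what your ear reads as "a bigger space".
for w, h in [(2.0, 2.0), (30.0, 20.0)]:
    print(f"{w}x{h} m room:", arrivals((1.0, 1.0), (1.5, 1.5), w, h))
```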
Haptics is really important. People, again, want to be able to touch and interact with things in their environment. Haptics is what makes that possible. There's two parts to it. One is detecting when I touch something-- so, collision detection.
And the second one is modeling the force feedback. So when I touch it, what happens? Does the object deform? Does it push force back on my finger? And what is the texture of that surface? So there's a whole suite of devices coming soon that are going to be able to provide us force feedback inside of our virtual worlds.
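A minimal sketch of the standard penalty-based approach (a generic textbook technique, not any particular device's SDK): once collision detection reports penetration, a spring-damper force pushes back in proportion to how deep and how fast you pressed in. The stiffness and damping values are assumed.

```python
STIFFNESS = 800.0  # N/m, assumed virtual surface stiffness
DAMPING = 2.0      # N*s/m, assumed damping term

def haptic_force(finger_pos: float, finger_vel: float, surface: float = 0.0) -> float:
    """Force to send to the haptic device along one axis."""
    penetration = surface - finger_pos  # > 0 means the finger is inside
    if penetration <= 0.0:
        return 0.0                      # collision detection says: no contact
    return STIFFNESS * penetration - DAMPING * finger_vel

# Finger 5 mm into the surface, still moving inward at 0.1 m/s:
print(f"{haptic_force(-0.005, -0.1):.2f} N")  # spring and damper both push back
```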
This is NVIDIA's library, called PhysX, that can do that same force feedback simulation. It can be useful for driving dynamics inside of your virtual world. So we don't want static virtual worlds. We want virtual worlds where things can break or water can flow or fire can spread. And these simulation libraries can bring that to you.
OK, so I talked about some of the NVIDIA component technologies as I was going through what's coming up. This is a more complete list. In the interest of time, obviously, I didn't go through these. But if you're interested in them, you can Google search on NVIDIA VRWorks and read all about NVIDIA's VR technologies.
So, lastly, really briefly, just a couple of words about where we're heading. So Hollywood has done a really great job prepping us for where we're heading. This is what VR is most likely going to end up looking like, because a lot of the people who are building the VR experiences went and saw these movies and thought it was really cool. And that's what we're building now.
And that's probably not a bad thing. The user interface, the user experience, was pretty well thought through. The thing to point out is that there aren't keyboards and mice in any of those movies. And there's a reason for that. It's a really terrible experience inside of VR to have to flip up your HMD and go find a mouse or keyboard.
And so, again, natural interactions are going to be key here. And it's more than just using your fingers. It's things like speech and touch and gaze. So all of that is coming. And it's going to make for much, much better VR experiences with natural interaction.
And the last thing that I wanted to mention is that people think about and are building tools for how we can do design work in VR. I want to design a car, great. I'll put on my head-mounted display. Things are at scale. We're building things that we're used to using in the real world.
But a lot of us are going to-- all of us, probably, are going to increasingly spend time in VR. And we should be designing for VR. We should be designing interesting scenes and worlds and products and concepts that can live in these virtual environments. All right, with that, I'm going to hand it over to Andrew.
ANDREW RINK: Thanks, Dave. That's a really great look at the technology that's enabling VR. Some of you might be sitting here going, yeah, I've seen those movies. It's all in the future. This is not something that's really about to happen.
But this is an article from a paper last month, and what I want to point out is that Google's launched their own HMD here. And it's only $79. So everyone is going to be able to afford this. It's not just those far more sophisticated HMDs like the Vive and Oculus Rift that are out there. So I wanted to throw that out there.
I'm going to talk a little bit about two technology trends that-- Dave's already spoken a little bit about VR, but photorealism in design is another trend that's really reinventing reality for enterprises out there. And I think it's really going to change the way we do business in the coming years.
So, of course, GPUs have been used to accelerate design workflows for many years. Of course, when it comes to photorealism, there's a lot more demand for graphics power. And now, with the adoption of 4K displays growing because the prices are coming down-- we've got a pretty good-sized crowd here-- I'm interested to know how many of you are using a 4K monitor today in your work? Can you raise your hand, please.
Actually, so that's not very many. That's probably, maybe, 10%. That's interesting.
So we're all here to learn at AU. I'm here to learn. And so I'm curious to know, maybe you can help me out here, I'm curious to know, those of you who are not using 4K at the moment, what's the main reason? Is it because your IT department or your company just hasn't got around to refreshing your monitors? So raise your hand if you think that's the case. OK, OK, OK, all right.
So I was going to say raise your hand if price is your biggest concern. Yeah, OK, a few people. And then raise your hand if you think, I don't really understand if there's any productivity benefits to moving to a 4K monitor. OK, one or two. OK, interesting, that's great. Thank you. I appreciate that feedback.
I'm sorry, I slightly sidetracked myself. But I'm here to learn. So I wanted to get a sense of that.
So I think many of you are using photorealism today in your design workflows, your design visualization. But many of you likely are not, because the software has been complicated. And CPU rendering takes a long time.
So I'm here to remind you, or perhaps inform you, that that's changing now. With the combination of Quadro and Iray, we now have the ability to have an interactive, photoreal, rendered experience. So while you're working on your models, with Iray and Quadro power together, you can actually make design changes to the materials, for example, no matter what model you're working on, and instantly see them photoreal in the viewport.
So that's part of the beauty of Iray. So there are a couple of things. So Iray is very easy to use. That's one of the key things to keep in mind. You don't need to be a lighting expert. Because it's physically based, it's reproducing the natural world as light bounces around in real life.
So it's a simulation of your model. It's not just a photoreal picture. So this is a really useful tool for designers and engineers, because they get predictive results when they make a change. Let's say you're working on this drill, and you want to see what it would look like with a red plastic instead of the yellow. You can make that change in your model and instantly see, with Iray in the viewport, how it's going to look in real life.
So it's predictive results. So here we have a photograph side by side with an Iray render of a CAD model. And it's difficult to tell which is which. So take a moment, have a look at those. And then decide in your mind, which one do you think is the render. And I'm going to show you now.
OK, so the idea, of course, is not to fool people with how brilliant and how accurate this is. It's to give designers and engineers a tool to really see, when they make design changes, what it's going to look like in real life. And the purpose, of course, is if you're able to see those changes as you're working on your model, iteratively, instantly, you can iterate more quickly and, ultimately, optimize your design.
So here's a similar view for the building designers in here. So while I'm speaking, take a look at these two and, again, make the comparison in your mind. And see which one you think is the photograph and which one is the CAD model.
But clearly, having this ability to have interactive, physically based rendering early in your design process enables you to avoid a lot of surprises you might get, as well as pick up on errors much earlier in the design process, which, ultimately, leads to fewer physical prototypes.
So here we go with the big reveal. Yes, you're all right, exactly correct. That was the one.
So I mentioned earlier that Quadro and Iray are combining to do this. So since I last spoke about interactive, physically based rendering at AU a year ago, Iray's been natively integrated into many of the CAD tools that designers are using out there.
In fact, the ones I'm probably not supposed to mention here at AU: SOLIDWORKS, CATIA, Siemens NX. So that means there are thousands of engineers and designers who are now able to get this capability while they're working on their models.
And the other thing I would mention is that NVIDIA offers-- sells-- plugins for 3ds Max, Maya, and Rhino. Of course, Iray's in 3ds Max natively already. Maybe some of you use Rhino for industrial design. So that means that there are millions of designers who can use these plugins to get interactive, physically based rendering as part of their design workflow.
So this is one of the key trends that's impacting photorealism in design. And there's one other tool from NVIDIA, just to help you understand what's available out there: Iray Server, a distributed rendering software. And this enables you to harness the power of all the Quadro-equipped workstations on your local network. So there are two modes that can be used.
One is batch, or queued, rendering. When you've finished your design, and you want to see what the final version is going to look like as a final-frame render, you can connect the machine that you're working on to all of the workstations on the network and very, very quickly produce that final-frame render.
The other mode, of course, is interactive streaming, where you connect your machine to one other machine on your network-- usually the most powerful machine-- and you get the power of that GPU to accelerate the rendering in your viewport. So you have that interactive experience I was describing earlier.
OK, Dave mentioned this. We were very excited about this. Iray has come to VR. So now, the capability exists to have a photoreal, stereoscopic panorama for VR viewing. This is really cool stuff, massively computationally intensive. But this is really the future where we're heading.
And speaking of people who like the future, our CEO Jensen Huang showcased this at GTC. GTC is NVIDIA's annual user conference. Has anyone been to GTC here? Oh, gosh, wow, you guys are missing out. This is a fantastic event. We get about 6,000 people in Silicon Valley. They come to hang out with us.
And we actually have a product and building design track for folks like you, who get to hear a lot of really innovative companies talk about how they're using GPUs and VR and photorealism. All the leading-edge stuff that's happening gets talked about there. So if you can make it, it's in May in Silicon Valley. That would be great. Love to see you all there.
And so our CEO showcased Iray VR during his keynote. We're in the midst of building a new headquarters in Santa Clara. And the architects were tasked with designing a building that maximized the use of natural light, so we wouldn't have to switch on the lights during the day, to make it as efficient as possible.
And so the architects were Gensler. And they did a fantastic job in using Iray to position the skylights in the structure. And there's 5,000 lights inside the building. So they were able to use Iray to make a lot of design decisions to optimize the design. It's come out really great.
Of course, the building is going up right now. I don't think it's going to be finished for another several months. But it's really taking shape. Anyway, Jensen, he likes to look at all this future stuff.
OK, Dave touched on some of the key benefits of VR. So I'm going to double-click on a couple of them. And the first one is the sense of scale that you get with VR. It's just not possible to replicate that on a computer monitor, even if you're looking at a large 4K display, notwithstanding powerwalls and that sort of thing.
So when you're able to see your model in life size, it really can help your design decision making. Often, designers are working on large objects-- planes, trains, automobiles, manufacturing plants, buildings, all that sort of thing. So to be able to see that life size and move around it is a powerful capability.
And then when it comes to design reviews, of course, it's the same thing. When you're able to sit inside, be inside, immersively inside the object that you're working on or a building walk-through, this really does help improve your design reviews.
And then outside of the actual product design or building design part of the workflow, there's use cases for VR with facility planning. You can imagine, if you're setting up your assembly lines in a manufacturing plant or moving industrial equipment around, to see it at life size and scale is a powerful tool.
And the second thing is the flexible viewpoint. So having the ability to view the CAD models from every direction and every angle, that really does help accelerate your design workflows.
And the other thing we hear about is that immersion in the model, it helps non-experts, so clients, executives, people who don't know about these things. It helps them get a more intuitive understanding of what they're looking at.
The other one I like to call out is what are called digital rehearsals. And so this is where someone who's working on, let's say, doing some service work on a piece of equipment is able to do a virtual practice run, if you like, prior to doing the actual work. Or the use case I mentioned before about placing the assembly lines in a factory. So all that sort of thing can be done virtually, digitally, prior to actually doing the action. And that saves a lot of time and cost.
And then Dave talked a little bit about, we're using VR loosely here to encompass sort of AR and MR, but AR, arguably, may be the fastest growing part of this area, this technology, if you like. And it brings a lot of benefits when you're able to overlay digital information like numbers and schematics in front of a real-world environment.
So, for instance, looking at components inside products: if you've got an IoT-connected device, you could pull up digital gauges and check on the temperature of the piece of equipment that you're working on, or the fluid levels, and see if there's been a variance in temperature to make good estimates about failure rates. So there are a lot of possibilities with AR as well.
And then, finally, the more efficient collaboration. Today, with globally dispersed project teams-- I'm sure you're all members of those kind of teams-- you've got somebody working in Japan, somebody working in the US, for example, and they're able to put on the head-mounted display and together, collaboratively, look at a model. And then using virtual flags, they can annotate design changes that they want to call out. So it becomes a really effective way to collaborate in that common immersive environment.
And naturally, that does mean that this gives users the ability to avoid having to travel to a physical location to see a model when they can have that immersive experience from wherever they are. So that, obviously, reduces costs and time.
So the last thing I'll touch on in this particular case is this idea of virtually monitoring variance while you're doing building construction. So you can see the difference in, let's say, the timeline or the structural elements between plan and actual.
And to illustrate that, I'm going to pull up this view of-- this is the headquarters of NVIDIA that's being built that I mentioned earlier. So here we have the view of the VR user from the head-mounted display. You can see the handheld controllers in front. So that gray triangular shape is our building. It's roughly in the shape of a pixel. And you're looking down on it.
So this is photogrammetry, taken with a drone flying over it to capture a point cloud. And as you come down to ground level, you can enter into the building; there's a LiDAR scan here of the interior space to create a point cloud. And so you're able to see if you're progressing with construction the way you anticipated against the timeline, and see what sort of variances might be in the structure. So it's another powerful way of using VR.
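As a hedged sketch of that plan-versus-actual idea (illustrative only, not HP's or NVIDIA's actual pipeline): compare the scanned as-built point cloud against points sampled from the design model, and flag anything that sits outside tolerance.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Stand-ins: points sampled from the design model, and a scan of the
# as-built structure with ~1 cm noise plus one misplaced element.
plan = rng.uniform(0.0, 10.0, size=(5000, 3))
as_built = plan + rng.normal(0.0, 0.01, plan.shape)
as_built[:50] += 0.5  # a deliberately out-of-place element

# Distance from each scan point to its nearest design-model point.
dist, _ = cKDTree(plan).query(as_built)

TOLERANCE_M = 0.05
flagged = dist > TOLERANCE_M
print(f"{flagged.sum()} of {len(as_built)} scan points deviate "
      f"more than {TOLERANCE_M * 100:.0f} cm from plan")
```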
OK, I've spoken mostly about design. But there are other elements. I've touched on a couple of these, like the facility planning. And the last thing I really want to call out is the virtual showrooms. This is definitely becoming much more prevalent.
A lot of-- I wouldn't say a lot, but there are several leading automakers now that are using virtual reality to give their potential customers a better buying experience. So you can imagine, if they don't have every single model on the lot, the customer could come in, put on the head-mounted display, sit inside the vehicle virtually, look around, see what the sightlines are like, maybe from the rear seat. They can, of course, change the seat fabrics or leather and see what it really looks like. Step outside the vehicle, look at it from all sorts of angles. Change the wheel options, see how they look. Change the paint job.
So really, it's a terrific approach. Some of these leading-edge companies-- actually, I mentioned GTC, and Audi came to GTC this year and spoke about their virtual showroom. So they're experts at this. I'm not an expert in it-- I mean, I know quite a lot about it-- but to hear them talk about it is fascinating stuff. And it's really working for them.
OK, I have a feeling that we're not going to have time to look at this, but I do want to just call this out, because this is something terrific. I've left the URL here.
So Project Soane was a rendering contest-- actually, it's a bit more than that-- that HP and NVIDIA sponsored last year. Basically, several hundred architects around the world recreated the Bank of England as it was designed by Sir John Soane at the turn of the 18th century.
And then that digital BIM model was used as a data set to do rendering. So we ran this contest. And this company, an architecture firm in the UK, used VR to do a light study of the interior of the building.
And I would encourage you, if you're interested in that kind of thing, to go to the Project Soane website and check out this video, because they captured in virtual reality, in 15-minute increments, how the light changed inside the building on the winter solstice in London. So it's not a very long day-- I think it gets dark at about 4 o'clock in the afternoon. And you can see a realistic view of how the light affects the interior of the building. It's actually really fascinating work.
And speaking of projects, if you haven't got your own Google Cardboard, we're giving these away at the HP booth for free. So you can come and grab your Project Soane Google Cardboard, and then you can put your phone in and check out all kinds of really, really cool scenes that are out there.
All right, thanks. I'm going to hand it over to Louis to wrap up. You've got a few minutes to do your section. Thanks.
LOUIS GAIOT: All right. Thank you very much, Andrew. So, as you see here, both David and Andrew laid out some really compelling use cases for virtual reality and the technologies that are coming together to make it all happen. And HP, as a company, we are also involved-- oops, wrong way-- in bringing those solutions to you folks.
You need platforms to drive those solutions for you. And so, at HP, we are working very closely with industry leaders like NVIDIA to bring their building block technologies and integrate them into our platforms to bring you-- and power your realities. We like that tagline. HP can power your realities.
So I'm going the wrong way again. So how are we doing this? If you come down to our booth, you'll see a number of workstation-related products that we have there, both on the mobile side and on the desktop side. And we have built our reputation on three main pillars that we want to bring to you as our customers, as our professional customers.
Those three main pillars being reliability of the platform, bringing the innovation that matters to you guys. Not any innovation, but the innovation that matters to you. And then the performance required to deliver the experience.
So what are we doing in the area of virtual reality at this point in time? As I mentioned, we have our desktop line and our mobile line. Today, on the desktop side, we are proud to support the Maxwell-based, VR-ready GPUs in our platforms-- the M5000 and the M6000-- and you can find them on a number of our desktop platforms.
But what's really exciting is the next generation GPUs coming from NVIDIA, which will be both in our updated mobile series, the mobile workstation G4 series, coming in March, and also the refreshed desktops that we will have supporting the P5000 and P6000 Pascal-based GPUs. And those are VR-ready GPUs.
On the mobile side, we will have our first VR-ready mobile workstation, the HP ZBook 17 with the P5000 GPU. That will be a fully mobile, VR-ready platform available for your use. So look for these things that are happening.
You can power multiple GPUs in the top-end systems like the Z840. We have that in the booth, so come on down and take a look at it. In fact, the demo that we have is running Iray with dual M6000s in the platform. So it's a pretty impressive setup.
What else are we doing? Well, we are testing this concept. This is a VR backpack. It was first introduced by our friends over on the Omen team. And it's a concept product-- it's not in production. And we have that in the booth for you to experience.
We are running a Project Soane demo in virtual reality with this backpack. It's an excellent demo to give you an idea, especially if you're in the architectural world. It gives you an instant understanding of the power of VR, to be able to walk around in that virtual space and then to be able to look at all the features, the ceilings, the lighting, the floor, and the textures that are rendered in that environment, pretty compelling stuff.
And then to be able to teleport to other parts of the building and experience that, and walk around, and do it without your headset being tethered to some desktop by a 15-foot cord. So you're not tethered with this experience. The entire computing is done in what we call the pill. And the headset is attached to the pill and powered by the pill. And it's on a harness. So come on down.
I am actually there in the demo booth for this particular concept product. I would love to get your feedback on the experiences that you have there. One other thing I will say about this backpack product: there is going to be a revamped Omen product coming out the middle of next year.
We've gotten great feedback, both on the professional workflow side and also on the consumer side, for redesigning things like where the batteries are, what kind of harness, and the ergonomics of the harness. So the product that's coming out from the Omen team mid-next year is going to be awesome in terms of ergonomics and other capabilities that it will bring.
Following that consumer product, we are planning the release of a professional product as well, for you to be able to do the same kind of work in your professional environments. Now, we see this as being very effective, not only in the design phase, where you can have people and executives review designs, but also for the customer. So you can have the customer walk through a building and sign off on it, or suggest changes, before it actually goes into the building phase. And it's also an excellent vehicle for training and simulations, without having to be tied to a desktop by a 15-foot cord.
So check it out, come on down. We also have in the booth another virtual reality display technology. We call it the Zvr. It's in production and available now.
It uses passive tracking technology in the glasses there. The display itself emits infrared signals that track your head as you wear those passive glasses, and it presents the 3D image on the display for you. You can manipulate the 3D image using a control pen. You can move it in, move it out, break it apart, and view your model or your building that way using that product.
So we've got some exciting stuff there and some things for you to see. Come on down to the HP booth. We've got the NVIDIA folks down there, too, demoing Iray and some other GPU technologies that they have there. And I'll just open it up for questions at this time for any of us.
Yes.
AUDIENCE: [INAUDIBLE]
ANDREW RINK: The question is whether Iray VR is pre-rendered. And the answer is yes. It's a massive computational requirement. So at this point-- in the future, it probably won't be, but at this time-- they are pre-rendered, yeah. Yup.
AUDIENCE: Iray Server, is that Quadro?
ANDREW RINK: The question is, is Iray Server Quadro-only? And the answer, happily, is no. You can use it with other technology, too.
OK, I think we're going to wrap it up there then. Thanks very much, appreciate you all attending.