Description
Key Learnings
- Learn the possibilities of Project Play
- Learn from analyzing amazing examples of product configurators, sales pitches, instruction guides, etc.
- Learn the basics of this node-based editor to create rich VR and web interactive experiences
- Learn how to present designs and products made with Fusion360, Maya, AutoCAD, Revit, etc. in the browser
Speakers
- Tatjana Dzambazova: Product manager of Autodesk ReMake (formerly known as Memento), employee no. 1 and one of the founders of the Consumer group and the 123D product line, and previous product manager for Revit, 123D Make, and 123D Design. Project lead and producer of the Smithsonian-Autodesk collaborative project, www.3d.si.edu. Technology whispering: FoST talk: http://tinyurl.com/q64pxsg; TEDx talk: http://tinyurl.com/kctwevw. A trained architect with 12 years of practice in Vienna and London, perennially fascinated by how technology enables a more creative life, she decided to continue her career in the digital design world and has now been with Autodesk for 15 years. A staunch believer that we are all born with a sense of creativity and the need to express it, she has been telling stories and leading product development teams that transform powerful algorithms into tools accessible to a wide range of people. Tanja has written books, studied 8 languages, and acted in theater and film.
- Nop Jiarathanakul: Nop Jiarathanakul loves making things with 3D graphics. He graduated from the University of Pennsylvania with a bachelor's degree and a master's degree in computer graphics, and spent the early years of his career at games and animation studios such as Electronic Arts, Inc. (EA Games) and DreamWorks. Nop came to Autodesk, Inc., for his passion for WebGL, a technology that enables him to combine his love for the web and 3D and bring the power of desktop graphics to the web. He has been using WebGL since its early days in 2010 and has been keeping up with the community ever since. Nop is now working on the Web Viewer team as a graphics engineer.
- Michael Staub: Software engineer at Autodesk. Originally from Boston, Michael has experience in many domains of computer graphics. He worked on GPU drivers at AMD, animation and visual effects at Blue Sky Studios and Lucasfilm, and now a new WebGL platform at Autodesk. He is passionate about the intersection of art and science, and has a keen interest in emerging technologies such as virtual and augmented reality. Michael is an advocate of open-source software, environmentalism, and jazz music. www.michaelstaub.com
PRESENTER: Welcome. We really appreciate that you made the time at 8 o'clock in the morning in Vegas. That's usually really hard.
So, this session is called Show, Tell, and Sell! And we are three equally-important presenters. Actually, I'll just make the intro. And these two rock stars will show you the product live.
My name is Tanja. I have been at Autodesk for 16 years, usually in product-management roles: as product manager of Revit, of the 123D line of products, and of ReMake. And with me are Michael and Nop. They have been with the company for one or two years-- the next generation, young, kick-ass developers. They've been working on what we're going to show you today, which is Project Play.
So what is Play? It's really a new way to make interactive presentations and experiences for web, mobile, and VR without programming. You've seen all those beautiful configurators on the web and immersive experiences. They usually are hard-coded and very difficult to do.
So, the agenda today is we have 90 minutes. We'll cover how we arrived at Project Play, what it is, and what you can do with it. Then we'll analyze five or six examples that have been done with Play. Then Nop and Michael will dive deep into the project and show you what it is.
We'll talk about workflows with media entertainment files, with Fusion360 files, with Revit files. We'll talk about templates and galleries, how to create VR experiences, and what's next for Play.
If you have been coming to AU for a long time, you know we always show this slide, a legal disclaimer. Project Play is not yet a product. And for every technology preview, we always say what the legal language really says: do not make purchasing decisions based on what you see today. We don't commit that this will become a product the way it looks now. But we definitely are committed to working on this.
OK, so let's get started. We have [INAUDIBLE] a third cable. And are we recording?
PRESENTER: Yes.
PRESENTER: OK, good. So the session is recorded just for those who are taking pictures.
So, how did we start with Play? A couple of years ago, we started a collaboration with the Smithsonian. For those of you who don't know, the Smithsonian is the biggest collection of museums in the world. They have 19 museums, 9 research centers, 1 zoo, and are spread across 100 countries.
The Smithsonian has a collection of 154 million objects-- objects made by man, objects coming from nature, beautiful collections. But as you can see from some of these photos, this is not in the museum. This is in the archive under the museum. They simply do not have space to show 99% of their collections. So if you go to a museum, all you see is 1%.
So they realized that, with the rise of sensors, laser scanning, structured-light sensing, and photogrammetry, they can digitize the collection and be able to get it--
PRESENTER: Let's bring it back. Yup, we need two seconds.
PRESENTER: Oh, no problem. No problem. OK. It's here. [INAUDIBLE]
PRESENTER: OK. OK. I'm good. Thank you.
PRESENTER: OK. This will be hard.
So they realized that by digitizing the museum artifacts, they can put them online and give them to the entire world-- not only to those who come to museums, but to those who can never make it to New York, or DC, or wherever the museums are. And they started testing with micro-CT scanners, laser scanners, structured light, and photogrammetry. At the time they started, there were not many experts in the world to even ask. So they learned it the hard way. And they finally figured out how to make really high-quality 3D digital replicas.
So then, they came to us and said, you guys have Maya, and Max, and all these phenomenal rendering technologies. Can you help us prepare these artifacts in a way that we can share them with the world? And together we came up with the concept of a web tool that requires no installation, that can run from anywhere in the world, from any machine, and that allows anybody to access their collection, and play with it, and interact with it online.
And the project is called Smithsonian X 3D. It is a pilot showing what museums of the future can look like. We picked about 20 iconic objects, from as small as a bee to as big as a supernova remnant-- objects with different topologies and characteristics for digitization. And we tried to put them in an environment that allows anyone in the world to experience them.
So, here is a preview of Project Play, of Smithsonian X 3D. Basically, we wanted the models to look as beautiful as in real life and not as in Second Life. So you can see it's a high-quality visualization. And it's better on my screen than on the projectors. Unfortunately, they're not very good.
So we tried to, number one, allow users to interact in 3D online and to have split views between orthographic and 3D. To the curators, we gave tools to combine their data-- images, texts, videos, audio, and archival materials. To teachers, we gave the possibility to teach with it: instead of asking the kids to open a book, they open the model from the Smithsonian and say, how tall was the mammoth leg? Or to make some art-- as I always say, a sticker for your T-shirt with the mammoth, et cetera.
One other requirement was for the scientists to provide different visual methods for research and then, also, to be able to start sharing with other scientists-- instead of sharing the models, just sharing through hyperlinks and HTML-embed codes of the objects-- but also, to the general public, who can interact and share themselves. And then, we made a bunch of tools that are more specific for the scientists, like slicing and profiling, et cetera.
When we met the curator of the Wright Flyer, he had spent his entire life on this object. And he had so many materials-- videos, images, photos, drawings. So he wanted to combine them all, and also combine a CAD model with the reality-captured plane, show them at the same time, and be able to actually look at the machine. We enabled hot-spots and hot-zones that allow additional information to be added to the model.
So, this was a pilot. We looked at higher-quality visualization, various visual modes, object comparison, in-browser measurement, slicing, sectioning, profiling, defining hot-zones and hot-spots, and allowing the curators to speak the 21st-century language, which is to create interactive tools and share everything that they did.
The success was tremendous. The Smithsonian got so many hits and so many requests from other museums: we want to have this. What was really interesting is that we got requests from so many customers like you-- architects, hotel chains, civil engineers, industrial designers. They were like, we need this. We need this for internal communication. We need to sell our products in a better way, to tell the stories, et cetera.
And, as you know, in our 30-year history, we have always been listening and trying to give you tools to render and to share. At a time when there were only desktop solutions, we were making phenomenal rendering solutions. We were making desktop tools, like 3ds Max, VIZ, Maya. And then-- I don't know how many of you have been with Autodesk for a long time, but if the names Autodesk View, Volo View, DWF TrueView, DWF Composer ring a bell-- we had over 150 viewers over the years. As technology was improving, we were making new ones.
And as the cloud became a plausible solution, we started to make online galleries. We made cloud rendering that runs online and a cloud-rendering gallery. We made product online galleries, where you can share your projects as experiences through the LMV viewer. And, as of last year, we have a unified Autodesk gallery, because we realized many of you use several of our products and want to show views of the same project from different software in one context.
But you always want more. So, even though we have today BIM 360, and InfraWorks, and LMV, et cetera, we see on the galleries that you're trying to do videos, and add sounds, and this, and that.
You want more. We heard that you're asking for multimedia presentations, for multiple models being interacted with at the same time. You want beautiful presentations. It doesn't matter if it's Media & Entertainment, Fusion, Revit, et cetera.
Then, often you need to connect to databases from your marketing team or from other databases that you have. You want to personalize the experience. Not to look the same, but you want your logo, et cetera. And then, as of recently, everybody is crazy about VR experiences. But they are always difficult to make.
So, listening to all of this and seeing the success of what we did with the Smithsonian, we continued developing that platform. And that is what we call today Project Play. So, Project Play is a web-native, in-browser tool, or editor. It's node-based. It allows you to put any type of digital media in it, and combine it, mix it, and create interactive experiences-- all without programming. The experiences that you make are immediately available on mobile, on the web, and for VR.
So, what can you do with Play? The screen's a little bit chopped off. You can do web art or data visualization. You can do a product showcase or a Kickstarter campaign. You can show reality-captured models and even compare them simultaneously in the browser, or even superimposed over each other. You can make configurators for products, product experiences for buildings, et cetera, and manuals and instructions.
So this is how it works. Play is online. It's in the cloud. There is no installation. You go on a website. And you arrive in Play.
And it allows you to drag and drop, or load, different types of digital media. Then you can start placing it and arranging it in a scene. Then start defining relationships and interactions between those models-- and it can be 2D, not just 3D: audio, video, 2D, et cetera. And then, you can publish and share online.
In terms of the types of data we support, you can import audio, video, drawings, photos, 3D models, panorama images, and scans. That is static input.
And then, we have dynamic input. You can actually connect with external events from a website. So, this is not just a viewer where everything happens inside the view. You can drive what happens in the view from your website. And we will show you examples. You can connect with web APIs, like Google Weather, or Google Maps, or any API that is available on the web, and with reactive data streams from Google Docs, Excel spreadsheets on the web, Microsoft, et cetera.
And all of that, once you create it, you author in real time. So you immediately see what others will see. And with one click, it's publishable immediately, and experienceable on mobile, in the browser, and in VR.
So let's take a look at some examples. I'll sit down. I will share with you, at the end, the websites and where you need to go to see all of that.
So this is one example. Oops. Let me just refresh the page. This is all live online, so depending on the internet it will be fast or not. I'll try to expand it.
So this is one example done with Play. So, imagine you're a Fusion designer. And you want to, online, on your Kickstarter campaign, explain why your product is good, how it works, what's so specific about it, and let people buy it. So this is the 3D model. If your model was articulated in CAD, you can articulate it here. You can show how it works, add some sounds.
It's searching for radio stations. So you can actually really connect to APIs of radios and find radio stations. You can simulate however it works.
And then, for example, you can build these simple UIs where you say, oh, I want the customers to be able to change the color. Or I want them to discover how did I do it. Maybe it was made out of recycled materials, et cetera. So you can create these tags.
And then, for example, you can personalize. This is, let's say, for our AU friends. So whatever your product does, you can basically simulate that.
And finally, let's say, you've persuaded everybody to buy it. They want to buy it. You say, I want to buy it, but I want to buy the white version. And as you can see, it updated with an extra price for the white version because it's so special. So basically, you can do this without programming, without being a web developer, et cetera.
Let's take a look at other examples. For the next one, our friend Adam Chin and I were thinking of a next-generation Polaroid camera that, of course, doesn't exist. This is our Kickstarter campaign. And in this one, we wanted to create a Polaroid camera that actually allows you to take--
OK, let me just close. One thing, when you test, do not open many things at the same time, especially not when the internet is not in a great shape. Maybe I should not expand them. I'll try to expand them so you can see more, but it seems like the internet is not going to like it. OK. Sorry? I tried to refresh it.
PRESENTER: [INAUDIBLE].
PRESENTER: This one? OK. Yeah. OK. This is pure internet-connection stuff. So this is our ingenious Polaroid camera that can take a Polaroid picture, or a Polaroid video, or a Polaroid [INAUDIBLE]. So this is, let's say, you are a Kickstarter campaign. By the way, if you articulated your model, you can show how it works here, as well.
And then, same thing-- you can add tags, in which you describe something. These can link to something else. We have also found the old manuals. So, for example, if you click this target, you can go to the vintage user manual, et cetera. And finally, you can preorder again. So these are just a couple of examples-- change to the golden edition because that's the very special one, and some different color choices, et cetera.
Moving to other examples, let's move to this one. This is a building that was captured with photogrammetry and laser scanning, using Autodesk ReMake and Autodesk ReCap. And you can simply use this tool to experience the space. As you can imagine, if this is in VR, you can just walk around it.
The next example, I mentioned we can connect with APIs on the web. This simple example connects with the Google Weather and Google Maps. So let's say, I type Paris. It will find where Paris is. It will check the weather in Paris. And it will visualize the data of the weather in Paris at that time.
I mentioned that we also made a template for comparing two models in the browser. This came as a requirement from many scientists that we worked with. But as it happens, industrial designers were also excited to have this.
Basically, when you want to compare two models in the browser, you usually have to put two views next to each other. And then, when you zoom in one, the other one doesn't zoom in. When you rotate one, the other one doesn't rotate. We made this template where not only is that possible, but you can also change to different visualization modes.
And let's say, like this. And then superimpose them and compare them over each other at the same time. And I will show you later how to do that yourself.
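For a sense of what the compare template does behind the scenes, here is a minimal sketch in TypeScript with three.js (not Play's implementation): two side-by-side views stay in lockstep simply by sharing one camera, so zooming or rotating one also moves the other.

```typescript
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

// Illustrative synchronized compare view: two canvases render two different
// models through the same camera, so orbit/zoom stays in sync between them.
const sharedCamera = new THREE.PerspectiveCamera(45, 1, 0.1, 100);
sharedCamera.position.set(0, 1, 4);

const leftRenderer = new THREE.WebGLRenderer();
const rightRenderer = new THREE.WebGLRenderer();
document.body.append(leftRenderer.domElement, rightRenderer.domElement);

// One set of controls drives the shared camera for both views.
const controls = new OrbitControls(sharedCamera, leftRenderer.domElement);

const sceneA = new THREE.Scene(); // e.g. the scanned model
const sceneB = new THREE.Scene(); // e.g. the CAD model to compare against
// ...add the two models and some lights to sceneA / sceneB here...

function renderLoop(): void {
  requestAnimationFrame(renderLoop);
  controls.update();
  leftRenderer.render(sceneA, sharedCamera);
  rightRenderer.render(sceneB, sharedCamera);
}
renderLoop();
```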
I also mentioned that this is not about having just one view on your website. This Play viewer, or explorer, allows you to interact with the data from your website. So let me just explain this one.
So let's say this is some design that you've made. Again, there are tags that you've added, et cetera. But here, from the website, I can change the seats in the car. I can add the spoiler-- the front spoiler, the back spoiler-- and change, for example, the color of the car from the website, from an external event that you are doing as a user, which many users find very exciting.
There are many more examples. So I'll just stop on this one. For those of you who are in architecture or are using Revit, often you would like to share the experiences. And your users or owners do not own Revit.
So, the idea is that you can build these interactive experiences where you can make hot-spots that move you from one zone to another in the model. And you will experience this on your VR headsets. You can set up change options, like changing colors; we had, for example, the possibility to change the seating. As I click on those different hot-spots, it just teleports me there. In this case, I can also switch from seating with [INAUDIBLE] to seating with chairs. These are design options that have been done in Revit and then visualized here.
This example is very interesting because here, as you will see later, we are using cloud-rendered panoramas for the space. So you don't load the entire Revit model. But we can combine a 3D model with the panorama in order to create that experience.
To go back to the examples, just to wrap up, so, what is specific to Play is that it is native to the web. There is no installation, no download, no plugin, no compiling. You create in real time. And it's experienced in real time. It allows multimedia aggregation and authoring. It's node-based. It allows for third-party server APIs to be connected too, through reactive data streams and external events from the website.
At this point, I will switch to Nop, who will now go through-- to Michael, actually-- who will show the basic-- or to you? OK. The core principles of Play, the editor. And as you will see later, the editor is just one way that we offer the tool. There are templates that we offer to wrap up the experiences.
And just one last disclaimer-- this is not yet a product. It's not learnable in 10 minutes. Those of you who know node-based programming will pick it up very quickly. Those who don't will need a little bit of time to get into it. But that's why we made the templates, to make it easier for you. OK.
NOP JIARATHANAKUL: OK. Everybody hear me OK? Higher. Is this better? Yeah. OK. [INAUDIBLE] Good. So my name's Nop. I will be showing you the first [INAUDIBLE] sneak peek at [INAUDIBLE]. And I tend to speak a little bit fast, in the interest of time. But Tanja will try and slow me down.
PRESENTER: Yes. He knows so much that sometimes people lose him when he explains. So we will be very interactive here. And you will have time to ask questions, but if you desperately need to ask a question about why we're showing something, feel free. It's just because it's recorded for others that [INAUDIBLE]
NOP JIARATHANAKUL: Yes. OK, so again, keep in mind that Project Play is all web-based, in the browser. So no application needs to be downloaded. All you have to do is go to a link-- play.autodesk.com.
And that is really one of our core strengths. It's born in the cloud. The editing is done in the cloud. And when you're viewing it, it's all on the web. So any of your clients that you're communicating with, all they need is a link. They open it on their phone or their tablet, and everything will work, anywhere, everywhere, whenever you want it.
So here's the landing page. All you need to do is hit Launch. And you will be entered into the editor, as you can see here.
So, just to break a few things down, what are we seeing? First we have the assets. Imagine this is like your folders, your Windows Explorer, where all your files are being stored on our servers. So anything that you want to use in our system, you need to drag and drop and upload into our servers. So this is kind of like all of your stuff is here.
Secondly, let's see. Let's go here first. So the blue circles you're seeing here are the hierarchy of the scene. So this is the structure of the scene that you're seeing. These are labeled in blue. Blue nodes you can think of as anything that is physical in the 3D world. So, we can start with the viewport. And we have the camera, some objects, the background, et cetera.
And then you look over on the right side, we have the brown nodes. And these guys provide behavior to the blue nodes. So if you want it to have any material properties or any animation, you would need to have a brown node that will talk to the blue node and give it more behavior. We'll go into that in a second, how all that works.
As I click a node, you will see that its properties are popped up in this panel here. And these are real-time, live. As you edit the properties, you will actually affect what you're seeing in the scene that you're seeing live, here.
So what you're seeing is not an editor view. It is actually the live view that your audience will see. So, what you see is what you get. We are planning to build an editor view later. But right now, we just have a single view, so there's no confusion. Whatever you're seeing in this panel is what you are delivering.
So, I can come in here and, for example, change the scale to one. And you can see that it's now stretching live, in real time. And I can actually drag up and down like this. And it affects it immediately.
OK. There are a couple more buttons up here. But one of the more important ones is this icon, which will show you all the projects that you have. I have already prepared a project for this demo. So I will go in here. And it will load up a new project with new sets of assets. I already have two scenes and some assets loaded.
So let's actually start from scratch and see how we can make something super simple with a model that I've created. So, keep in mind, again, that we're not ever creating any content, per se, in Play. We're always bringing in content into Play and combining them in a way that tells your story. So you're never actually modeling or drawing anything in Play. So keep that in mind.
PRESENTER: We do have primitive images that you can create, [INAUDIBLE] et cetera. But [INAUDIBLE]
NOP JIARATHANAKUL: OK. So let's start by creating a new scene in my folder, called scenes. Call it scene1. And you are greeted with a simple scene that already has some nodes for you to start with. A simple plane. Again, I can always affect this plane, change its position or the size.
And let's put something into this scene. So the first thing we need to bring in something like an OBJ file is a model node. And to do that, you can go to the library here. This is something I haven't shown yet. So these are all the nodes that are available to you. This collection is constantly growing and evolving.
We try to document it on our website, but there is really a lot. And the best way to learn about them is to actually use them. And if you hover over, there is some text that helps you.
So I will need to find a model node. And that's under the geometry category. And that's right here.
So what I do is click and drag and place it on the blue side, which is the structure of the scene. A blue node can only go on the blue side. It cannot go on the brown side. If I try to place it there, it will go away; it should give me an X icon, too. But again, if it's a blue node, it belongs in this world, in any of these spots. So if I put it below here, it will be a sibling of this rectangle, at the same level.
This side is, if you've used any 3D [INAUDIBLE] before, this is like that tree that you see on the sidebar. Kind of like if you use Maya, this is what that reflects, basically. Just to put things in perspective.
So, now I have a model node. Nothing happened yet because I haven't loaded the model. So to load a simple OBJ, I just drag and drop this file onto this node. And something should show up. That should not show up.
[LAUGHTER]
Right, OK. Well, it worked anyway, so, which is good news. [INAUDIBLE] No, that goes-- yeah. I've never seen that either. OK.
PRESENTER: So it's not part of [INAUDIBLE]
NOP JIARATHANAKUL: So, this is a bit too large. So I can scale it down a little bit here and move this plane down slightly. OK, so now we have a Forge symbol, for the new Autodesk Forge. And the cool thing about this guy is that he is an optical illusion. So if you start spinning him, it will look like he's actually animating.
So, how do we do that? What I want to do is rotate this object around the y-axis. And you can see that it's already spiraling. But I want to do this constantly, right? Right now, I'm affecting it just manually by dragging it. So I need something to change this number over time.
And this is a core principle of Play I am demonstrating here. Every node has a series of inputs and outputs. So this is an input on the node for the rotation. So I need something that has an output that can affect that input.
So I will find a node called Oscillator. What I have just brought up is a shortcut: you can hit the space bar and search for anything you need. I showed you that you can also go back here to actually look for your node.
So what I want is the Oscillator, which I can never find in here. So the easiest way is to actually search for it-- but that is, if you know what you're looking for. So there is an oscillator right there. So I can drag it in from here. Or I can open this up and type in "oscillator" and do it this way, as well.
So the oscillator-- its only job is to output one number that is constantly changing. Now, how that changes over time, I have full control over through all the parameters. But we don't have to go into that. You can play with it yourself.
So I will connect the output of the oscillator into the rotation. Or let's just try the position first. So I put it in the X, so now it's moving from 0 to 1. And I know that because I specified that here.
So, for something to spin all the way around, I need to go from 0 to 360. And now it will zoom all the way out because it's going from 0 to 360. So I would want to break this connection-- reset the position back to 0, so we can see it again. And then instead, hook this up to the rotation around the y-axis.
So now we have a node here, called the oscillator, which is outputting a value of 0 to 360 over time, always. And it's driving this input here, that you can see. So everything that is changing you can actually inspect.
AUDIENCE: How do you slow it down?
NOP JIARATHANAKUL: You can slow it down with some of the parameters here. So it's doing 0 to 360 over two seconds. And to slow it down, I just do it over 10 seconds instead. The parameters listed vary from node to node, so you really have to explore and play with it.
PRESENTER: And if we have time, I will show that we have an animation editor that is much more sophisticated. But these are really the basic principles: use an oscillator node whenever you want something to be animated.
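To make the oscillator-drives-an-input idea concrete outside of Play, here is a minimal TypeScript sketch using three.js (Play's engine code is not public, so the `Oscillator` class and its parameter names are illustrative assumptions): a looping value sweeps from 0 to 360 degrees over a chosen period and is wired to the model's Y rotation every frame.

```typescript
import * as THREE from 'three';

// Illustrative oscillator "node": outputs a number sweeping from `min` to `max`
// over `period` seconds, looping forever (a stand-in for Play's Oscillator node).
class Oscillator {
  constructor(public min = 0, public max = 360, public period = 2) {}
  value(timeSeconds: number): number {
    const t = (timeSeconds % this.period) / this.period; // normalized 0..1
    return this.min + t * (this.max - this.min);
  }
}

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, 16 / 9, 0.1, 100);
camera.position.z = 5;
const renderer = new THREE.WebGLRenderer();
document.body.appendChild(renderer.domElement);

const model = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshStandardMaterial({ color: 0xffcc00 })
);
scene.add(model, new THREE.DirectionalLight(0xffffff, 1));

const spin = new Oscillator(0, 360, 10); // 0 to 360 degrees over 10 seconds
const clock = new THREE.Clock();

renderer.setAnimationLoop(() => {
  // The oscillator's output is "wired" to the model's rotation-Y input.
  model.rotation.y = THREE.MathUtils.degToRad(spin.value(clock.getElapsedTime()));
  renderer.render(scene, camera);
});
```

Slowing the spin down is just a matter of increasing the period, exactly as in the demo.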
NOP JIARATHANAKUL: OK, and now I can change some of the materials here. So I can search for material. And there is a whole lot of materials, different types of shading.
For example, I will try the X-ray. And all I have to do is just link the material to the model. And now this has some kind of X-ray look to it. I can just delete this and it will return to the default.
The recommended material is called the realistic material. And what that is, in technical terms, is PBR-- physically based rendering-- which you see in a lot of applications these days. So this is the recommended and the default material.
So when I hook it up, nothing would change because it's the default material already. What I can do is come in here, and now I can change all the parameters. So you have the color. So let's try to make something like a--
PRESENTER: By the way, [INAUDIBLE], that little square at the end is the color button. It doesn't look like a button. So if you wondered what he clicked-- can you do it again, Nop?
NOP JIARATHANAKUL: Yes. I just click--
PRESENTER: That little thing there, and usually it's gray. So you will not notice. We have not yet started designing this, the UX and stuff. We're still working on the technical aspect of this.
NOP JIARATHANAKUL: Right. So I can change the RGB directly. Or I can use a color picker. That's all. That's the difference.
So I make it yellow. I can increase the metalness, so then it's something that's shiny. Make it something like the gold color of the actual logo. And we can actually change the color of the floor to go along with that-- complementary colors.
So this one actually has a checkerboard on it because I have applied a texture to it. But the texture is not there yet, so there's nothing for you to see.
So I need a texture, which I actually haven't uploaded. So we can see this happening, live. So I'm going to go into my finder. And we'll try to find some textures that I already have. Let's see-- 3D assets, we have a bunch of textures. Here.
OK, so this stone texture looks good. So all I have to do is drag and drop into our folder. And you can see that it's uploading and then converting. And after a while, it should be done. I think it's already done.
So I can just hit refresh. And once the status goes away, it's ready to be used. So you see the texture node right here, and all I had to do was drag and drop. And it should change the texture now, to that stone texture.
I will reset this back to the gray. And now we have a different texture. So that's how easy it is to use your own assets. OK.
PRESENTER: So, Nop, you mentioned when you dragged the model node here that obviously it could be dragged and dropped anywhere. But depending on where you drop the node, it is either a sibling, at the same level as the other models in this scene, or dependent on another node. Can you maybe talk about that, about the dependency here?
NOP JIARATHANAKUL: Oh, here-- yes. So, yeah, this is similar to, again, something like you will see in Maya or Fusion, with all the hierarchy. So, just to demonstrate a point in that, I will grab a transform. A transform is how you locate a 3D object in space. So again, it just gives you parameters, position, rotation, things like that. But what you can do with the transform, is you can group two objects-- or more than one object-- under it.
So now I have put the model and the rectangle together. And now I can move them as one. So that's just a simple concept. If you've used 3D software before, this is very natural to you.
PRESENTER: So, depending on where you place it on the hierarchy tree, you can either impact that one object, or if you wanted a couple of objects to be impacted at the same time, in scale, in position, then you organize the tree like that. It's simple drag-and-drop, and you can always reorganize later.
NOP JIARATHANAKUL: OK, again, Play is interactive 3D software. So let's just see some of that, as well. So, right now I can't interact with the scene except for rotating the camera. And that's being driven by the orbit controller.
We don't have to worry too much about it, except that it takes care of the camera positions as you interact with the mouse. And you can see these numbers are changing. And that's what's driving the camera.
So let's do something with this object as I click it. How do I make that happen? As I click this object, first of all, I have to make it clickable-- or as we call it here, "pickable." Before, when this is not set to true, nothing happens as I click it. I'm clicking it now.
If I turn it to true, you can see something is actually happening. What you're seeing is that there are events that are firing as I click on it. So this is mouse down-- it says "begin," which is equivalent to mouse down. And then I mouse up-- that's the end-- so that you can have different interactions. And "active" also turns to true and false.
So, what can we do with this? So I have another oscillator lying around here. So we can just say, how about if whenever I click on this, it will grow and shrink, just to show a simple concept.
So let me first get that animation correct. So I will hook this value up to scaling. So now it just keeps growing constantly. We don't want that to happen.
So let's go from 1-- oh, I had it at half, maybe. I don't remember. So it's going from-- let's just make it 0.2. OK, and also we want it to run once, only. So now we-- repetition is 1. So now, it's staying there.
Actually, we want it to grow up and then grow back down. So we set it to alternate. And we actually need two repetitions. So now, every time we start, it should go up and then go back down.
So that's a little bit too large. So let's do 5.5. So then it should expand, and then it would shrink back. So that's what we want to happen.
But right now, I'm hitting the start button to trigger the animation. But I want to hit the actual model to trigger that animation. So I will grab the output that activates as I click. So I can just do begin, which is related to mouse down. And I will drag it here into the start.
So now it established another connection, and you can right click to examine that. And it says, pointer begin goes to the start. And now, whenever I click it, it will send that event to the Start button that I just hooked it to. And now it will grow and shrink.
The animation here is a little bit boring. So I can try to speed it up and use a different curve. We have tons of different curves we can use here. So let's try exponential. So now we have some smoother animation. And it's just as easy as that.
So now we have some interactivity. We're looking at a 3D scene. And now we can also interact with that scene.
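Continuing the earlier three.js sketch (same `renderer`, `camera`, `model`, and `clock`), here is a hedged illustration of the "pickable" plus event wiring just demonstrated: a pointer-down hit test on the model plays a one-shot grow-and-shrink, roughly what connecting the "begin" event to the oscillator's start does in Play.

```typescript
// Illustrative "pickable" object: a raycaster detects a click on the model
// ("pointer begin") and starts a one-shot grow/shrink animation.
const raycaster = new THREE.Raycaster();
const pointer = new THREE.Vector2();
let animStart: number | null = null;
const DURATION = 0.5; // seconds: grow to the midpoint, then shrink back

renderer.domElement.addEventListener('pointerdown', (event) => {
  // Convert the click position to normalized device coordinates.
  pointer.set(
    (event.clientX / renderer.domElement.clientWidth) * 2 - 1,
    -(event.clientY / renderer.domElement.clientHeight) * 2 + 1
  );
  raycaster.setFromCamera(pointer, camera);
  if (raycaster.intersectObject(model).length > 0) {
    animStart = clock.getElapsedTime(); // the "begin" event triggers "start"
  }
});

// Call this once per frame from the render loop shown earlier.
function updateClickAnimation(now: number): void {
  if (animStart === null) return;
  const t = (now - animStart) / DURATION;
  if (t >= 1) {
    model.scale.setScalar(1);
    animStart = null;
    return;
  }
  // Triangle curve: up to 1.5x scale at the midpoint, then back down to 1x.
  model.scale.setScalar(1 + 0.5 * (1 - Math.abs(2 * t - 1)));
}
```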
PRESENTER: So when Nop talked about real time, everything that he was doing was updating automatically. And at any point, if you click [INAUDIBLE] Publish, this is exactly what the customers will see on the web.
NOP JIARATHANAKUL: And we can try that now.
PRESENTER: Yeah.
NOP JIARATHANAKUL: So, to publish, I hit this Share button. The Publish dialog box pops up here. And it knows that it needs the scene, the object, and this image that we have on the bottom. So I will call this AU--
PRESENTER: Sorry, the screen is chopped off a little bit.
NOP JIARATHANAKUL: --2016-demo1. So I hit Publish. And it's doing all the magic that needs to happen. And you see it's captured the screen, and it gives us a link. So I can just click on this. And it opens the scene that we just saw. And we can interact with it like we did.
And just to make a point about this, I'm going right now on my phone to prove that this is immediately live, anywhere to the public, and can be used anywhere.
PRESENTER: Can you go back to Play and show the dialog box?
NOP JIARATHANAKUL: Yup.
PRESENTER: Yeah. So you can see here in the dialog, we have a hyperlink that you can send to anyone-- on a phone, via email-- and they can experience it, or an embed code, HTML for an iframe, so that you can put it on your website or blog or something. It's that simple. You just need that one link.
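As a small illustration of the embed option, dropping a published scene into your own page amounts to inserting an iframe. This is a sketch only; the URL is a placeholder, since the real link comes from the Publish dialog.

```typescript
// Illustrative embed: a published scene is just a URL, so embedding it in a
// page is an iframe insertion. Replace the placeholder with the link copied
// from the Publish dialog.
const frame = document.createElement('iframe');
frame.src = 'https://example.com/placeholder-for-published-scene-link';
frame.width = '800';
frame.height = '450';
frame.allowFullscreen = true;
document.body.appendChild(frame);
```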
NOP JIARATHANAKUL: So, just, if everyone can see, I just loaded it up on my phone. Or you guys can actually try it. I see some people are punching that in. So, the name that I just put there-- AU2016-demo1-- is the name of your link. So you can customize however you want.
And now I have it open on my phone. And it works just fine on Touch. And I can click it to make that animation happen that we just did.
[SPARSE APPLAUSE]
[LAUGHS] Thank you. All right, so can we try some more complicated models?
PRESENTER: Yes, maybe you can talk a little bit about how to put a sphere or a surrounding?
NOP JIARATHANAKUL: Oh, yeah. So, for the background?
PRESENTER: Yeah.
NOP JIARATHANAKUL: OK.
PRESENTER: We will then move to different file formats, imports--
NOP JIARATHANAKUL: Yeah, OK.
PRESENTER: --explain what works today and what doesn't, Revit, VR.
NOP JIARATHANAKUL: And is the pace that we're going at OK? We're trying to explain things very slowly, but there is so much that we can do. And let us know if you'd rather we move slower or much faster. Because I can move really fast.
PRESENTER: I will not let you.
[LAUGHTER]
NOP JIARATHANAKUL: Yeah. But there is so much to show here.
PRESENTER: This is not a training session. It's not easy to train you on this in an hour and a half. We want to excite you about what's coming. We have beautiful materials online, lots of videos, tutorials. Get started. We'll help you get going.
NOP JIARATHANAKUL: OK, so let's just quickly put a background to this scene because black is very boring.
PRESENTER: Can you stop it from moving? It's nauseating.
[LAUGHTER]
NOP JIARATHANAKUL: Oh, stop this? [LAUGHS] OK.
All right, so what I did just now: I want a box, so I can go to geometry, as well. These are all the things that you can actually add to your scene now. So if I just scooch this to the side, I can add any-- so again, we said you should create most of your stuff outside of Play and then bring it in. Well, we do have some geometry here to help you, just basic geometric shapes.
So what I need, I want to put an environment background to this whole scene. So I will put in a box first. So I have a box.
And I will actually now put some settings on this box to flip it inside out. You know, in graphics, back-face culling-- when you're inside something, you can't see it. So, a bit technical, but all I'm doing here is flipping that box inside out so that we can be inside it.
So now, I will go to this box and make it really large-- size 100. So now we're in the box. There is a material here called the background material. And all I have to do is connect that. It's actually pulling from our environment lighting as the background. So I can go to the viewport settings here. And we have a few pre-canned environments for the lighting. So if we choose Riverbank, you can see that the background changed, and also the lighting on this changed, so it will correlate correctly with what you're seeing. We plan for this to be something that you can import, too. A lot of you here are probably interested in photography, capturing HDR environments. And those can be imported here, as well-- in the future.
PRESENTER: But you can map photos. You can map videos. The object could have been a cylinder or a sphere. And you can map any type of photo or video that you want onto it as a background, resembling 3D.
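In generic WebGL terms (continuing the earlier three.js sketch, not Play's internals), the inside-out background box described above looks roughly like this: render the box's back faces so a camera sitting inside it sees the mapped photo or environment. The texture file name is a hypothetical placeholder.

```typescript
// Illustrative inside-out background box: rendering the back faces means the
// camera inside the box sees the texture as a surrounding environment.
const envTexture = new THREE.TextureLoader().load('riverbank.jpg'); // placeholder image

const backgroundBox = new THREE.Mesh(
  new THREE.BoxGeometry(100, 100, 100), // "really large, size 100"
  new THREE.MeshBasicMaterial({
    map: envTexture,
    side: THREE.BackSide, // flip the box inside out (skip back-face culling)
  })
);
scene.add(backgroundBox);

// A large sphere with a panoramic photo or video texture works the same way.
```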
AUDIENCE: Question [INAUDIBLE] I don't know about the rest of you, but I'm totally convinced. I mean, I love it. How can we move along with you on this and know when it's going to be ready?
PRESENTER: Oh, I will cover that at the end. It's available today on Autodesk Labs. You can play with it today. And it's totally, obviously, free. And after you've gone through all the materials, you're very welcome to-- we have a forum, as well, and we'd love to hear from brave people who start using it.
NOP JIARATHANAKUL: OK, so let's actually try-- oh, sorry. Yeah?
AUDIENCE: How flexible are your shaders for adapting them to scale?
NOP JIARATHANAKUL: In terms of performance of large--?
AUDIENCE: [INAUDIBLE] in terms of size. Like, we have [INAUDIBLE] but how flexible are the shaders when you manipulate them in way where they're assigned into a larger object? Could you still tie them to a larger [INAUDIBLE]
NOP JIARATHANAKUL: So the scale of the object should not matter to the shader. The complexity and the amount of geometry might matter. Generally, the engine can handle up to a quarter to half a million polygons, easily, with this PBR shader.
AUDIENCE: So if you have a brush shader-- sorry, can a brush shader be like, a scaled brush [INAUDIBLE] texture [INAUDIBLE] for a larger-sized object?
NOP JIARATHANAKUL: You're talking about the size or the texture?
AUDIENCE: Yeah, I have an aluminum brush texture.
NOP JIARATHANAKUL: Right.
AUDIENCE: Would you be able to shrink it down for [INAUDIBLE]
NOP JIARATHANAKUL: Yes. So I can just quickly demonstrate that.
PRESENTER: And by the way, good question about size in general. When you import models, 4K is the maximum texture size. That's WebGL, really. And for geometry, 150K polygons is recommended. It can read more-- again, this is not our limitation; it's the web-- but it might slow down the scene if you have really heavy geometry.
NOP JIARATHANAKUL: Yeah, and on that point, really, the goal here is not to deliver the highest visual quality possible-- although it looks pretty good already, and I can show you a much more complex model. The point here is showing something that looks really good-- good enough that you can show your clients on a tiny phone as well as on a large projector screen like this. So, and you're going towards--
PRESENTER: And he's being very modest. The scenes are actually beautiful. You will see.
NOP JIARATHANAKUL: Just the size of the texture, quickly-- I just attached a 2D transform to the texture-map coordinates. And I can scale this down or up.
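For reference, the same texture-coordinate scaling can be sketched in three.js terms (continuing the earlier sketch): tiling a brushed-metal map more or less densely so it reads correctly at a different object size. The file name is a hypothetical placeholder.

```typescript
// Illustrative UV scaling, analogous to attaching a 2D transform to the
// texture-map coordinates: repeat the texture so it tiles at the right density.
const brushed = new THREE.TextureLoader().load('brushed-aluminum.jpg'); // placeholder
brushed.wrapS = THREE.RepeatWrapping;
brushed.wrapT = THREE.RepeatWrapping;
brushed.repeat.set(8, 8); // scale the texture coordinates up or down as needed

const brushedMetal = new THREE.MeshStandardMaterial({
  map: brushed,
  metalness: 1.0,
  roughness: 0.35,
});
```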
PRESENTER: Was that the question?
AUDIENCE: [INAUDIBLE]
PRESENTER: Awesome. By the way, there are chairs free here, if you don't want to stand. OK. Yes?
AUDIENCE: Quick question. Let's say you have a building, and you have three interior rooms. Do they all load at the same time? Or can you, you know, load one for one room, but [INAUDIBLE] and when we go to a different room--?
PRESENTER: We will come to spaces a little bit later. So, we'll come back, yeah.
NOP JIARATHANAKUL: The answer is, you can do both, depending on your needs. The system gives you that flexibility. OK.
PRESENTER: Oh, there is another question.
NOP JIARATHANAKUL: This is great.
AUDIENCE: [INAUDIBLE] Right now, we need [INAUDIBLE] Autodesk, [INAUDIBLE]
PRESENTER: Yes. It's always your one and only Autodesk account. So, once you sign up, we approve it. And then you have access.
AUDIENCE: And where are the files stored [INAUDIBLE]
PRESENTER: It's on the cloud. Currently, it's all on Autodesk servers. So, at the moment it's proprietary, just because we're still working on it and--
AUDIENCE: [INAUDIBLE] something when we go out there we can access?
NOP JIARATHANAKUL: In public.
PRESENTER: The question was, is it a shared location? So, it's not public. Nobody else will see what you're doing. So if that is a concern, that's for sure not the case.
Was that the question, or was it something else?
AUDIENCE: Yeah. So you create [INAUDIBLE] confidential, and you want to share with [INAUDIBLE]
PRESENTER: You can share it confidentially, and nobody else but the people that you shared with can see it. Yeah. And I will show you later with the templates, when we have product-related templates like for ReMake, again, you publish, you can say it is private sharing or public sharing. So it's possible. OK? Yes?
PRESENTER: We have one more.
AUDIENCE: Is there an API that [INAUDIBLE]
PRESENTER: Not yet.
PRESENTER: Not yet, right?
NOP JIARATHANAKUL: An API for?
PRESENTER: [INAUDIBLE]
AUDIENCE: I want a dynamic with [INAUDIBLE]
NOP JIARATHANAKUL: Sorry?
AUDIENCE: I want a dynamic with [INAUDIBLE]
NOP JIARATHANAKUL: So, right now you can do that by exposing an API on your side. So we can react to APIs.
PRESENTER: Yeah. That was the part that--
NOP JIARATHANAKUL: So, if you have, let's say, a bunch of data sitting on your servers, you can expose that as a public or private endpoint that you only know about. And then you could write an API call that talks to that from Play. So, you saw that example with the Google Weather.
PRESENTER: Or you can have tweets coming and visualized, a feed of tweets or whatever you want, coming from the web.
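As a rough sketch of the pattern being described here (expose an endpoint on your side and react to it), the idea in plain TypeScript is a periodic fetch whose response drives a scene parameter, continuing the earlier three.js sketch's `model`. The endpoint URL and JSON shape are assumptions, not a real Play or Google API.

```typescript
// Illustrative reactive data source: poll a (hypothetical) endpoint you expose
// on your own server and map the response onto a scene parameter.
interface WeatherReading {
  temperatureC: number;
}

function pollAndApply(url: string, intervalMs = 60_000): void {
  setInterval(async () => {
    try {
      const response = await fetch(url);
      const data = (await response.json()) as WeatherReading;
      // Map temperature to a color: cold leans blue, hot leans red.
      const t = Math.min(Math.max((data.temperatureC + 10) / 50, 0), 1);
      (model.material as THREE.MeshStandardMaterial).color.setRGB(t, 0.2, 1 - t);
    } catch {
      // Ignore transient network errors and keep the last applied value.
    }
  }, intervalMs);
}

pollAndApply('https://example.com/api/weather'); // placeholder endpoint
```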
AUDIENCE: Yeah, I'm thinking more dynamic [INAUDIBLE] But I think you could do it, just by what you were talking about with the dial on the radio, right. It's the same kind of [INAUDIBLE]
PRESENTERS: Yeah.
NOP JIARATHANAKUL: Do I have another question over there?
PRESENTER: Question?
AUDIENCE: [INAUDIBLE] Can you have a more [INAUDIBLE]
PRESENTER: You need to speak a little bit louder also, because it's recorded. Did you hear the question?
NOP JIARATHANAKUL: So, is the question can you add custom shaders?
AUDIENCE: Yeah. Can you put in a shader library [INAUDIBLE]
PRESENTER: Oh, a shading library.
NOP JIARATHANAKUL: OK, right now we can't do that because shaders are pretty much one-- it's specific to the engine. Eventually, we want to expose an editor. So if you do know how to write shader language, you can add your own shader.
And same with the node functionalities. So we want to actually expose custom functionality for your nodes, so you can just write JavaScript. We cannot possibly provide all the building blocks that you need.
PRESENTER: So you can create your own nodes in the future, et cetera. Definitely part of the success of this, as we see it, would be opening it so that some users write new nodes, others need new nodes, and buy from them, et cetera. Yes?
AUDIENCE: Would you be able to create new nodes with Dynamo and then [INAUDIBLE]
PRESENTER: The systems are not connected at all yet. As it happened, we were developing them in parallel. And Dynamo is fantastic. I just recently completely fell in love with it. It's more oriented today towards automation and tapping into Revit. But, of course, we are looking at whether there are opportunities for the node-based systems either to be combined or to leverage one another. And we will be talking with the teams.
AUDIENCE: [INAUDIBLE] with this interface [INAUDIBLE] It seems like there was some conversation on [INAUDIBLE]
PRESENTER: Oh, yeah. Yeah. And we're more focused on what we call Show, Tell, Sell. So it's more branding, marketing, instructions, assemblies, visual content. Dynamo at the moment is really doing some hardcore math and stuff, which is very powerful on its own. OK. Let's move.
NOP JIARATHANAKUL: OK, yeah. And if we want to move, we can take input on what we want to see. But we have a little agenda that we're just kind of working through.
So, now I will show how to import a model with some textures and more complex models. The one that we just imported was an OBJ model without any textures. So we have another node, called a bundle node, which will construct more nodes for you automatically from the model that you imported.
PRESENTER: Basically, if you import the [INAUDIBLE] it will reconstruct the entire [INAUDIBLE] here, in one bundle node.
NOP JIARATHANAKUL: So, let's just do a simple example first. So, if anyone has seen the COLLADA duck, so it's just a test asset. So, here we have the duck here, in a DAE file. It looks like this on my computer. This guy, so you've probably seen this guy. He has a texture.
So all I have to do is drag the DAE file into the bundle node. And then it's just one step. And he will show up immediately.
What's actually happening on the inside brings up another concept, which we would like to show and make a point about: what we have seen so far are individual nodes. But we actually have another citizen called a graph, which is a container for nodes.
So this is actually, secretly, a graph. But it looks like a node on the outside. So what I have to do first is make it editable, because right now it's smart. So we have to make it not dynamic-- more details later, if you're interested.
But, OK, so now I can go into this. I just made it a graph. So I can double-click and go into the graph. And you can see that there is another level to this-- kind of like a little folder with more stuff inside.
So now you will see the nodes that you are familiar with. But they are now being constructed for you automatically, with the texture, all the texture maps, and everything. It will automatically find the assets-- the model and the texture-- from your folder structure here.
And if I just want to quickly add-- I've prepared a little sound bite for the duck. I should have the actual-- yeah, I have audio. It's just a duck sound.
PRESENTER: So just to explain, the OBJ file is separate from the texture file. So, usually when you read an OBJ, you have to re-map the texture. COLLADA files work really well because they already contain the texture, and it's automatically readable.
NOP JIARATHANAKUL: Yes. Yeah, this is not like-- you might be familiar with FBX or Revit formats, where the textures are actually embedded in the files. But in most 3D formats, textures are separate.
[QUACK]
Oh, do I have sound? Oh, here.
[QUACK]
So this is a little quack that I can play once I'm clicking on this duck. So, again, I'm working at the level of this graph so that anything inside here is related to my duck system.
So, again, I'm going to go here. I'm going to work a little bit faster. Make it clickable. Fire some events. I'm going to look for audio. Grab the audio file, very simple. Again, we've seen this. Mouse down to Audio, Play.
And when I click here, the Play is firing. But we don't have an audio file yet. So I'm going to grab the quack file and drag it into here. And now--
[QUACKING]
Simple as that. And then, I can add a bit of an animation, as well, to it. But maybe we don't have to do quite yet.
PRESENTER: No, I'm just looking at time because we have so much more to show. Can you maybe start showing the import the [INAUDIBLE] or linking to Twitter, or to something that is external source?
NOP JIARATHANAKUL: Yup. So I have a more complex scene here, with some lighting, some cameras. So you can see here, there's three lights, there's a camera on the scene, there's a camera that's attached to the bunny, as well, called the Bunny Cam. And this is the hierarchy.
So I have exported that using COLLADA. And I would just jump into a fresh new scene, here. Oh, don't forget to save, always. The software is a little buggy, so whenever you're working on this and run into a problem, always just hit Save and Refresh. It will fix itself.
OK, so again, I'm going to go to a new scene and delete everything that I have here. Grab this bundle node.
PRESENTER: So, the bundle node is for when you don't import just one object, but multiple objects or a whole scene at the same time.
NOP JIARATHANAKUL: Yup. So I have my share scene in a folder, nicely. I will look for the COLLADA file, and it has a ton of textures. I will grab this file, and then you will see that it magically comes in, everything as you see.
PRESENTER: So basically, it reconstructs the entire [INAUDIBLE] all the geometry, the lights, the textures, et cetera.
NOP JIARATHANAKUL: And I can turn the environment lights off because we give you, by default, that HDR lighting. And if I turn that off, you can see that it's reflecting exactly that scene that I have here. And I can dial into here. And you can see that this structure perfectly matches what I had in the [INAUDIBLE]
So, once we build out this path correctly for Revit and Fusion, you also get the same thing. So imagine, you have a car that's articulated. And you have built out the parts correctly. Then you will get the same grouping and articulation, so that you can bring it into Play animated and then tell the story about your model.
So, just to show a little bit of how powerful this is, I have the chair group-- a little bit dark here.
PRESENTER: Maybe can you put some lights on?
NOP JIARATHANAKUL: Yes. I can dial down the environment lighting, so that we can see more of the lights as well.
So I can grab the chair, for example, and then move or animate the chair as a group because I've already constructed it that way. Or-- there's actually a camera and some lights-- I can switch to the camera that came in from my scene. And it's listed here. So now we've switched to the light cam.
So now, I actually can't control my scene anymore because I don't have the controller on the light camera, for example. But then, I can go to that group in my scene. And I can actually animate it this way. So then, now the camera is moving together with the lights I have constructed. Just a few use cases.
And so that's that. I just want to show one more model. And then we should move on from this topic.
PRESENTER: And maybe Twitter? And you don't have to make it in front. You can open the scene.
NOP JIARATHANAKUL: Yeah. So I just want to show off some visual quality. So, again, we are a sister product to ReMake. That's the first target that we were hitting. So this is actually a photogrammetry model, captured in ReMake from photos and imported into Play. And these are all the materials that have now been correctly applied.
This is, again, the scene that you are familiar with. Let me actually pull this up here.
PRESENTER: Yeah, by the way, the UI-- you can move it around however you want. You can reorganize the dialogs if you want a bigger screen.
NOP JIARATHANAKUL: So then, we see it a little bit better. OK, so we have a few more maps going on. But this is with all the layers combined. So I can turn them on and off, as well. So I can go to-- oh, but now I don't have my inspector. OK.
Turn on and off the color. So that's the diffuse layer. So you can see that we also have ambient occlusion. We support that-- metalness, glossiness maps. So that if I turn that off and turn this down, you can see that that's the base model.
But with all the correct layers applied, you can get a really complex look. And you can simulate-- we like to say-- 99% of what you want it to look like, if you apply the textures correctly. And if any of you know the language or have used Unity or Unreal before, we use the same roughness, metalness, and occlusion map system. So any assets you create for those engines, we can accept immediately.
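In engine-agnostic terms, the layering just described corresponds to the standard PBR map slots. Here is a sketch in three.js (continuing the earlier sketches) with hypothetical file names; assets authored with the same roughness/metalness/occlusion conventions for Unity or Unreal map onto these slots directly.

```typescript
// Illustrative PBR material with the map slots mentioned above:
// base color (diffuse), roughness, metalness, and ambient occlusion.
const loader = new THREE.TextureLoader();
const pbrMaterial = new THREE.MeshStandardMaterial({
  map: loader.load('statue_basecolor.jpg'),          // diffuse / base color layer
  roughnessMap: loader.load('statue_roughness.jpg'),
  metalnessMap: loader.load('statue_metalness.jpg'),
  aoMap: loader.load('statue_ao.jpg'),                // ambient occlusion (may need a second UV set)
  metalness: 1.0,                                     // scaled by the metalness map
  roughness: 1.0,                                     // scaled by the roughness map
});
```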
OK. So I think we should probably move on.
PRESENTER: Yeah, I think you're right.
PRESENTER: Maybe you can switch to six?
NOP JIARATHANAKUL: Yeah.
PRESENTER: We have so much to show you, it's really just a matter of choosing.
But, OK. So, forms of access. What you just saw is the most difficult layer. This is the editor. Those of you who are comfortable with a node-based editor might already try it. We also tried to provide pre-templated combinations of tools that you don't have to create with a node-based graph-- you can just use them. But you can extend them, too, with the editor to do more.
So, usually for any workflow, the data prep of the assets is always the painful thing. And Nop showed you, for example, any type of mesh that comes from Maya, Max, Blender, any type of tool. At the moment we recommend a COLLADA file, DAE, because it also contains the texture.
If you deal with OBJs, as you saw, you have to remap the textures. It's possible. But it's one step more.
And those are the limitations on the texture size and the geometry. And then, as you saw in Play, you can also load a zip file. We read zip files. So if your scene is saved as a zip, you can do that as well.
In general with CAD files-- the stuff that you use: Fusion, Revit, et cetera-- the files can be enormous in terms of the number of components. Do you need all of these components in Play when you tell a story? Maybe not.
So for example, this is a car. It has hundreds and hundreds of components in Fusion. So here, in this short video we're just showing basically, you can start regrouping them in a lesser number of parts, depending on what story do you want to tell with the object in Play. If you want to just animate the wheels, or if you want to change colors to just the chassis, maybe you can end up with just three parts.
These parts, at the moment, should be done in the CAD app. In the future, we intend to cover this part as well, in the browser. So we can load any model. And then you can regroup and make it a simpler model, so you don't get lost in a thousand nodes, but only the number of nodes that you would need.
And once, for example, here in Fusion, you reduce this to lesser and lesser number of parts, you then start exporting them. In the Fusion case, unfortunately, SSDLs that does not have any texture or color information. That's a limitation of the Fusion Expert today. So you can remap colors once those parts arrive in Play.
AUDIENCE: [INAUDIBLE] that one you just explained is the workaround for now? [INAUDIBLE]
PRESENTER: Yeah. This is all a workaround for now. We will be making this workflow seamless, so you don't have to do that.
The templates are our way of bringing the Play technology to you today, with one click. So we have product-specific templates, which we started first with Autodesk ReMake, which was used in combination with this technology for the Smithsonian. But we will continue doing this for other products as well. And we have generic templates directly from Play.
So in ReMake, you have the model. And then you click-- I actually have ReMake open, but I don't think we'll have time for all of that. Basically, you open ReMake. This is a chair that has been reconstructed out of photos. It's a 3D model. You click on Publish.
When you click on Publish, it brings you to an intermediate Play environment online. You can decide the background, the colors. You can decide between three different templates and just publish. It will appear on the Autodesk gallery.
So this is the process. You're in ReMake. All you need to do is publish. When you click Publish, what do we do? This is a million-polygon mesh, created through photogrammetry. With one click of Publish, we decimate it, we bake the textures, and we publish it immediately in Play.
This is now the Play environment online. And we have three templates to pick from. At the moment, the first template is mimicking exactly what's happening in ReMake, the same shaders, everything. And you just click Publish. And it will end up in the Autodesk gallery, from where you can share it with anybody in the world.
So I'm going to speed up all these videos. So you click here on Publish. It goes to the Autodesk gallery. And here, using this button, you can then send either a hyperlink or an HTML embed code to somebody else.
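To make the HTML-embed option concrete, here is a hedged sketch of what embedding a published scene in your own page amounts to; the viewer URL and scene ID are placeholders, not the real gallery addresses. The actual snippet comes from the gallery's share button.

```typescript
// Hypothetical sketch: drop a published scene into your own page as an iframe.
// The URL pattern and scene ID are placeholders; use the embed code that the
// gallery's share button gives you.
function embedScene(containerId: string, sceneId: string): void {
  const container = document.getElementById(containerId);
  if (!container) throw new Error(`No element with id "${containerId}"`);

  const frame = document.createElement('iframe');
  frame.src = `https://example.com/viewer/${sceneId}`; // placeholder viewer URL
  frame.width = '800';
  frame.height = '450';
  frame.allowFullscreen = true;
  container.appendChild(frame);
}

embedScene('scene-container', '123456'); // placeholder six-digit scene ID
```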
The second template is one where you can also measure on the [INAUDIBLE] in-browser. So this is, for example, a drone capture of a landscape-- again, a 3D model made with ReMake. You click on Publish. Again, it arrives in that intermediate Play environment. And I will pick the second template. How do I do that? You simply drag and drop the second template, the one that has measure and so on.
Here I change the color. I want white, or I want gray as the background. And I will speed up a little bit further to show you, for example, that I want to measure something. So I'm preparing the template, first seeing what the user will see.
So I click on the tape measure. I click here. It measures. But those pins are huge. So I can, on the left side, just change the pins before I publish the template, so that when you share the models, users can measure or slice the model in the browser as well.
So in this case, I'm reducing, on the left side, the size of the pins. And then, again, I publish. It will be published, again, to the Autodesk gallery. And there, in the viewer, users can do what I just did here, basically measure in-browser or slice, et cetera.
The third template is the interesting one that I showed, that you can compare, in-browser, two models. Again, in ReMake, click on Publish. It publishes these skulls.
Now, what I will do here is, when the skull is published, here you will see the ID of the scene. It's a six-digit number. In the future, it will be a button. You just need to remember that number.
So you select it. Copy the number. Publish the scene.
Then open the new skull in ReMake. Publish this one. And then you use the third template, which is compare two scenes. Now this skull is in both of the scenes. And now here, on the left side, under ID, I will copy-paste the ID, the number of the previous skull. And it will show up "automagically" here.
If your models were in the correct scale, they will show up in the correct scale. If they were not, you can still fix the scale on the left side in the Play browser. And now that the models are in the correct scale, you can change the visualization to, let's say, mesh mode. You can define the color of the mesh.
And finally-- I'm speeding up because we have a lot to show-- by clicking this button, they can be compared at the same time. And again, it's the same thing: you click on Publish to Gallery. This is the first tool in the world that allows you to compare two models simultaneously, in the same scale and orientation, in the same frame, overlapped on top of each other. By the way, you can change the visualization to be stronger. This is just the choice that I made at that time.
Now we come to VR. Yes?
NOP JIARATHANAKUL: One thing I want to say about this workflow. If any of you out there are developers or have your own platform: the workflow we just showed, the integration with ReMake, is just our first.
PRESENTER: Yeah. This is how it's starting.
NOP JIARATHANAKUL: Eventually, next we're probably going to hit Fusion and Revit. But if you have your own software, this is all an API. So we have a whole system that lets you give us your assets, go through our pipeline, and then export back to your own galleries. So if you are interested in that kind of work with us, please talk to us. We're more than happy to collaborate on that.
PRESENTER: Yeah. So, speeding up here. VR experiences-- we continue to work with the Smithsonian, and six, seven months ago, they needed to prepare the Apollo for exhibition. For the first time, Apollo 11 was going to come out of that Plexiglas cover. And we said, yes! This is the time to digitize the Apollo.
So we digitized the heck out of it using scanners. Nobody was allowed to get inside, so we were putting the scanners, and the cameras, and the rigs on stilts, on sticks, et cetera. And we managed to make a beautiful 3D digital replica out of those multiple inputs using ReMake. We used laser scanning, structured light, and photogrammetry, and we created this beautiful 3D digital model.
Obviously, the idea was not only to experience this online, but to be able to see it from inside. And by using VR goggles, you can. This is actually the original astronaut; we also showed how people today can experience what he experienced. And you can find the scene if you go to the Smithsonian Explorer, which is at 3d.si.edu, and click on the Apollo.
You will see, under the screen, there is a pull-down menu-- VR experience. It's a hyperlink. Just send it to your phone, and you can see it on the phone. We gave some out this morning, but there are more Cardboards that you can view it through.
Now, that was simple. That was just publishing the interior. It was literally one click, as you could see. But what if you wanted to do more with the interior? Not just be in the interior, but add tags, hot-spots, hot-zones, teleporters, et cetera.
So we made an example that is on the website that I showed before. It was a Revit building. Now, can you import the entire Revit building inside? Of course. But do you want to do that? It will have millions of parts. You don't want every door and every-- you know, depending on the experience you want.
So we feel that there is actually a much better way to do this, which you can also test at our booth-- to use connected stereo panoramas, done with our cloud renderer. So, you're in Revit. You have your 3D views. You place cameras wherever you want. Let's say you want an interior experience. You place one camera here, one camera there, another camera here.
You go to cloud rendering. You can either cloud render a still image first and then, once you're in the cloud rendering UI, export it as a stereo panorama, or you can cloud render to stereo panoramas directly from Revit.
Always do it first in standard quality because it's free. Once you're happy, then you go to the best quality. But standard is free. You should not immediately go to best and use your cloud credits.
So now, you will do the stereo panoramas at 1,536 pixels. And you download them. When you download them, there is a preparation step before you put them in the VR template that we made in Play.
Basically, you use ImageMagick. It's open-source software. It's actually a command-line tool. You just run two commands.
What it does is take these panorama pictures and split them into six squares, which is the number of sides of a cube. Basically, it creates two cubes, one for the left eye and one for the right eye, so that then, using the Play template, you can have that amazing experience that I showed, immediately in Play.
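The exact commands are in the Play Docs tutorial; purely as a rough sketch of the idea, and assuming each downloaded panorama is already laid out as a horizontal strip of six square cube faces with separate left-eye and right-eye files (both of those layout details are assumptions), a small Node.js script can drive ImageMagick's tile crop to produce the individual faces:

```typescript
// Rough sketch only -- the real two commands are in the Play Docs tutorial.
// Assumes ImageMagick is installed and that each panorama is a horizontal
// strip of six square cube faces, one file per eye (file names assumed below).
import { execFileSync } from 'child_process';

const panoramas = ['camera1_left.png', 'camera1_right.png'];

for (const pano of panoramas) {
  const base = pano.replace(/\.png$/, '');
  // -crop 6x1@ cuts the strip into a 6-by-1 grid of equal tiles;
  // +repage resets the offsets so each face is written as a standalone square.
  execFileSync('convert', [pano, '-crop', '6x1@', '+repage', `${base}_face_%d.png`]);
}
```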
[? NOP JIARATHANAKUL: ?] Tutorial for all.
PRESENTER: And we have a tutorial. We'll show it. Well, let me maybe quickly show that. So when you go here, in Play Docs, you arrive first at the sample files that I showed you. But if you go to templates-- oh. Why is it not showing?
PRESENTER: Go back to your PowerPoint. [INAUDIBLE]
PRESENTER: Oh. OK, sorry. Yeah, so this is our Play Docs site, where we have a bunch of tutorials. First, you will see all the samples that we showed. Every sample has an ID number. And Nop will show you how to use that ID number to open any of these scenes in the editor and see how they were done.
Some are overwhelming. It will look like a forest of nodes. But you will start getting the idea. Start with the simple ones.
But under Templates, if you go to, for example, the VR Panorama Template, it explains the process. You load this template, and then you just put in your panorama pictures. And all of a sudden, they will be your environment.
And then you can combine exterior and interior 3D models with that panorama. You can load external chairs. You can combine the 3D and the panorama to look like a full 3D experience, which is what we showed. Is this also a 3D scene? Yeah.
And you see this little icon. That is for the VR. The previous scene that I showed you had a couple of markers and icons. So if you click here, it teleports you somewhere else.
What's important is that in the previous scene-- where is it? Let me show you. You did not need to click on anything on the phone. You stare at the arrow, and it knows that you're clicking it. So this is important. You don't need to find a way to click on your phone while you're looking through your Google Cardboard or the DODOcase. We actually designed the experience in a way that you just stare at the-- so let me open it again.
On the web, obviously, I will be clicking. But in VR, if you stare at this, it will do what the click on the web would do. If you stare at the night scene or the lights scene, it will change the lighting. Here, it takes you somewhere else.
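Play's VR templates handle this gaze-to-click behavior for you; purely to illustrate the technique, here is a hedged sketch in three.js of the common pattern: cast a ray from the center of the view each frame, and if it keeps hitting the same object for a dwell time, fire the same handler a click would. The target meshes, handler, and dwell time are assumptions, not anything from Play.

```typescript
// Illustrative gaze ("stare to click") selection sketch using three.js.
// Play implements this internally in its VR templates; this is not Play's API.
import * as THREE from 'three';

const DWELL_MS = 1500;                        // assumed dwell time before "clicking"
const raycaster = new THREE.Raycaster();
const screenCenter = new THREE.Vector2(0, 0); // middle of the view in NDC

let gazedObject: THREE.Object3D | null = null;
let gazeStart = 0;

// Call once per rendered frame with the objects you want to be gaze-clickable.
function updateGaze(
  camera: THREE.Camera,
  targets: THREE.Object3D[],
  onSelect: (obj: THREE.Object3D) => void
): void {
  raycaster.setFromCamera(screenCenter, camera);
  const hits = raycaster.intersectObjects(targets, true);
  const hit = hits.length > 0 ? hits[0].object : null;

  if (hit !== gazedObject) {
    gazedObject = hit;                        // gaze moved: restart the dwell timer
    gazeStart = performance.now();
  } else if (hit && performance.now() - gazeStart > DWELL_MS) {
    onSelect(hit);                            // stared long enough: treat as a click
    gazeStart = Infinity;                     // don't re-fire until gaze moves away
  }
}

// Usage: updateGaze(camera, [arrowMesh, nightButton], obj => teleportTo(obj));
// arrowMesh, nightButton, and teleportTo are hypothetical scene objects/handlers.
```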
So, to finish up here-- and then with the rest of the time, Nop can show more, or you can ask questions. Publish and Share: you can share a hyperlink or an HTML embed. Everything is privately shared at the moment, directly from Play. And from ReMake, it's just like what you do on the Autodesk gallery. Every model can be unlisted for private sharing or not.
Galleries-- so, we worked on a couple of projects. This was a super exciting one with the Factum Foundation in the Royal Academy in London. We made this project-- uh, god, sorry. It's not here [INAUDIBLE] The hyperlink is not working directly. Yeah, it's so stupid. It opens-- I have to do this. It's really strange. So I'll just copy-paste this to show you.
This is just one gallery that we did. Every one of these models is actually a 3D model. So you click on it. And this is [? how, in our museums, ?] this was a project to show what 3D life portraiture looks like today. And, you know, these are all 3D models. It can be your buildings, it can be your objects, whatever you want. You can build these galleries.
And we created a batch uploader in case you have a lot of objects, so that you don't have to publish them one by one. It's not out yet, but we have that.
So, where is what: we have play.autodesk.com. That's the website. You sign up for it on Autodesk Labs, actually. You go to Autodesk Labs and look for Play.
NOP JIARATHANAKUL: Can I add one more thing quickly?
PRESENTER: Yes.
NOP JIARATHANAKUL: Because they're taking pictures.
PRESENTER: Yes.
NOP JIARATHANAKUL: There's actually autodesk.com/vr.
PRESENTER: OK, if you go to--
NOP JIARATHANAKUL: It's an easy link, which [INAUDIBLE]
PRESENTER: OK, autodesk.com/vr-- you'll also end up there quickly. But in general, this is the place where we have all the tutorials, beautiful material, and all the templates, so you can already use them without coding in the node-based environment and just swap in your assets. And we have templates in which we have already prepared tags and so on, you'll see.
This is the editor. Once you sign up on Labs and we give you access, you sign in with your Autodesk address and just go there. And these are our email addresses in case you want to contact us, et cetera.
But if we have a little bit more time, we have so much more to show live in the software. But do you want to see something more specific? Otherwise, Nop can just continue. There are so many different types of models to show.
PRESENTER: Do you think we should [INAUDIBLE]
PRESENTER: Yeah, we should do questions.
PRESENTER: Yeah, do you have questions? Let's continue with questions. If not, maybe you can show the Twitter external link. Yes?
AUDIENCE: Yeah, how do you think this would work with [INAUDIBLE]
PRESENTER: Yes, that is what we explained. You can export from Revit to FBX. If you really want to deal with the entire 3D model, at the beginning it might be a little bit challenging, because there will be lots of nodes and parts to deal with.
Today, we recommend you make cloud renderings of the interior and the exterior and load them as stereo panoramas. Did you have the feeling that you were not in 3D when I showed this scene? It feels like 3D. And then you can import 3D models of the furniture if you want to do something special with them, et cetera. So you can combine 3D data and panoramas.
In the future-- the Play team and the LMV team are getting together now. The LMV team does exactly the opposite: it can read original files from our software without converting to mesh. So the future is for us to read original Revit files without any conversion, so then you can just regroup and do whatever you want. You won't have any material or texture issues, et cetera. That's absolutely where we are going.
But we believe that today, for big objects like building interiors or exteriors, cloud rendering with stereo panoramas is also a very, very valid solution. So it's up to you. It's possible.
But today, because we don't have direct reading of Revit files, you might find yourself having to remap some of the textures. And that's mainly because FBX has an issue-- not Play, not Revit. It's the FBX converter. Question?
AUDIENCE: I was going to kind of layer on some [INAUDIBLE] 3D interior experience?
PRESENTER: Yeah, so we didn't show 2D data. You can import images, and photos, and drawings, and place them however you want. You can place them on a plane. The plane can be behind; your model is here. So you can mimic whatever environment you want. Do you want to show a plane and how you map a drawing or an image onto it? It's simple. Well, while I'm talking you can actually do stuff.
Do you have any other questions? Do go to the Play Docs site. We really, really try to explain step-by-step. There are videos. There is text. And there is an idea of every project.
So you can open any of these projects that we showed you and play with them a little bit. Try to move stuff. Try to change things. The nodes are there. If you delete the model and copy-paste in your own model, then you can map your radio instead of our radio, and so on. So you can just start to see what it feels like. Yeah?
NOP JIARATHANAKUL: Yeah, so this is a website that we created just for AU, as an easy landing page for all those Play resources. So, all you have to do is go to autodesk.com/vr. And it's actually also on a code on the Cardboard boxes that we'll be handing out.
PRESENTER: Yeah, but this is just for the VR parts.
NOP JIARATHANAKUL: No.
PRESENTER: No?
NOP JIARATHANAKUL: So it actually has links. So you can click this to go to Project Play.
PRESENTER: OK.
NOP JIARATHANAKUL: And hit Sign Up. Now we'll go to Labs. And the tutorials will lead you back to the Docs that we just saw. So it's just one link that has all the stuff that we just listed for you.
PRESENTER: Oh, do you mind opening Play? One thing that we forgot to tell you-- when you enter Play for the first time, we provide five small sample projects. It's a little cow that, in one demo, spins. In another demo, it jumps when you click on a button. In the third one, it sings. In the fourth one-- well, you can see, very simple interactions, scenes already pre-prepared for you. Can you show it, Nop?
NOP JIARATHANAKUL: Mm hm.
PRESENTER: And also maybe show how to import the ID of the other projects, so they can play with them. But first, show the five scenes, where do they show up.
NOP JIARATHANAKUL: Yeah, so this is, when you land in Play, this is the first project that we give you.
PRESENTER: This is how it will open for you.
NOP JIARATHANAKUL: The tutorial project. And they're numbered, so you can just double-click on them to open. And they're kind of baby steps.
PRESENTER: But you can see the graph that was needed to create each one like that.
NOP JIARATHANAKUL: Yeah, they're usually very simple.
PRESENTER: Yeah, so if you click on the second one, for example, click-to-spin-- you'll see there is an additional model node that is for the little box. And then below is the experience, how it was built. So when you click on the box, the cow does something.
NOP JIARATHANAKUL: So I can take this one.
AUDIENCE: [INAUDIBLE] the external [INAUDIBLE]
NOP JIARATHANAKUL: OK, yeah.
PRESENTER: Can you open that scene? Yeah, yeah. No, no problem.
NOP JIARATHANAKUL: Yup, yup, yup.
PRESENTER: And as I said, you can also open it at home and look. But yeah, we came on board for AU very late, so we could squeeze in this session, but not a training session, which would have required a couple of hours. But we will do webinars. So if you're interested in signing up for webinars, please leave us your email addresses, or just join on the web. And then we will do a more in-depth webinar.
NOP JIARATHANAKUL: So just now, that was a live tweet. What's happening is, every 10 seconds it's polling for the hashtag #AU2016. I changed that just now. So if I change that to just #Autodesk, you know, the text here will change. And then something happens here. So this is actually coming in live now. If you tweet something right now, it will keep coming up.
PRESENTER: You can try. [LAUGHS]
NOP JIARATHANAKUL: Yeah. So this is real.
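The live-tweet node boils down to a simple polling loop. Here is a hedged browser-side sketch of that idea; the search endpoint and response shape are placeholders (Twitter's real API requires authentication), so treat this as the shape of the mechanism rather than a working integration:

```typescript
// Sketch of "poll a hashtag every 10 seconds" -- Play does this with one node.
// The endpoint is a placeholder, not Twitter's real (authenticated) API.
const POLL_INTERVAL_MS = 10000;

async function fetchLatestTweets(hashtag: string): Promise<string[]> {
  const res = await fetch(`https://example.com/search?tag=${encodeURIComponent(hashtag)}`);
  const data = await res.json();
  return data.tweets as string[];              // assumed response shape
}

function startPolling(hashtag: string, onUpdate: (tweets: string[]) => void): number {
  return window.setInterval(async () => {
    try {
      onUpdate(await fetchLatestTweets(hashtag));
    } catch (err) {
      console.warn('Polling failed; will retry on the next tick', err);
    }
  }, POLL_INTERVAL_MS);
}

// startPolling('AU2016', tweets => updateSceneText(tweets)); // updateSceneText is hypothetical
```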
PRESENTER: And do you mind opening the scene with the formula car, so that they can just see the hookup with the website?
NOP JIARATHANAKUL: The Twitter? The Twitter is just one node, actually.
PRESENTER: No, not the Twitter. The Fusion formula car. The scene with the web link.
NOP JIARATHANAKUL: The scene with the-- you mean this one?
PRESENTER: Yeah.
NOP JIARATHANAKUL: Oh, OK. So this is a different connection. So what we saw just now was Play talking to third-party APIs.
PRESENTER: Yeah, this one, for example, as well. Yeah.
NOP JIARATHANAKUL: So this is talking to third-party APIs-- maps, weather, Twitter. Then there is another way to communicate with Play. Let's say, imagine this is your own e-commerce website. You don't have to build everything in Play. You could build these UI elements in Play.
But Play is not the best-- so, what I like to tell people is, use Play for what it's best for, which is interactive 3D experiences. And then use the web to build UI elements-- buttons, tables-- because the web is good for that. So instead, use your web page to build your shopping cart, your menus. And then you can send commands from the web into the Play frame.
So this is what's happening. The Play part, this frame, just has the car. It doesn't come with any of the buttons that you just saw above it.
Imagine this is your shopping cart, and these are the buttons that you build in your own menu. They talk into the scene through a different message API, which there is a tutorial about here.
But with that connection, the possibilities are limitless, right? You can seamlessly integrate the Play experience, beautiful 3D interactive content, with the business logic of your own website, whatever that may be.
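The message API Nop mentions is documented in the Play tutorial; as a generic sketch of the pattern only, a host page usually talks to an embedded frame through the standard window.postMessage mechanism. The frame's element id, the origins, and the message fields below are assumptions, not Play's real schema.

```typescript
// Generic postMessage sketch (both sides shown together for illustration).
// Element id, origins, and message shape are hypothetical; Play's actual
// message API is described in its own tutorial.

// --- Host page (your e-commerce site) ---
const FRAME_ORIGIN = 'https://play.example.com';      // where the 3D frame is served from
const frame = document.getElementById('play-frame') as HTMLIFrameElement;

// Called by your own UI, e.g. a paint-color button in your shopping cart.
function setCarColor(color: string): void {
  frame.contentWindow?.postMessage({ command: 'setColor', value: color }, FRAME_ORIGIN);
}

// --- Inside the frame (the 3D scene) ---
const HOST_ORIGIN = 'https://shop.example.com';       // the page allowed to send commands
window.addEventListener('message', (event: MessageEvent) => {
  if (event.origin !== HOST_ORIGIN) return;           // ignore untrusted senders
  if (event.data?.command === 'setColor') {
    console.log('Would recolor the chassis to', event.data.value); // hook into the scene here
  }
});
```

The origin checks are the usual safeguard with this pattern: the host only sends to the frame's origin, and the frame only accepts commands from the host it trusts.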
PRESENTER: If you open the thing-- can you show just some of the nodes? No, if you open it in Play, sorry.
NOP JIARATHANAKUL: Oh, I don't have this one ready.
PRESENTER: Oh, OK. Sorry, that's fine. OK. Yeah, you can open it and see how it was tied together. But that's one of the most exciting things. I spoke with so many marketing departments of your companies, and they were like, yeah, it's nice to have a [INAUDIBLE] and show something. But we want to tap into our marketing data and show this in connection with-- so, there is a lot of information on the web that is useful when you tell a story.
Oh, time, time, time. It seems like we're totally out of time. Thank you so much for coming. Sign up if you're interested. [INAUDIBLE]
[APPLAUSE]