Description
Key Learnings
- Discover how to handle Gear VR to visualize and experience 3D models, 360-degree panorama pictures, virtual mock-ups, and so on
- Discover the flow of converting BIM models or panoramic photographs into a suitable data format for Android
- Learn how to handle camera equipment to take panoramic pictures, and discover software to stitch and publish them in EXE format with added information
- Learn how to export Revit files in FBX format, import them into 3ds Max, and render with appropriate materials and settings for virtual space
Speakers
- Jonghoon Kim, PhD, Director of BIM Department, Samsung C&T
JONGHOON KIM: Thank you for attending this class. My name is Jonghoon Kim from Samsung C&T.
JUHEE RHO: And hello. I am Juhee Rho from Samsung C&T. Thank you for coming to our class, and I hope you guys enjoy our class.
JONGHOON KIM: So in this class we're going to present various applications of virtual reality, augmented reality, and virtual tours in the field of construction. And we're also going to show you how to generate that content with different methods.
So this class will cover different methods to generate VR content with BIM models. While we go through the process of generating VR content with a BIM model, we'll show you how to utilize BIM software like Autodesk Revit and 3ds Max for visualization and rendering work. And also, we'll show you how to use game engines like Unity Pro or Stingray.
And we're also going to show you how to generate virtual tour content using special software. So at the end of the class, we expect that you'll know how to generate VR content with BIM models and virtual tour content, and how to utilize that content with VR glasses. Before we start, we would like to briefly introduce ourselves and our company.
JUHEE RHO: So, hello again. My name is Juhee Rho from Samsung C&T. I am working as a BIM engineer at Samsung. Before joining the Samsung group, I was working as a digital project specialist on an R&D project developing Korean traditional houses. And I also worked as an architect for several years at companies like KPF New York and Beyond Space Seoul. For my educational background, I went to the University of Pennsylvania for my master's degree, and I went to Seoul National University for my bachelor's degree in architecture.
JONGHOON KIM: And I am Jonghoon Kim, working as head of the BIM department at Samsung C&T. Before joining Samsung in 2014, I worked as a senior BIM manager at DPR Construction. Besides my industry experience, I have some experience in academia. I got my PhD from Stanford University, and while I was at Stanford I participated in a couple of research projects on virtual design and construction.
So I'd like to talk briefly about Samsung C&T. We provide general contracting services in three main business areas. We have building projects, like high-rise buildings, general buildings, and hospital projects. We also have civil projects like roads, bridges, and railways, and we have energy and power plant projects. The total number of employees in our company is about 7,000, and our total revenue this year is very close to $14 billion.
Geographically, our headquarters is in Seoul, Korea, but we operate globally. We have 19 overseas offices, including two regional headquarters offices, one in Singapore and the other in the United Arab Emirates.
These are some more of the projects that we've built around the world, including the Burj Khalifa in Dubai, the tallest tower in the world, and Dongdaemun Design Plaza in Seoul, Korea. This building was designed by Zaha Hadid and has quite complex geometry with irregular shapes. So we used BIM for the design and fabrication of the exterior skin system. And there are other buildings -- Taipei 101 in Taiwan, and also the Petronas Towers in Malaysia.
JUHEE RHO: So Mr. Kim and I both work in the BIM department at Samsung C&T, so we would like to introduce our department a little bit. Samsung C&T has three separate divisions: one is building, one is civil, and one is the power and energy division. Each division has its own BIM department. For civil, we have nine BIM engineers; for plant, we have six BIM engineers; and for the building division, which we belong to, we have 40 BIM engineers.
And let's take a look at our BIM projects along this timeline. We started applying BIM processes to our projects in 2009, starting with Dongdaemun Design Plaza, as he just said. As you can see from this slide, we gradually increased the number of BIM-applied projects. So in 2013, we had 31 BIM-applied projects, including [INAUDIBLE] in Singapore and the [INAUDIBLE] training center in Korea. And up to today, we have 2,164 full BIM-process-applied projects.
JONGHOON KIM: So this is how we use BIM in our projects. Like contractors in the United States, we utilize BIM for multi-trade coordination, for scheduling and quantity takeoff, and for providing relevant information to increase quality and productivity in the field. Also, we've been extensively utilizing technologies like laser scanning or [INAUDIBLE] to improve the accuracy and quality of our work.
So we just introduced who we are and what we do. Now it's time to switch gears and we're going to dive into VR and AR and virtual tour.
JUHEE RHO: So today we are here for AR and VR, so let's get started with the basic idea. What is augmented reality and what is virtual reality? First, AR -- augmented reality. It literally means that something augments reality. So when you're looking at a certain object through an AR device -- it can be your smartphone or tablet PC -- it will be shown as a different shape, or sometimes it will be shown with digital information added.
So there is an example from the movie Terminator. When the Terminator saw John Connor for the first time, he could identify John with the digital information in his vision. This is a kind of AR. But on the other hand, with virtual reality, with your immersive or non-immersive device, everything you are looking at or feeling happens only in the virtual world. So what you are looking at is digitally made information.
And there is one more concept we would like to mention: the virtual tour. It's similar terminology to virtual reality in that you experience the virtual world with a VR device, but it differs in the data you are looking at. For virtual reality, you are looking at digital information. But for a virtual tour, what you are looking at is a set of photographs or recorded video.
And there are a couple of AR or VR examples in the field. So for games, [INAUDIBLE] is a VR game for a VR viewer. We can see many, many games using VR technologies. And sometimes we can see AR or VR technology in commercial fields or even in educational fields.
JONGHOON KIM: So, by definition it is very clear to distinguish VR, which is a computer-generated simulation, from a virtual tour, which is not a computer-generated simulation. It's rather a simulation of an actual location, usually composed of a sequence of still images. But in the field -- in games, commercials, and education -- you often see those two kinds of content used together.
So we introduced that VR, AR, and virtual tour content is used in other fields, but how about in the construction field? Why not in the construction field? In Samsung C&T, we use VR, AR, and virtual tours for three main purposes. First, from a facility management perspective, we have been using them for maintenance control and facility management. For our clients, we've been testing them to support their decision making. And for ourselves, we have been using them to review constructability.
Technically, there are two ways to experience VR and AR content. One is using just a desktop or tablet PC in a non-immersive way, just looking at the computer display. And the other way is to use a head-mounted display like Gear VR or Oculus Rift, which is an immersive environment.
So this example is a virtual tour for facility management. As you see here, this facility manager is looking around his facility on his desktop computer. Technically, this content was generated just by taking photos of a completed facility, stitching the individual photos into a 360 panorama picture, and then linking information -- like the key map here, and also the [INAUDIBLE] manual or product data -- to the panorama picture. Then you get this virtual tour. This virtual tour actually allows the facility manager to look around the facility, check as-built conditions, and obtain the information that he needs.
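For readers who want to picture how such a tour hangs together, here is a minimal Python sketch of one way to organize a tour node: one stitched panorama per location, plus the linked key map position and documents. The field names, file names, and values below are made up for illustration; they are not the actual software's data format.

```python
# Hypothetical structure for one virtual-tour location: a stitched 360 panorama
# plus the information linked to it (key map position, manuals, product data).
tour = {
    "mech_room_01": {
        "panorama": "mech_room_01_360.jpg",
        "key_map":  {"floor": "B1", "x": 0.42, "y": 0.77},   # marker on the key map
        "links": [
            {"label": "AHU-3 O&M manual",        "file": "ahu3_manual.pdf"},
            {"label": "Pump P-12 product data",  "file": "p12_datasheet.pdf"},
        ],
        "neighbors": ["mech_room_02"],   # locations the viewer can jump to next
    },
}

node = tour["mech_room_01"]
print(node["panorama"], "-", len(node["links"]), "linked documents")
```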
This is an example of augmented reality. As you see here, augmenting the BIM model of the MEP systems on this tablet PC helps this facility manager understand what is installed behind the ceiling tile. In this next example, our health care user group, like doctors and nurses, is reviewing an operating room thoroughly with VR.
The reason that we did this is to let our health care user groups check the design of the operating room, the layout of the medical equipment, appliances, and tools, and the location of the cabinets beforehand. And if they see any issues that go against their daily work procedures, we ask them to let us know beforehand. By doing that, we can reflect those issues in the design and fix them prior to actual construction.
And also, we've been using AR and VR for constructability review on the job site. So from these applications of AR, VR, and virtual tours in Samsung C&T, we have proved that we can increase our productivity and reduce the number of RFIs and change orders.
So that is how we utilize AR, VR, and virtual tours in Samsung C&T. Now we would like to go through the demos to show you how to generate that content. We want to go through all the details, but given the time limitations we have, for the last part of this presentation we're going to focus on how to generate VR content with BIM models and also how to generate a virtual tour.
JUHEE RHO: So let's talk about VR types. There are a couple of VR application stages, and we categorize them into three: one is fish eye, one is joystick, and one is full body. We define them by which VR device and which VR data are used for each type.
So for fish eye, we can observe the virtual reality with our VR glasses. For joystick, we can navigate the virtual reality with the VR glasses and a controller -- it can be a joystick or the touch pad in the case of [INAUDIBLE] VR. And for full body, we can actually experience the virtual reality fully: with a motion sensor, your physical movement is reflected in the virtual reality. So that's full body. For today, we are going to cover these two concepts. As I already told you, these two VR types differ in which device and which VR data are used.
So for fish eye, to observe virtual reality, we are going to use a JPEG as our VR data. To get this output from our pictures or from our 3D model -- in our case we are going to use a BIM model -- we will use software like Panoweaver or 3ds Max.
And for the joystick type, to navigate virtual reality, we are going to make VR applications -- an APK for Android OS and an EXE for Oculus Rift. To get these applications from our BIM model, we are going to use game engines like Unity Pro and Stingray.
So let's move on to fish eye, to observe VR. Sorry. For our instructional demos, we are going to use our completed construction projects. For this chapter, we are going to generate virtual reality from a 360-degree rendering of the BIM model. Sorry.
So this is our just-completed office building, the Kyobo Insurance HQ office in Korea. This was a remodeling project. In our BIM process, the mechanical and electrical room is very important, since from this room all the [INAUDIBLE] systems [INAUDIBLE] are being controlled and [INAUDIBLE]. So for better communication between general contractors, and for all the decision making, we built a virtual mock-up of this mechanical and electrical room.
OK. So when the [INAUDIBLE] process is done, you've got to have this clean data. In our case it is the BIM [INAUDIBLE] model, and it was the starting point. When you open this [INAUDIBLE] file, we will export it in FBX format. If you click export there, there are a couple of options, but we are going to use FBX since we want to maintain the materials we used in [INAUDIBLE]. So after we export this file -- oh, sorry -- we will move to 3ds Max to render this file.
So open 3ds Max, go to the import button, and select the FBX file you just made. Before moving to rendering, we need to put in some lights and a camera. We used as many lights as possible, since this is a kind of interior rendering. Then we located the camera as a target camera. And then we [INAUDIBLE] to rendering.
So go to the rendering tab, and there are a couple of options again. But we are going to use the panorama exporter. We are using this exporter since we want the output to be a panoramic image. One more thing: we usually also save this file in a movie format, since some of our clients want to play a movie file with the QuickTime player.
When all the rendering is done, you get this pop-up window. This window shows what the output JPEG will look like. When the rendering is done, go to file and export. When you go to export, there are a couple of options again: one for a cylinder type, one for a spherical shape, and one for VR. But we are going to use export to sphere, because with the spherical shape we can experience it as a three-dimensional space in the virtual way.
OK. We can save this JPEG file. So put in the name of the file and set the directory. And then we are ready to upload this file to [INAUDIBLE] VR.
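If you are curious what the VR viewer actually does with that spherical JPEG, here is a minimal Python sketch, under the assumption that the "export to sphere" output is a standard equirectangular panorama. It maps a viewing direction to a pixel in the image, which is essentially what the viewer computes for every screen pixel as you turn your head. The function name, angle conventions, and image size are illustrative assumptions, not part of the 3ds Max exporter.

```python
# Minimal sketch: map a look direction to a pixel in an equirectangular panorama.
# yaw: 0..360 degrees around the vertical axis; pitch: -90 (down) .. +90 (up).
def direction_to_pixel(yaw_deg, pitch_deg, width, height):
    u = (yaw_deg % 360.0) / 360.0       # horizontal fraction of the panorama
    v = (90.0 - pitch_deg) / 180.0      # vertical fraction, top of image = +90 pitch
    return int(u * (width - 1)), int(v * (height - 1))

# Example: looking 90 degrees to the right and slightly upward
# in an assumed 4096 x 2048 equirectangular rendering.
print(direction_to_pixel(90.0, 10.0, 4096, 2048))
```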
JONGHOON KIM: So when you get the 360 panorama image from the BIM model, technically you can save the file on your smartphone and gear the smartphone with these VR glasses. Then you can actually look around the 360 image in an immersive way with this device. So this is how it actually looks.
Technically, this is a computer generated model. So that means you can change the colors or textures if you would like to. And also-- well, one thing that I want to say here is this is just a look-around. You can look around when you move your head with this device, but with this option you cannot go through-- I mean, you cannot walk in the model. This is just at one point.
And the reason that you see two views here is basically to match each view to each eye. By doing that, it can generate a stereoscopic view with this device.
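As a rough illustration of that stereoscopic idea, the sketch below computes two camera origins separated by an interpupillary distance, one per eye; each eye's view is then rendered (or sampled from the panorama) from its own origin. The 0.064 m distance and the axis convention (z up, yaw from the +x axis) are assumptions for illustration only.

```python
import math

IPD = 0.064  # assumed interpupillary distance in meters

def eye_positions(head_pos, yaw_deg):
    """Offset each eye half the IPD sideways, perpendicular to the view yaw.
    Convention (assumed): z is up, forward for yaw=0 is +x, right is -y."""
    x, y, z = head_pos
    half = IPD / 2.0
    rad = math.radians(yaw_deg)
    rx, ry = math.sin(rad), -math.cos(rad)   # unit "right" direction in the plane
    left  = (x - rx * half, y - ry * half, z)
    right = (x + rx * half, y + ry * half, z)
    return left, right

# Two views, one per eye, as in the split image shown on the slide.
print(eye_positions((0.0, 0.0, 1.6), 90.0))
```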
JUHEE RHO: OK. Then we will move to joystick, to navigate virtual reality. In this chapter, we are going to generate VR applications: one for Android OS using Unity Pro, and one for Oculus Rift using Stingray. Together with [INAUDIBLE], a company for VR content, and Autodesk Korea, we could develop these VR applications. They helped us with the technical [INAUDIBLE].
In order to understand the difference between these two devices and two game engines, we are going to test with the same project. In this chapter, we will use our other completed project, the [INAUDIBLE] orthopedics clinic. In a hospital building, like the mechanical and electrical room and other rooms, the operating room has its own importance.
So for better communication with our contractors and our clients, and to support better decision making, we built a virtual mock-up of this operating room for this building. Before making [INAUDIBLE] the building as an application, we set the area and scope where we were going to focus. So we chose two operating rooms, one in between the hallway and one [INAUDIBLE].
So from here, again, after [INAUDIBLE] the BIM process, we've got to have this clean Revit file. We just deleted all the other portions that we are not going to use. Then we saved this file in FBX format again, to use the same materials we used in [INAUDIBLE]. And in our case, we used [INAUDIBLE] of [INAUDIBLE] to separate this file out into three different files.
We can import the file into 3ds Max again. Since we are making the VR application for Gear VR, and the Gear VR device uses the storage and computing power of the smartphone, it has a limitation in terms of maximum polygon count. So we needed to delete the surfaces we are not going to see. The maximum polygon count for Gear VR is around 50,000 to 70,000. So we reduced the number of surfaces and optimized this 3D model in 3ds Max. In 3ds Max, you see this editing toolbar at the right side.
If you click this rainbow icon, you get these options. Go to this right triangle, then you can select the surfaces. So select all the surfaces we are not going to see, and just delete them.
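Before re-exporting, it can help to sanity-check the model against that polygon budget. Below is a rough Python sketch of such a check. The 70,000-triangle ceiling comes from the range mentioned in the talk; the mesh names and triangle counts are made-up example data, not the actual project model.

```python
# Rough polygon-budget check before targeting Gear VR (example data only).
GEAR_VR_BUDGET = 70_000   # upper end of the 50k-70k range mentioned in the talk

meshes = [
    {"name": "walls",             "triangles": 18_000},
    {"name": "ceiling_equipment", "triangles": 35_000},
    {"name": "operating_table",   "triangles": 30_000},
]

total = sum(m["triangles"] for m in meshes)
print(f"total triangles: {total} (budget {GEAR_VR_BUDGET})")

if total > GEAR_VR_BUDGET:
    # List the heaviest meshes first as candidates to simplify or delete.
    for m in sorted(meshes, key=lambda m: m["triangles"], reverse=True):
        print(f"  consider simplifying or deleting '{m['name']}' ({m['triangles']} tris)")
```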
So after optimizing this file, just save it in FBX format again. Then we can import this file into the game engine. From here we are looking at the game engine, Unity Pro. Even though there are a couple of game engines, we chose Unity Pro for several reasons.
One reason is that we really want to navigate our BIM data. In the game world, you can move freely as you want, so we borrowed that concept for our BIM model. The second reason is that we are going to use Gear VR, and therefore the smartphone's Android OS, and Unity Pro [INAUDIBLE] the Android OS platform that Gear VR uses. So we used Unity Pro.
After importing this file, we needed to adjust the scale factor. In architectural software like Revit or 3ds Max, we are using a meter-based unit system, but the game engines -- game environments -- have a different unit system, so we had to go through these steps.
So in Unity Pro, you will see this. Go to the resources and click the file you just imported, and you will see this window. In our case, our scale factor was 0.15. Even when we used the [INAUDIBLE] for [INAUDIBLE], sometimes when we opened the file in Unity Pro we lost our materials. In that case, we need to set the shader again.
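To make the unit-conversion idea behind that scale factor concrete, here is a minimal Python sketch. The unit table and the example conversions are standard values; the 0.15 figure used on this project depends on how the file was exported, so treat the sketch only as an illustration of the general step, not as the source of that number.

```python
# Minimal sketch of the unit/scale correction step when moving a model
# between tools that assume different length units.
UNITS_IN_METERS = {
    "millimeters": 0.001,
    "centimeters": 0.01,
    "meters":      1.0,
    "inches":      0.0254,
    "feet":        0.3048,
}

def scale_factor(source_unit, target_unit):
    """Multiply imported coordinates by this to convert source -> target units."""
    return UNITS_IN_METERS[source_unit] / UNITS_IN_METERS[target_unit]

print(scale_factor("centimeters", "meters"))  # 0.01
print(scale_factor("feet", "meters"))         # ~0.3048
```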
So again here, go to assets, select the resources and the file name, and you get the materials. Then with these materials, you can assign the textures or normal map, whatever you want, with the proper scale.
Then we draw bounding boxes. In the game engine, the camera cannot understand the physical character of walls, floors, or ceilings. That means your camera can pass through the wall or [INAUDIBLE] on the floor. So we need to go through these bounding box steps. We drew bounding boxes around all the walls, floors, and ceilings. In this way, your camera can stay only in the enclosed space.
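The sketch below illustrates, in plain Python, why those boxes keep the camera inside the enclosed space: before moving the first-person camera, the new position is tested against the boxes drawn around the walls, and the move is cancelled if it would land inside one. The box format, coordinates, and function names are purely illustrative; Unity's own collision system handles this internally.

```python
# Illustrative axis-aligned box test for camera movement (example data only).
def inside_box(point, box):
    (x, y, z), (mn, mx) = point, box
    return all(mn[i] <= (x, y, z)[i] <= mx[i] for i in range(3))

# One thin box wrapped around a wall, from corner (0, 4.9, 0) to (6, 5.1, 3).
wall_boxes = [((0.0, 4.9, 0.0), (6.0, 5.1, 3.0))]

def try_move(pos, new_pos):
    if any(inside_box(new_pos, b) for b in wall_boxes):
        return pos          # blocked: the camera stays where it is
    return new_pos          # free space: the camera moves

print(try_move((3.0, 4.0, 1.6), (3.0, 5.0, 1.6)))  # blocked by the wall box
```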
And then we are ready to put in our camera. Since we want to look at our virtual space just like we look at the real world, we used a first-person controller camera for this project. So just put in the first-person controller. For the next step, if you remember the first couple of steps of this chapter, we had deleted all the other portions that we were not going to use.
So actually, this model has these weird ending surfaces. To compensate for this weirdness, we just put in a background -- that image -- with billboard components. And if you don't want to use a JPEG, you can just put shadows or darkness.
Then we are ready to build this VR application. So go to build, and you will see this settings box. You can put in your company name or choose the icon for your VR application, and there are a couple of platforms you can use: iOS, PC, BlackBerry, or Flash. But in our case -- again, we're going to use Gear VR -- we choose Android OS and build. And then we are ready to install the VR application we just made.
JONGHOON KIM: So, once you get the APK file, now it's time to navigate the space virtually. Once you load the APK file onto the smartphone and gear the smartphone with this Gear VR -- and this is how it actually looks -- the side of this device is actually going to work as a controller.
So once you touch this, you can actually walk in the space. And this is how it looks with this device. You see some delays, but that's because of the limitation of the recording device. Actually, it is very smooth, without any delays.
And, like Juhee just mentioned, we're using just a small smartphone, [INAUDIBLE] computing power of the smartphone. So we had to keep the model very simple by eliminating all the complex content -- geometry content. As you see here in this room, we deleted complex geometry. And even in the hallway, we deleted a lot of pipes, ductwork, and cable trays to make the file very simple.
JONGHOON KIM: OK. Let's move on to the second option to generate the VR application: for Oculus Rift, using Stingray. Stingray is a brand-new game engine from Autodesk. Same here: we used the same 3ds Max file. The difference is that we didn't delete the other surfaces, even though we are not going to use them.
So, with 3ds Max, especially the 2015 release of 3ds Max, there is an interoperability tab for Stingray. Once you find this tab, just click the [INAUDIBLE], and then you will find the same [INAUDIBLE] in Stingray. There is also an option for live camera tracking. Once you click this, every change made in 3ds Max will be updated in Stingray immediately.
So this is the video. The changes we make in 3ds Max are updated: make the change, then go to update, and in Stingray it is updated. It is the same for material changes. When you don't like a material in 3ds Max, you can change these floors to other materials with some textures. So it is changed with the textures, go to update, and in Stingray it is updated. So this is the sweet part of the Stingray game engine.
For the next step, in Stingray we need to put in some lights. It's very simple: go to create and click a light. Then you can locate the lights wherever you want.
OK. So just like the bounding boxes in Unity Pro, in Stingray we also need to give physical character to the walls, ceilings, and floors. In Stingray, we use physics actors. It's a very simple way to assign them: just select the objects you want to give physical character to, then go to create and click physics actors.
And then we are ready to locate the camera. Go to create and select the camera, then you can locate it. In Stingray, you can also write a script for the movement of this camera. Go to the Story Editor, and you will see this script you can write. In this script, we can set the starting point of the camera movement and the jumps between scenes of the camera movement.
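To illustrate what that camera script expresses, here is a plain-Python sketch of the idea: a starting viewpoint plus named scenes the camera can jump between. Stingray itself uses its own story/flow scripting rather than Python, and the viewpoint names and coordinates below are invented for illustration.

```python
# Plain-Python illustration of a start point plus jumps between named viewpoints.
viewpoints = {
    "start":          (0.0, 0.0, 1.6),
    "operating_room": (8.0, 3.0, 1.6),
    "hallway":        (4.0, -6.0, 1.6),
}

camera_pos = viewpoints["start"]   # starting point of the camera movement

def jump_to(scene):
    """Jump the camera to a named scene, as the story script does."""
    global camera_pos
    camera_pos = viewpoints[scene]
    print(f"camera jumped to {scene} at {camera_pos}")

jump_to("operating_room")
jump_to("hallway")
```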
So here, again, we are ready to save this as a VR application. The funny thing is that in architectural software like [INAUDIBLE] or 3ds Max, when you're saving the outputs you [INAUDIBLE] like save or export, but in the game engines the terminology is different: in Unity Pro we use the terminology of build, and here we use the terminology of deploy. So let's deploy this VR application. Go to deploy, and here we also need to do some settings.
So add the property editor and deployer. And again here we need to select the platform for the VR application. For this we are going to use Oculus Rift, which is the Windows platform, so click package project for Windows.
JONGHOON KIM: So yeah, when you get the EXE file done, you can now navigate the space virtually. This time you navigate the space with a computer and the Oculus Rift, and the keyboard of the computer is going to be the controller in this case. As you can see here, compared to the previous version using Gear VR, which just uses a small smartphone, in this case the computer has more storage and more computing power.
So that means the computer can handle richer content, like more lights and more detail. Even in the hallway, you'll see a lot of pipes and ductwork are still there. But it was very smooth, and most of the material textures look better in this case.
So, yeah. We talked about how to generate VR content from a BIM model. Now I'd like to talk about the virtual tour. Again, a virtual tour is not a computer-generated simulation; it is actually a simulation from actual photos. But the funny thing is that if you can generate a 360 panorama view from actual photos, you can also use it with this device and look around the space in an immersive way.
So compared to the computer-generated VR, this is a very simple and easy way to generate the content. We applied a virtual tour to this project. This is an office building, the Beanpole Gangnam office building. And we selected a similar space, the mechanical and electrical room down in the basement.
So to generate the virtual tour content, the first thing that we did was set up the points to take pictures. We selected these six locations. When we chose these locations, we paid attention and planned them very carefully so that every corner of the whole area could be captured.
Then, once you select the locations, for each location you need to take pictures -- about 10 pictures in all directions. By doing that, when you stitch those photos you can get the full 360 image. You also have to use some special equipment to take the pictures: here we used a fisheye lens and a tripod with a panoramic head.
So once you take the 10 pictures for each location, you can import those 10 individual pictures into stitching software. There are a couple of software packages available out there; we used Panoweaver to stitch the photos. You have to import the source images -- 10 pictures; like here, these are the 10 pictures for one location -- and then you can stitch them with the stitch button. When you click the stitch button, the software automatically recognizes all the [INAUDIBLE] points and stitches them into one 360 panorama JPEG picture.
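For those who prefer a scriptable route, the same stitching idea can be sketched with OpenCV instead of the Panoweaver GUI. This is a minimal sketch, assuming opencv-python is installed and the example file names below exist; note that OpenCV's default stitcher produces a wide panorama, and getting a clean full spherical 360 x 180 image from fisheye shots still generally needs dedicated panorama software.

```python
# Minimal panorama-stitching sketch with OpenCV (file names are illustrative).
import cv2

files = [f"location1_{i:02d}.jpg" for i in range(10)]   # the 10 shots for one spot
images = [cv2.imread(f) for f in files]

stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("location1_panorama.jpg", panorama)
else:
    print("stitching failed, status:", status)
```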
And we repeated the same process for all six locations in this case, and we're done. You can upload the file to a smartphone and gear it with the Gear VR glasses, and you can actually see the space in an immersive way. So compared to the VR content, I'd say this is not a BIM model; this is not a computer-generated simulation. This is actual photography. It is more real, and you can capture [INAUDIBLE] conditions more accurately with this technology.
But the thing is, with this option, unfortunately you cannot walk; you cannot navigate the space. You can use this content only at a point. The other difference from the BIM model is that when you have a BIM model, you can generate the content before actual construction, so you can use it to review constructability or communicate with clients or subcontractors. But in this case, it is possible only after you finish construction.
So I think we covered all we have, and here I'd like to summarize what we just talked about. We introduced how Samsung C&T uses AR, VR, and virtual tours in the field of construction. We introduced two different types of devices to experience [INAUDIBLE] content: one is a non-immersive way, where you just use a desktop or tablet PC, and the other way is to use Gear VR, Oculus Rift, or other VR glasses to experience the content in an immersive way. And we also introduced two options to generate the VR content. One is fish eye -- basically, looking around a virtual space at a location. We showed you how to generate that content with a BIM model, and I also just showed you how to use actual photos to generate the 360 panorama picture.
And the second option is joystick. That means you can experience the space with a controller and actually navigate the space. We showed you how to generate that content with a BIM model using Unity Pro and Stingray. And just before, I showed you how to generate the virtual tour content.
So this is our conclusion. From various applications, we have seen that we can increase the satisfaction level of our clients; especially the property managers and facility managers were very satisfied. But we understand that these types of technologies are at a beginning phase, and there is still a lot of room for further development in the near future.
So we're hoping that we can get more streamlined and more efficient ways to generate VR content in the near future, like generating VR content directly from software like Autodesk Revit, instead of going through Revit, 3ds Max, Unity Pro, or Stingray.
And I'd like to share a couple of lessons that we learned. For those of you who would like to adopt this type of technology, there is a lot of AR, VR, and virtual tour hardware and software available. But before you choose which software to try, we would like to say that you need to think about the specific purpose of AR, VR, and virtual tour. You might want to use it from a facility management perspective, for decision making, for constructability, or for other purposes. Once you have a specific purpose, it's much easier to select software, because each piece of software has its pros and cons.
And also, you may need to go through a lot of trial and error. By doing that, you will get to know the most optimized, satisfactory, and best practice to generate this content.
So that's what we have, and, again, thank you very much for listening to our presentation. We have about 15 minutes left -- we left 15 minutes intentionally to give you a chance to actually experience the virtual tour that we showed in this presentation. We have two Gear VRs and one Oculus Rift device over here. So if you are interested in seeing how they actually look, feel free to come up. And [INAUDIBLE], if you have any questions, we'll open it up for questions.
[APPLAUSE]