AU Class

AR/VR Enhances Workflows, Fuels Collaboration, and Improves Decision Making

Description

Our panel of experts in the fields of construction, entertainment, and technology will share their learnings and successes from previous applications of extended reality (XR, spanning AR and VR), and they will illustrate how these inform future plans to build XR capabilities in the construction and entertainment industries. With a mix of customers and Autodesk experts, we'll focus on how extended reality can fuel collaboration and drive innovative results.

Key Learnings

  • Learn about the mission of the XR-focused teams at Autodesk.
  • Get excited about the future capabilities that XR will enable.
  • Better understand how to engage with the ecosystem for successful XR applications.
  • Discover how XR can support designers by identifying the impact of design decisions through a better understanding of the final results.

Speakers

  • Nicolas Fonta
    Based in Montreal, Canada, Nic Fonta has 25 years’ experience. Prior to Autodesk, Nic worked for CAE, a flight simulation company; Electronic Arts in game development; and at Presagis in content creation. Within Autodesk, he’s validated real-time and XR adoption in the AEC market with Stingray and Revit Live, and was head of product management and strategy for 3ds Max. Then, as the XR Core Team lead, Nic was responsible for the XR strategy across the industries Autodesk serves. Recently, Nic became the GM for a new XR incubation project for AEC, which is the next chapter for Nic and the XR Core Team. Following the acquisition of The Wild, these 2 teams are joining forces to accelerate the Autodesk XR journey.
  • Brian Melton
    Brian Melton is a Chief Technologist at Black & Veatch, where he helps the company's Water business embrace digital transformation and recognize its impact on project delivery. Brian has been with Black & Veatch for 20 years. During this time he has been part of some of the largest infrastructure projects around the globe, including mining, hydropower, and water and wastewater treatment, conveyance, and storage, frequently working with teams in North and South America, the UK, India, and Asia. He has an extensive background in Building Information Modeling with respect to infrastructure projects. Brian supports and leads efforts that enable the best quality and team experiences for the delivery of projects today, and he continues to drive innovation and promote positive disruption to enhance the delivery of projects for tomorrow.
  • Hilmar Koch
    Hilmar Koch leads the research for the Future of Media and Entertainment at Autodesk Research. With partners, the team explores speculative scenarios and proofs of concept that might shape the Media and Entertainment industry. Hilmar previously led the practitioner's group in Strategic Foresight at Autodesk. Prior to Autodesk, Hilmar spent most of his career as a creative technologist at Blue Sky Studios, Industrial Light & Magic, and Lucasfilm. He has led innovative technical and creative teams and, as director of the Advanced Development Group, collaborated with partners to redefine the storytelling experience. He has held roles as Director of Virtual Production, Head of Computer Graphics, and Digital Effects Supervisor. Hilmar's filmography includes "Avatar," "Star Trek," "Transformers," "Star Wars III," "Star Wars VII," and "Harry Potter and the Sorcerer's Stone." He was one of a team of three that developed the Academy Award-winning ambient occlusion technology used on "Pearl Harbor" in 2000. Early in his career at Blue Sky Studios, he collaborated on the Academy Award-winning short "Bunny," in addition to several other films and commercials. Hilmar studied at Columbia College Chicago and the Technical University of Munich, where he earned undergraduate degrees in Arts and Mathematics, respectively.
  • Guy Messick
    As an Architect, Technologist & Thought Leader, Guy focuses on understanding the connections between design and emerging technologies. Trained in the arts, design, and fine woodworking, Guy finds architecture to be the profession that best unites these disciplines. Since adopting digital technologies in his design practice in 1989, Guy's passion has been capturing relevant information from the edge environment and making it available for AEC processes.
  • Qian Zhou
    Qian Zhou is a Senior Research Scientist at Autodesk Research in Toronto, Canada. Her research interests span spatial perception, novel 3D interactions, and interfaces. Before joining Autodesk, she received her PhD from the University of British Columbia, investigating perceptual factors and 3D interaction in AR/VR with award-winning publications.
  • Viveka Devadas
    Viveka Devadas is a Design Technologist and works as a Technical Specialist for AEC, XR, and AI products at Autodesk. She adeptly resolves customer issues by building connections with students, architects, designers, Autodesk's global team, and the international Revit community. With her educational and professional foundation in Digital Architectural Design and Construction Management, she has had an enriching journey in the AEC industry. Her career path has seen her wear multiple hats, including EU exchange Scholar, Design Architect, BIM Manager, and Visualization Leader. Viveka describes this journey as both creatively stimulating and enjoyable on a daily basis. It has allowed her to explore her passions, notably in the VR/AR space, AI initiatives, and the spatial computing sector, addressing eco-friendly design challenges while redefining our interactions with people, information, and immersive experiences.
Transcript

BEN FISCHLER: So we're going to get started very shortly. There's folks still sitting down. Thanks for joining us.

We're going to be talking about XR, which includes VR and AR, and how it enhances workflows, fuels collaboration, and, ideally, improves decision making, as well. My name is Ben Fischler. I'm a senior industry strategy manager at Autodesk. We've got several distinguished guests with us, as well as some of my colleagues from Autodesk.

Let's get started. Don't make investment decisions based on anything we say here. We'll just leave it at that.

[LAUGHTER]

OK, our guests today-- and just give a wave when I give you a shout-out. We've got Dana Warren from Yulio, Guy Messick with IA, Adam Chernick, and Brian Melton. And from Autodesk, myself, Nick Fonta. Viveka is not with us today?

PRESENTER: Not yet.

BEN FISCHLER: Not yet. She might be a late entry. Qian and Hilmar. OK, let's get right into it.

So I think probably most of us are somewhat familiar with XR and the umbrella that it casts to hold all of these categories. So we consider XR to be inclusive of AR, VR, and MR. For those of you who are new to these things, AR is generally the layer of virtual elements that are superimposed over the real-world experience on different devices. And then VR is the full-immersion environment.

And so we're going to be talking a little bit about all of these. And XR is the umbrella that we use to hold all of these together. I'm going to sit so I don't stand here the whole time. And this is going to be conversational. We're going to get into how folks are using these technologies, what they think the value is, and what they expect for the future.

So this is, I think, an interesting question because for a lot of us who have been around these technologies for a while, they've become normalized. But I think we all have a point where there was that first exposure. For me, personally, it was with HoloLens V1.

I wrote a grant to get one of the early beta devices, and I helped build a-- it was an automatic transmission training application for the HoloLens 1. And that was the point, for me, where I realized, this is going to change how whole industries are going to function going forward.

And I'm curious to hear what folks in the panel-- when they had that moment, that first realization, first exposure. So, Adam, why don't you jump in. What was your first exposure?

ADAM CHERNICK: Absolutely. Man, my first exposure. Well, it was about seven years ago. I was working at a large architecture firm at the time, and we had a creative agency come in that was building some virtual and augmented reality tools.

They showed a few of the applications that they were working on-- pitching our large architecture firm to work with us. And I had seen-- they really had the first compelling, mobile augmented reality that I had ever seen.

And it really shook me. It blew my mind. And it was that day that I really took a step back, thought about the impact that these immersive mediums could have on our industry, and dove in later that night.

BEN FISCHLER: Do you remember what device and--

ADAM CHERNICK: It was, I believe, an Android tablet that was running a mobile augmented reality application.

BEN FISCHLER: What about you-- your first step in the journey, first exposure?

GUY MESSICK: First exposure would be about 30 years ago at a ski resort in Colorado where they had this VR experience thing. And it's a platform you ran around on, and it was super low polygon count. Right after that, I was asked to look at some VR apps for a medical client back in the day and realized, if you had a giant Silicon Graphics tower-- water-cooled tower-- and a lot of time and money, you could do it.

So I said, OK, that's cool. I'll shelve that. And then seven years ago-- a little over seven years ago-- we wanted to start looking at it. And I can't remember the exact moment that I saw something, but we talked to Angel Say and Russell Varriale who started InsiteVR-- came into a studio in San Francisco and said, we want to do VR for architects.

And I started looking at what they were doing at that time. And it was advanced. It worked on standalone which, for us, honestly, is the only way we can go. Clients do not want to see a cable going from their head to a computer. And from then on, it's been-- we funded it and kept up with it full-time since.

BEN FISCHLER: Can you touch on the cable point you just made, just a little-- I think we're going on a tangent here, but I'm just curious. Has that been-- is that a recent phenomenon, or was there a period of transition where people have just-- there's an aversion to the cabled experience?

GUY MESSICK: From the get-go, we invested in the VIVE early on and some of the earliest Rift stuff. And then we had the Gear VR, which is a phone in a case, as some of you may know. Much different experience-- people would much rather put that on at lower resolution and less comfort.

I literally saw clients physically back away from me when I held up a VIVE. Like, oh, no. I don't think so. And I asked them why, and they said, well, I don't want to have a cable and so forth. So we started from the beginning, saying, OK, we're focusing on standalone. And that's where we've been since.

BEN FISCHLER: Dana, what about you in terms of your-- where did things kick off for you?

DANA WARREN: Yeah, so for me, honestly, it started back when I was in university. I took architecture. And for anybody who took any sort of program like that, I think we can all admit to hearing at one point or another, "I'm just not seeing it" from our professors. And that drove me absolutely crazy.

And, at the time, I had no idea what virtual reality was. So fast-forward a couple of years. When I heard about it was actually when I started with Yulio. And, immediately, I saw a solution for the problem that I had in university.

And if any of you out there are architecture profs, I know you can admit to saying that at one point or another. And when I saw VR and what it could do, I said, we're never going to hear that again, whether it's in a classroom environment or whether it's with your clients. Because, now, you can literally put somebody inside of the headset and be in that virtual environment.

So that's really where it started. And I've been with Yulio for the past seven years. And we are, for anybody that doesn't know, a virtual reality software solution for businesses. Yeah.

BEN FISCHLER: And, Brian, what about you for your first step on the journey?

BRIAN MELTON: Sure. Well, you know what I always tell people-- you want to do VR, just go buy a VR headset. Now you're doing it, right? And that's kind of where it started for us.

We happened to get the hardware for our team pretty easy. We had an internal growth accelerator team that funded disruptive technology, and thoughts, and other things. So we had hardware laying there-- the intimidating ones with the wires, and the boat anchor that was tied to it.

But actually thinking back here-- the first experience we had, because we had this hardware, and we were like, well, what do we do with it now, right? We knew it was cool. People were aware of it. But how do we make it something that's practical?

And I was thinking back just now. Nick, you might have actually helped us with this. We were doing a project out in California-- the Calaveras Dam. It was a hydro project out there.

BEN FISCHLER: That's a big one.

BRIAN MELTON: Yeah. So we did some drone scanning of that, so we had a mesh model that looked pretty impressive. And we ended up-- I think we worked with Autodesk to get that put in a really early version of, maybe, Revit Live or something at the time. But, for me, it was really impactful because it was cool to stand around and move in there.

But somebody added a hot air balloon up in there. So we got up in the hot air balloon, and we could look over to that massive dam project. And you had this sense of feeling.

And you didn't know what you were going to do with it at the time, but you're like, this is going to mean something for us in the way that we work and the way that we introduce people to projects and designs. So, for me, that experience made it happen. But if you want to do VR, just go buy the hardware.

BEN FISCHLER: Just to riff on that for a second, do you feel like large-scale projects like that benefit more, or is it really across the board? Or is there a project scale where you're like, we're definitely going to be including these experiences because of the scale?

BRIAN MELTON: I mean, it's a tool in your tool belt, right? You need to make a decision on when you need to use it. But, for me, the scale of the project doesn't matter. It's, what are you trying to convey? And is this a better way to tell that story than some other tool that you've got?

BEN FISCHLER: Right. Well, I want to shift gears for a moment and include some media and entertainment background. Hilmar, take us-- this is the wayback machine. What was your first step on the XR journey?

HILMAR KOCH: Yeah. That was when I was working at Lucasfilm-- also a customer of Autodesk at the time. And it actually started in AR, not in VR. We did build an immersive cave, which was stereo. So people were immersed in that space-- could actually perceive the virtual worlds that they wanted to build.

And the ability for them to actually get a sense of scale when making decisions based on depth and height was that much more sticky. It was really obvious immediately that directors, even really high-grade directors who have trouble articulating an exact number-- once you give them the ability to sense the world in 3D and spatially, those decisions really get that much better.

So it was-- VR wasn't just a follow up to the AR experience, oddly enough. But that program is still in existence at Lucasfilm, and it continues to bear fruit.

BEN FISCHLER: Viveka, thank you for joining us. What about you for your first experience with XR?

VIVEKA DEVADAS: I think some of my experience goes back to my childhood memories playing with a View-Master.

BEN FISCHLER: Awesome.

VIVEKA DEVADAS: --and I was always curious about viewing an image through a lens. And I was so curious that I got on top of a table and hit myself against a wall. And the physician asked me, why did you do that? And then I said, I wanted to see the other side.

BEN FISCHLER: That's awesome.

VIVEKA DEVADAS: So, yeah, that's how curious I've been with images and still images. And I really wanted to see these images moving. So I started building houses made out of sustainable material. Growing up in tropical weather, I tested with bamboo.

Slowly, I started getting so interested in this world of 3D, and why can't we view things from a different viewpoint? So I got myself into architecture school, pursued a master's in construction management, and then started experimenting with 3D software, like 3D Studio Max and Maya, and slowly tested VR apps.

But one definite turning point for me was-- I came to Boston about nine years ago and was amazed by MIT and Harvard, so I got myself to attend XR hackathons every year. And it was really cool to see.

I was almost turbocharged-- my interest turbocharged by all the ideas spinning around in hackathons, where you were put into groups of people you've never met before. And you could come up with solutions in just two to three days. So it unleashed the potential of XR-- so many different industries could come up with so many different ideas. And this journey is ongoing. It doesn't stop-- it's ongoing. And I think we are all in it together.

BEN FISCHLER: Qian, how about you?

QIAN ZHOU: Sure. Thank you. So I started in XR during my PhD, building a spherical 3D display. That's why we call it a crystal ball. In particular, I studied the fundamental spatial perception around it.

And then, I guess, one of the interesting characteristics that we found about that crystal ball is that, unlike the more common wearable headsets, those stationary 3D displays really have minimal motion sickness for people to try. So that makes it very optimal for designers to work with.

But the downside is that it's very hard to render large models at real scale-- so things like cars and buildings. So later I switched to the immersive side with headsets, with a focus on novel interactions and interfaces, which is my current role here at Autodesk Research. So we're really curious to understand, what are the different ways that users can express their intention with different input modalities and different data sets?

BEN FISCHLER: Nick, you want to close us out with this question? What was your first moment on the journey with XR? It's been a journey for you now for a while.

NICOLAS FONTA: Yeah, a year or two. So I think it started a little over 25 years ago when I was just out of engineering school. My first job, I was working for a flight simulator company in Montreal. And as a newbie, I had the crappy night shifts because simulators were running 24 hours.

And I ended up overnight on a flight sim for a MiG-29 for the Singaporean army. And it was actually some sort of an AR experience. So they had a cut cockpit of an actual MiG-29 in which you were sitting.

And you had a massive headset with tons of wires-- 25, 26 years ago. And it was allowing you to see the actual cockpit but cutting the edge where the windows were and actually rendering the outside, including the wings and everything. So once I had learned how to basically fly the sim, I spent a lot of hours flying the MiG-29 overnight.

BEN FISCHLER: Yeah. I was actually reminded by Qian's comment about motion sickness. And I think, actually, the first exposure I had was at SIGGRAPH. It wasn't an Evans & Sutherland machine. I think it was an SGI Onyx.

And I felt a bit queasy. Yeah. It was a tougher thing to pull off back then without having people feel a little ill. So this is a big one. How has XR changed your work?

And I think one way to caveat this is apply some time scale to it. So if you want to talk a little bit about near-term-- last five years or so, if you look back, how have things changed? Or if you want to go a little bit further. Adam, do you want to kick this one off again?

ADAM CHERNICK: Yeah. Well, this is a big question. At previous firms, when I was diving into immersive technology and leading some research and development-- before I was at SHoP I was at HOK-- we were building some communication applications. I find that a lot of what immersive tools do is help with communication.

They're helping teams better understand space. They're helping people jump into the same space and talk about specific features that are difficult to understand in two dimensions. And they're doing it in a relatively intuitive way.

So the first piece of work that I was able to participate on that was changing how people were working through projects and changing process was a mobile virtual reality application that we were developing while I was at HOK. It was allowing remote teams across countries to all jump into the same application and do design reviews together. So it was working through process.

And then moving forward to today, at SHoP, we're using a lot of similar ideas, using The Wild and lots of multi-user, immersive mediums, again, to decrease the amount of time that it takes us to make decisions and to make better decisions. One specific application that we've been developing is a mobile augmented reality app to drop buildings in at one-to-one scale using iPads, largely because they're easy to deploy to, and they're a medium that everybody knows how to use very easily.

And we can go to a specific site-- go with our clients, 10 of them, bring five iPads, and look at what the building would look like in the future. We can give them options that they can toggle through and actually, again, sort of time travel into the future six years forward when it will be built.

We can understand, in six years, what those future shadow patterns are going to look like on the contextual buildings next to our building and understand the impact that these different design options will have all while you're standing in front of it, walking around it. So this has decreased the amount of time that it takes us to make decisions-- certain decisions in certain specific situations from two months down to two weeks, which is impactful.

BEN FISCHLER: And would you caption that as client review is one of the big--

ADAM CHERNICK: Yeah, client review is definitely one of the big ones. Yeah.

BEN FISCHLER: Brian, what about you? You want to jump in?

BRIAN MELTON: Sure. Well, I changed my-- I don't think we're at the destination, yet. I think we're still on the journey. So how is XR changing our work, I think, is what I would lean into.

And, for us, I think it's communication, for sure. It's a storytelling ability you get with VR. But the engagement-- everybody's busy. Emails going off all the time-- chat, IM. You have very little time with people to engage with them and really get some communication across.

And I think VR, for specific instances, is the best way to engage people because we see a lot more focus when they're in VR. What we say internally-- be here, now. They're in that meeting. They're having a conversation. They're not looking at their email. They're not looking at their phone, or their chat log, or their text. So, for us, it is better engagement and better communication when we really need that feedback from the clients and the project teams.

From a usage perspective, you know, we're using VR a lot in design reviews for internal teams to get to, I'll say, more informal communication. But then we use it a lot for client workshops where we're getting operations staff and the client team to talk about safety and access with some of the projects that we do. So review and collaboration is big for us, and the engagement with those teams-- since we have very little amount of time with them sometimes-- is key.

BEN FISCHLER: So you mentioned that things are changing and continuing to change quickly. What are areas right now that you feel are in rapid spin and motion?

BRIAN MELTON: Well, this is one technology that it's tough to keep up with, I'll tell you that, from a hardware and a software perspective.

BEN FISCHLER: Yeah, big time.

BRIAN MELTON: But I think what we're starting to see is more of a multi-use device where we were just thinking, oh, design review. This is going to be great. And, now, we're thinking, well, this is going to be a better way to keep our culture while our teams are working remote as a company.

And HR is looking at it. Safety's looking at it-- training aspects. So I think remote presence and remote connectivity-- it's, eventually, just going to be another tool that's sitting on your desk like a mouse and a keyboard. And you're going to use it for certain meetings.

It might be design review. It might be a team meeting. It might be a workshop. But I think that's where the future is going a little bit.

BEN FISCHLER: This is a question for the group, I think. Do you find that there's the client experience, but then there's the process of extracting the client's feelings from the experience? They have the experience, but then they have to communicate to you what they got out of it. And I still find that that's something that is a bit of a challenge, right?

I'm from a film and TV background and games. It's the same problem you have where, when the director points at the screen and says, warm it up and make it bluer, right? They don't always-- it's difficult to express what they want from the experience.

And I'm curious if anyone's working around, how do you provide tools to clients in a way that allows them to communicate what they're feeling and what their desires are from that experience? I don't know if that makes sense, but it's like, you're there. You provide them with this rich experience, but then they have to have the tools to then express I need more of this, less of that. So maybe that's things like annotation, markup, and so on.

ADAM CHERNICK: Yeah, yeah. Really, really quickly, I think that it's very important to set the stage and to curate the experience. That's really important with these new mediums-- making sure that you are talking about what questions we're trying to answer that day before we jump into an immersive medium because there are a lot of people that can get distracted by "The wood grain on the wall doesn't look correct" when that's really not what we're here to talk about today. And so, yeah, setting the stage is pretty important.

NICOLAS FONTA: I'm curious. Do we feel that being in an immersive space makes that communication at least easier than if we were not in such an immersive space, for you and your stakeholders?

GUY MESSICK: I think it's more profound than that. And so we were talking about changed, changing. We talked about that. But one thing that hasn't changed since we first got into it is the profound nature of how clients and designers experience space in VR. It is profound.

And we found an article from 1997 in the Harvard Business Review. J. Rayport was one of the authors. You can look it up. They didn't know what XR was then, but they talked about empathic design and how you can more quickly ideate and innovate if you're experiencing a space.

And we saw that. Our second major client, again, seven years ago for a tower in San Francisco-- their global head of real estate was there and wasn't known as being a real touchy-feely, warm person, necessarily, and put on the headset, walked onto the main dining floor, which is the entire floor plate-- about 19,000 square feet-- and immediately said, this is too tight. Our people won't like it.

Now, I'm in the back waving my arms going, there are schedule and cost implications for what you just said. He said, no. We got to redo it. Now, this had been rendered beautifully. Head count had been worked out with strategies, plans approved-- no, clients can't read floor plans. They won't ever tell you they can't.

And right then, my jaw dropped. And I went, OK. That is a big deal. And we've seen that. So that continues, now. There's better hardware. There's better software. There's better articulation of design.

But that is a big one. And so if it's happening with the clients, it's also happening with designers. So we think that's one of the major ones we're seeing. That changed right away, and that is still there.

HILMAR KOCH: Can I throw something out here as a follow-up? Is it also changing-- I think you're speaking a little bit about client, and your client, and your company, but it's also about coordinating between the different groups that you're serving. Are we seeing change actively where it is a communication tool between the different crafts that are stakeholders on a plan?

VIVEKA DEVADAS: I think--

BEN FISCHLER: Go ahead.

VIVEKA DEVADAS: I think to add to Hilmar's point, we can also break down the model into separate disciplines because architects, engineers-- sometimes we don't speak the same language. So with applications like Navisworks, I think it's nice for us to separate out the different elements. If an architect wants to see if there's an HVAC clash in the structural system, they can speak the same language, and it helps them to get on the same page, no matter what stage of the design process they're in.

BRIAN MELTON: I was just going to add-- I think it was mentioned a little bit, but we've seen it. It is easy to get distracted in VR. And I think we need to be careful not to add a communication tool that gives us less feedback than we had before.

Model detail comes into play. If the wall outlets aren't in the right spot, that's not really what we wanted feedback on. Is the wall even located where you would like to have it? So I think it's tough to not get distracted.

I think I'd like to see some tools-- and I think Autodesk can help out here a lot where-- we talked a little bit about this yesterday-- of what's the purpose of being in VR? A lot of tools don't let me share what that purpose is. When we're in this meeting, there was an agenda. You knew what was going to be talked about. It's a structured conversation with a facilitator.

When we get into VR with eight people in [INAUDIBLE]-- boom. They just all go to different rooms. They start talking. They start looking around. Nobody knows where anybody's at. You lose the meeting.

So having the ability to curate that purpose that's there-- I think those tools need to evolve to where it's more of a structured conversation. And I think it's going to get there. The more formal that VR becomes in organizations, they're going to realize, hey, we do need to put more formal things on the way we interact with these tools and people.

BEN FISCHLER: You need the AutoCAD herding functionality. Dana, what kind of change have you seen in your work?

DANA WARREN: Yeah, so even just expanding off of that-- that's something that we have really been focusing on, the collaboration aspect of it, and understanding that things can get away from you pretty easily when you have that many people in one space and everybody's all over the place. So we have a collaboration tool, and you can guide somebody through the presentation.

You can focus their headset to a specific location or aspect of the design, and you always know where they are. You can give them the freedom to roam around, but you can also keep them in one place in the design, and I do think that's incredibly important.

My perspective is a little bit different, obviously, because I'm on the side of bringing the AR and VR tools to you. But I think one of the things that is changing as we speak is where VR and AR is being used. And a lot of the time, it was being seen as a special occasion tool that you would only use at final pitches or final presentations. But that has really been changing a lot over the past couple of years.

And I think the one thing that has helped that is how easy it is to use the tools that are out there today. They go with your existing workflows. You don't have to learn a whole new software or a whole new technology. It exists with the tools that you already use.

So we are seeing it being used from early project ideation through design development all the way through to final spec. So I think that is continuing to grow, and that's going to be the use moving forward. And like Brian said, it's going to be something that you use every day. It's no longer going to be seen as a novelty.

BEN FISCHLER: Let's see. All right, here goes crystal ball time. Next 5 to 10 years-- so go wild. What do you see happening in the next 5 to 10 years? Guy, you want to kick it off?

GUY MESSICK: I would stick with 5.

BEN FISCHLER: Yeah. Pick your time scale.

GUY MESSICK: 10 is outside my crystal ball range. Being more ubiquitous across the spectrum, sure. It's going to be more accessible. Every week, every month we're going to see more and more of it.

It's going to be how you interact with design, and construction, and coordination globally. There's no doubt to me that's where it's going. And I think the level of-- one thing to keep in mind is wherever you are in this world of AEC or manufacturing and media and entertainment-- because we see a blend now of media stuff. We have people working in Maya building virtual environments.

But, for us, I think one thing that may be interesting and you may see is we have now two businesses, if you will. And we do interiors. We're on a whole different level of time-- 6 months, not 6 years.

And we do see the outlets in the wood grain. It's absolutely what people are looking at for interiors. We do tens of millions of square feet a year. That's important at that level, so keeping that going-- that accessibility across the range.

But we have a group that builds environments and designs for virtual space only. It's not going to be built in the real world. And we're seeing more and more of that.

So when you talk about getting your groups together and keeping that culture going, we are seeing a flood of work for clients globally. Right now, how do we build something that has our brand, our feel, what's important to us as the culture of our company? And then, where are we going to meet? Because we're going to be remote part of the time.

I think there's another question coming, and that's another thing to bring up: what is the reality of how people work and live now? It is different than 2 and 1/2 years ago, fundamentally. I think this platform is going to have a lot to do with that in the next 5 to 10 years. How do we work and connect?

BEN FISCHLER: Qian, I want to hear a little bit from research on what you're going to predict for 5 to 10 years. Or what would you like to see? We can--

QIAN ZHOU: I think-- for sure. I just want to add a little bit to the point that you mentioned about working remote. I feel like in 5 to 10 years there's a lot of potential for XR to help us be better connected. And I feel like this connectedness applies in two ways-- one is, probably, people connection, helping people work from home as hybrid meetings become the new norm, where we need technology to better support those needs and help with better collaboration and communication.

But I also feel like this connectedness can be data connection, so that we have a better way to visualize and interact with the data-- with the larger data bandwidth that we humans can handle in different modalities. But it will also probably be the data connection to the greater ecosystem, so that XR doesn't just connect to the cloud, but also to desktop, tablet, and mobile.

I think some of them are already starting to appear-- for example, communication with the tablet to share the view, to handle the previous question on the client's needs or the client's point of view. But maybe there will be more of this data connection to the greater ecosystem in the future.

BEN FISCHLER: Nick, do you want to jump on this one?

NICOLAS FONTA: I can, a little bit, yes. So, yes, to everything that was said. Of course, devices will be cheaper, more comfortable, easier to use, longer-lasting-- all of this will happen, for sure. So there's a lot of different things happening on multiple fronts pushing in the same direction.

But I think the two things are going to be probably quite different than what we've seen so far. First is a better understanding, on our part, in terms of what the human interactions and interaction patterns will have to be so that you could really be fully comfortable for long periods of time doing many different tasks and jobs in an XR environment. So that's one.

And the second is, I'm betting that we'll see a lot more fluidity between full VR and full AR both in terms of devices, but in terms of use cases that we'll want to use XR for. And it will be a mix both of the jobs that we're going to be doing XR and also a mix of the experiences that we're going to be having collaboratively.

So you'll have people in different mediums, at different places on the spectrum of XR, all interacting together. So I think that's going to really change how this can be done.

BEN FISCHLER: Just to add on to this question, I'm curious for the group, are there any devices, device form factors that you're particularly excited about? And we don't have to get into specific manufacturers, but it feels to me like we're about to-- I don't know if we're on gen 3 or gen 4 of some of the headsets.

But it feels like there's a bunch of interesting hybrid pass-through designs and things like that. I'm curious, what are folks most excited to see hardware-wise and devices and what we might be able to do with those? Anyone want to--

GUY MESSICK: I think the windmill I tilt against is augmented reality. And I don't focus so much on tech. I focus more on people, I'll be honest, these days. How are people using stuff?

But what we've wanted for years was to walk into a cold shell, put on a device that's easy and light, see the design, walk around and interact with it, and see the real world so you're not bumping into columns. But that's the thing we're going to get to.

We have a-- part of our firm does giant art installations for building repositioning. We need to see that in AR. That's huge. But that stickiness in the tech isn't quite there yet.

Is that a hardware and software problem? Yeah. But that's where we want it to go. And I can't tell you which company is going to do it. But as soon as that happens, we will invest.

BEN FISCHLER: Yeah. Anyone else?

DANA WARREN: Yeah, I can add on that, for sure. We're definitely not there yet in terms of the hardware that's out now and even, honestly, coming soon. The wearable technology-- it needs to get to a point where, like Nick said, people can spend extended amounts of time in there.

And it's a really challenging issue with nausea and motion sickness. And it comes down-- it has a neuroscience component to it. It's something called vergence-accommodation conflict. So it's the disparity between what your eyes are seeing right here in front of you versus what you're actually looking at.

So it's something that we're nowhere close to with the headsets that we have today. And over the next couple of years, I'm not sure if we'll get there. But there are solutions and steps that we are taking now to combat some of these issues, like stereoscopic imagery, for example. So that's what provides the depth in the images. But we need advancements in the hardware in order for this to be something that is accessible to everybody.
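
[Note: One rough way to quantify the mismatch Dana describes, assuming a typical interpupillary distance of $\mathrm{IPD} \approx 63\,\mathrm{mm}$ and a headset whose optics sit at a fixed focal distance $d_f$: the eyes converge on a virtual object rendered at distance $d_v$ with a vergence angle of $\theta = 2\arctan\!\left(\frac{\mathrm{IPD}}{2 d_v}\right)$, while accommodation stays locked at $d_f$, so the conflict is roughly $\lvert 1/d_v - 1/d_f \rvert$ in diopters. For example, with $d_f = 1.5\,\mathrm{m}$ and an object at $d_v = 0.5\,\mathrm{m}$, that is about $1.3$ diopters. These figures are illustrative, not measurements of any particular headset.]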

BEN FISCHLER: Anyone else?

HILMAR KOCH: I'll throw in bandwidth and 5G into the mix because I think there is this opportunity for people to get-- have a desire to get the world augmented with new things and data visualization of the things that they want to see. There's so much that we could bring to the device out in the field.

But hauling around our little computers that do the work seems to be impractical. And the data is also massive. So I'm hoping that 5G is going to have a hand to play in that, to bring data and bring even compute to the edge so we can have richer experiences out there. That's my hope.

ADAM CHERNICK: Yeah, and to follow up on that, I absolutely agree. We're seeing some really interesting technologies, such as Pixel Streaming, which allow us to use the computational power of servers in the cloud rather than the computational power of these mobile devices. That lets us run much richer and more computationally intensive experiences on these lighter-weight devices, which is really important.
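
[Note: The general pattern Adam describes-- render on a remote GPU and stream frames to a thin client-- commonly rides on WebRTC. The sketch below is illustrative only; it is not Epic's Pixel Streaming API, and the signaling endpoint is a hypothetical placeholder.]

```typescript
// Minimal receive-only WebRTC client: video is rendered server-side and
// streamed down; user input goes back up on a data channel.
async function connectToRemoteRenderer(
  signalingUrl: string,          // hypothetical SDP-exchange endpoint
  video: HTMLVideoElement
): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection();

  // Create the input channel before the offer so it is part of negotiation.
  const input = pc.createDataChannel("input");

  // We only receive video; the headset or tablet does no 3D rendering.
  pc.addTransceiver("video", { direction: "recvonly" });

  // Attach the server-rendered stream to the page when it arrives.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Exchange SDP with the render server (ICE candidate handling omitted).
  const response = await fetch(signalingUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(pc.localDescription),
  });
  const answer: RTCSessionDescriptionInit = await response.json();
  await pc.setRemoteDescription(answer);

  // Example: forward taps or controller poses to the server once connected.
  input.onopen = () => input.send(JSON.stringify({ type: "hello" }));
  return pc;
}
```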

And so to follow that same trajectory, what I think is really interesting is looking at some of the acquisitions that some of these large, real-time companies are making to help us see where all this is going. Some of the acquisitions by Epic Games are really, really interesting. Their acquisition of Sketchfab, Quixel-- Megascans is another one.

What we're going to see is that in the past of real-time technology, what we've always done, typically, is if we wanted to-- we're rendering a building, and you'll see the front lawn in front of the building, and there's some rocks there. To create that rock, we always had a three-dimensional artist who looked at pictures of rocks, and maybe went outside and grabbed a rock, and then tried to three-dimensionally model something that was really close to what they're seeing.

But, now, there's a fundamental shift happening, which is, we're actually going and scanning the real world, and then we're bringing the real world into these virtual environments, which is going to make the realism and the richness of these virtual environments indistinguishable, in a lot of circumstances, from the real world.

And this fundamental shift is definitely, I think, where we're going and is going to, again, bring adoption because this is something that people want to see. They don't want to go in and see a cartoony environment that doesn't look very good. They want to see something that looks incredibly realistic, and I think we're going to see just that.

HILMAR KOCH: Does the audience have any skepticism to express about technologists speaking about what's going to be possible in the next 5 to 10 years? Happy to have that debate, too, because, yeah, some just call this the third VR winter. And what do we need to crack to make you successful is one question that's out there.

BEN FISCHLER: We will have time for Q&A, or do we want to jump in to questions right now?

GUY MESSICK: There's a microphone right there.

BEN FISCHLER: Yeah, I'm down for questions right now. Do you want to come on up on the mic, or--

PRESENTER: No pressure. Come stand in the spotlight.

AUDIENCE: Yeah, really. It's a two-parter. One is, the one thing that I've noticed in our industry is by the time we get invested in these workflows, I think it's too late. I'm curious, do you guys have an R&D department to go through these workflows when you guys are first starting-- tinkering, trying to figure out what's going to work or not?

GUY MESSICK: For us, yes. But that's one of the groups I run called Design Intelligence. But it's not a big, formal group of 20 people doing just research. We implement for use right away.

But I've got a pretty simple world. The way I look at things, ROI is use. So if people are using it and effectively working better, that's great. If they're still using it to drive revenue, that's wonderful, too.

But we test. We implement. We work on pilot groups, and we have to have a very rapid iteration. But, yeah, we test first and move forward.

By the time you buy it, will it be too late? I'm more than happy to talk to you later about that. I don't think it's as tight a thing as it seems.

AUDIENCE: The reason I ask that is because in our industry, we're all production-driven. So when you're doing these workflows, and tinkering, and trying to figure them out, it's too late.

GUY MESSICK: Yeah, it's got to work right away.

AUDIENCE: Yeah, it's got to work now in the project.

VIVEKA DEVADAS: I think it's easy to get carried away with all the hype. Every day, we have a new headset, and every other day, there's some app. So it's very, very easy to get drawn into. And, OK, that organization is developing something.

But I would say, definitely look closely within your own organization. And map those things early on, and identify what is key for your organization, be it your teams, or be it your applications, or be it your requirements. Maybe you're just looking to do a high-end visualization, or maybe you're looking at something higher, like you're maybe designing a stadium, or maybe you're designing something which needs more daylighting or sustainable aspect.

So I think all that boils down to your own organization's value proposition, which you should convene as a team and outline at the very start, and then jump into the technology. So I think that early mapping is very crucial.

AUDIENCE: Go ahead. Go ahead.

BEN FISCHLER: Well, actually, I think you raise a really great point. And I want to throw this out to the entire room. Just a show of hands-- how many folks in the room feel like the pace of change or the cycle of change within XR technology has been too rapid or problematic for you in your business? OK, so-- I mean, it's--

AUDIENCE: It's a good thing. But we're in that-- we're in the process of having to learn all these things in the middle of production work. And it's just not working. The change is happening too fast, which is good but bad for production work.

BEN FISCHLER: Double-edged sword, for sure.

BRIAN MELTON: One of the things I'll say-- I work at Black & Veatch, so it's a large company-- 8,000 to 10,000 people. Doing some research is easy, and it's good to have overhead resources to focus on those things that drive faster. But one of the challenging things is getting up to scale.

And I think what we're seeing is that you used to come to work at a company like Black & Veatch, and it's fantastic technology. You were wowed when you walked in the door. And now we're getting pulled to the future by new hires that are starting, saying, why don't I have VR on my desk? Why doesn't my timesheet work on my phone? They're asking for these things.

So the demand is changing. We can't drive technology fast enough at a large organization. So I certainly feel your pain. Even when you do have a lot of people on overhead trying to figure out how we move faster, we can't keep up.

AUDIENCE: My other question is, personally, I've had trouble with GPS and GIS data being tied to location-specific VR and AR. I'm curious, has anybody-- Autodesk used to have a tool that was embedded in InfraWorks that would project the site based on the GPS location. And, now, the tool is gone. And I was just curious-- I'm trying to figure out that workflow, and it's been a difficult task to figure out. Somebody from Autodesk, please.

[LAUGHTER]

Help me.

HILMAR KOCH: I do not know the answer to your question. I'd be happy to follow up within the company. Honestly, I don't know.

ADAM CHERNICK: We have a few solutions that we use. Happy to chat about that, as well.

VIVEKA DEVADAS: Yeah, I can briefly touch on that. We have ReCap, and then we have hardware to gather point cloud data. And then we can bring that point cloud data into InfraWorks or ReCap, take the contour lines out into Revit, and then you can build your site on top of that. But I'm sure there are additional tools, and there's more to come.

NICOLAS FONTA: I don't know if I want to go there.

BEN FISCHLER: Go for it.

NICOLAS FONTA: I'll jump in front of the bus, I guess. I don't want to propose a solution, but I want to highlight the fact that I think what you're talking about is a real problem. More broadly, as soon as you start going more into the MR and AR spectrum, when you want to go on-site, on the factory floor, or on set in M&E, the positioning, location, and tracking of your environment, especially on big projects, of course, becomes crucial.

There are solutions out there that are semi-working. What I'm really excited about is to think of ways that we're researching to do this as automatically as possible. And I think the key to unleashing the real power of AR on sites and on the factory floor will be when things are automatically geopositioned, locked, and tracked as you move around.

And we're hoping that the solution to that problem-- part of the solution to that problem is a real deep understanding of CAD data and BIM data so that we could derive-- connect to that data, make the right connections, and connect that in the physical world through AI and machine learning. So it's still very early, so I don't have a solution to your problem. But I can tell you that we realize that this is a real problem that we need to crack.

ADAM CHERNICK: And speaking from outside of Autodesk, I'm personally very optimistic. The ecosystem that's developing-- connected data, the open API ecosystem-- where now we're able to connect so many different types of tools around-- Esri is partnering with Autodesk and with these game development real-time engines, such as Epic Games and Unreal Engine. And so the integration between all of these tools is just going to become more and more seamless.

BEN FISCHLER: Well, we sort of jumped into this already, I think-- greatest challenges. It sounds like one of them is just friction. Where do we remove friction from the processes, right? And I think that's partly what was described there. Anyone want to tackle greatest challenges?

BRIAN MELTON: I guess I might jump in and say, for me, I think it's a journey about data management when you look at it. Because what we're talking about is content or data and the way you interact with it. And we're seeing that change the value of spatially aggregated data.

We're trying to answer that question now, especially on the owner side. You hear the buzzwords around digital twins and whatnot. We're doing a lot of cool things with analytical dashboards, computational dashboards, BIM data. All that stuff is starting to look like it's converging and changing the next level that we're going to be able to interact with that information.

I think, for me, it's that data journey, right? You don't get anything unless you manage your data, understand the data, and look at the way you want to interact with that data, whether it's a model, or a map, or a pie chart.

BEN FISCHLER: Do you feel like-- I mean, firms are having to learn how to manage data in ways that they never had to before, right? They're saturated.

BRIAN MELTON: Oh, for sure. I mean, if you look at a survey, you'll easily see that-- I think in one of the last ones I saw, about 40% of the respondents said they're collecting tons of data but not using it effectively. I think they're getting inundated with information and are really struggling to understand, how do I put that information to use?

And that's just to see the current state where some of the systems that we're talking about now are envisioning where it's not just-- you don't just need to see the current state. You want that system to tell you what's going to happen so that you can say, well, what's the likelihood of failure? What's the criticality of failure?

So better decision making is what we're looking for, but it starts with that data. And then, again, is there value in spatially aggregating that? We talked to one customer. A security alarm went off, and they didn't have anything that really indicated exactly what door, so they just went around pulling on doors.

And that's a very simple example of, well, if that data system or that security system was tied to a BIM model, it would clearly indicate the door that you needed to go interact with. That might not be a VR engagement. You know that's just another level of interaction. But that value of that spatial aggregation is certainly a question.
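
[Note: Purely as an illustration of the spatial aggregation Brian describes-- the names, IDs, and data below are hypothetical, not any Black & Veatch or Autodesk system-- the value comes from joining a sensor event to a spatially located BIM element so the alarm points at a specific door.]

```typescript
// A sensor event only becomes actionable once it is mapped to a located
// element in the facility model.
interface BimElement {
  elementId: string;                               // e.g. a door in the model
  category: string;                                // "Door", "Valve", ...
  location: { x: number; y: number; z: number };   // model coordinates (m)
  roomName: string;
}

interface AlarmEvent {
  sensorId: string;
  timestamp: string;
}

// Mapping maintained during commissioning: security sensor -> BIM element.
const sensorToElement = new Map<string, BimElement>([
  ["door-contact-117", {
    elementId: "DR-0245",
    category: "Door",
    location: { x: 41.2, y: 12.7, z: 0.0 },
    roomName: "North Loading Dock",
  }],
]);

function locateAlarm(event: AlarmEvent): string {
  const el = sensorToElement.get(event.sensorId);
  if (!el) {
    return `Alarm ${event.sensorId}: no spatial mapping found`;
  }
  return `Alarm at ${el.category} ${el.elementId} (${el.roomName}), ` +
    `position (${el.location.x}, ${el.location.y}, ${el.location.z})`;
}

console.log(locateAlarm({ sensorId: "door-contact-117", timestamp: "2022-09-28T14:03:00Z" }));
```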

BEN FISCHLER: Greatest challenges?

GUY MESSICK: Contracts.

[LAUGHTER]

BEN FISCHLER: Perennial--

GUY MESSICK: Well, I think that XR has a good shot at helping all of us get in the same room and see a little differently. It's still far too adversarial and siloed up in our perspective. And we're seeing more and more integrated projects.

But we're seeing more clients challenging us to do better, now, with our consulting partners, with our builders, and them. They want to be in the sandbox with us and seeing things. Like I said, our iteration rate is rapid. We go from design to construction very quickly. And it's got to be perfect because you're living with it. It's right there and highly articulated.

We learned about fidelity yesterday at an advisory group. And somebody said, well, yeah, fidelity is data you can trust. And [EXPLOSION SOUND]. There you go. Can you trust what you're seeing in extended reality and the data behind it? Because some groups care about the BIM data, about that thing you can select in XR. Some people don't. That's not remotely what they're concerned with.

But if it's there, and you can rely on it, maybe we can start looking at different ways of integrating design, clients, and building AECO around it. But that's the challenge. I think I also see some optimism in some solutions with XR tech.

BEN FISCHLER: Trust, for sure. Anyone else on the challenges?

ADAM CHERNICK: Yeah, I think one of the biggest challenges is adoption and the ways that we're trying to address that challenge. How do we bring adoption for new technologies, whether it's XR, immersive mediums, or any new piece of software? It can't just be good. It can't just save your team time.

It has to be so much better than the other way that they were doing something that they are willing to go learn something new. The human brain is trying to find the easy way out of every situation. And so if they know one way of doing it, and they don't need to learn a different way to do it, they're not incentivized to do that.

And so giving them that incentivization, advocating for it, championing it. Even if the tool is good enough that it would sway their decision making and have them learn that new, fundamental tool, there has to be someone there to advocate for it to help them understand that it is that good, that they should try to make that push in that direction. And so that's definitely a big challenge.

QIAN ZHOU: Yeah--

BEN FISCHLER: I want to make sure we-- oh.

QIAN ZHOU: Oh, sorry. I was just going to add a little bit on how to make a tool that can really distinguish itself from the existing solutions. I guess, from the research side, one of the greatest challenges that we see is a gap in understanding users' intention when they design and make. What are they trying to achieve from step to step? What is their ultimate goal of the creation when using the software?

So I think XR here maybe really has the potential for us to better capture and understand users' intention but also to provide them tons of ways to express their intention. From the capture side, I think, well, XR is naturally instrumented with sensors, so we know-- for example, the system knows where the user is looking, what they're pointing at, the context around it. So with all these cues combined, maybe it has a better chance than traditional software to figure out what the user's intention is when they're trying to create.

And then for the expression side, I think users have tons of ways to express their intention in XR. They can do 3D sketches to express a concept. They can record their motion and use that motion to create animations. I think with all those possibilities added together, maybe we have a better way to facilitate that adoption with those capabilities.

BEN FISCHLER: Well, we're getting close on time. I think this-- we've got this question-- most important investment for teams now? And I also want to open it up to the audience, if you've got questions you want to jump in with. Sure.

AUDIENCE: One question I had is with regard to embracing new technologies but also, at the same time, maintaining proprietary information integrity moving forward-- how you guys, whether it's testing new systems or things of that nature, balance that against protecting your IP.

BEN FISCHLER: Who wants to take that?

GUY MESSICK: Yeah. That's a big deal for us. As I said before, I've been an architect a long time. And I think we, in our business, we tend to overdo IP value. Don't say that. Don't send someone a title block. It's like, come on.

But our clients-- start with the client side, again, the people side. We've had to develop and work hard on security, credentialing, and how you work at an enterprise level from the beginning. Same thing with our IT folks. That's why we were waiting on certain aspects.

But I think the protections are a little bit primitive right now in a lot of XR platforms, including ones we're looking at building ourselves. Our model tends to be more of a BIM 360 approach -- parent/child, easy-to-understand permissions and access -- and that's where we're going. But, yeah, don't discount that.

Don't go walking into a client saying, hey, we're all going to do VR next week, and here's some headsets. And they're like, how are people logging into this? From their Gmail? Probably not. But as far as protecting your stuff, I don't worry about it too much initially because as you're iterating a design, next week, it's different.

But, yeah, final builds and products-- like I said, our other business is creating virtual worlds-- those have to be locked down for our clients. They don't want anyone else in there except who they want and when. So it's that whole spectrum, if that makes sense. It's in the agreements. It's in the contract.
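As a side note on the parent/child model Guy mentions, the short Python sketch below shows how folder-style permission inheritance can work: a child falls back to its parent's access list unless it carries an explicit override. It is an illustration under assumed names only, not BIM 360's actual data model or API.

class Node:
    # One folder or model in a hypothetical project tree.
    def __init__(self, name, parent=None, allowed=None):
        self.name = name
        self.parent = parent
        self.allowed = allowed  # None means "inherit from the parent"

    def can_view(self, user):
        # Walk up the tree until an explicit access list is found.
        node = self
        while node is not None:
            if node.allowed is not None:
                return user in node.allowed
            node = node.parent
        return False  # no rule anywhere in the chain: deny by default

if __name__ == "__main__":
    project = Node("project", allowed={"architect", "client"})
    iterations = Node("design-iterations", parent=project)           # inherits
    final = Node("final-build", parent=project, allowed={"client"})  # locked down
    print(iterations.can_view("architect"))  # True, inherited from the project
    print(final.can_view("architect"))       # False, explicit override wins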

AUDIENCE: What's the current status of HoloLens? My firm-- we're a turnkey engineering firm. And we had been looking into field verification during as-built work.

And we know the HoloLens 2 got released, and then Microsoft killed it. It's like, did someone pick it up? What's the status of HoloLens or that type of technology?

GUY MESSICK: Well, I'll tell you what I know, and you guys can jump in. We work with Microsoft, and we have a fair number of HoloLens 2 units. And what we're waiting for is for Mesh to become a real boy, if you will.

So there was a lot of news saying they were killing it, but we're talking to a team they brought in -- there are real humans we're talking to on the Mesh team, mixed reality in particular. So it is real, from my perspective. We're not selling our headsets. But they also haven't approved it for commercial use yet.

So Mesh is something we want to work with. I'm being very cautious in my investment. And I hope it works, because there's nothing else that's going to do that right now. There is no other hardware or software platform that's going to give us that copresence -- working with content not only on site, but in a common room with people in multiple locations. So we're still there with it, cautiously.

AUDIENCE: What is Mesh?

GUY MESSICK: Mesh is an app platform that works in the HoloLens 2 and other devices, supposedly. They didn't want to be gatekeepers, according to the big Microsoft-- I forget what it's called-- the big event they do every year.

BEN FISCHLER: It's kind of like their XR software layer that sits across multiple OSes, I think.

GUY MESSICK: Yeah, they had their CEO say, we aren't going to be gatekeepers. It's not going to be only Teams. So far, it kind of still is. But we'll see where it goes. Mesh is a platform that lets you experience things in a HoloLens while still seeing your environment, and then other people can bring their avatar, or holo-avatar, in and share objects, paperwork, screens, and videos with you wherever you are.

So it's more of a copresence than being in a headset and not seeing the room. You are seeing where you are, and they are interacting with you simultaneously. For us, it's about the content, and the other devices, and other platforms to work with.

AUDIENCE: So it's still being developed. It's just not as advertised right now?

GUY MESSICK: Right. It's not out for commercial use yet. You can download the basic version onto a HoloLens now -- if you buy a HoloLens 2, you can install Mesh. And you can get hold of Microsoft and say, I want to talk to the team -- you've got to work that out a little bit. But they're open to talking about where they're going. We're waiting for the SDK right now.

HILMAR KOCH: Obviously, it would be unwise of us to comment on anything about Microsoft's strategy, so keep up on reading the internet there. But I would like to throw out an invitation for a discussion afterwards about both of these questions. What we're used to in the real world -- different trades working together, making contracts, finance systems working together -- I believe that with the virtualization and digitalization of all of those processes, contracts, and regulations, the forces of the real world will start to take hold of the technology forces as well. So that's a discussion I'd like to take outside afterwards, maybe, if you're interested.

BEN FISCHLER: Question?

AUDIENCE: Yeah. So we touched on how the expressive elements will be captured more in the next generation of headsets, mostly because one of the main drivers is a social platform. So my question really is, how do you think we will or won't capture that data, say, in a client-facing meeting? Because a popular saying is that 90% of communication is nonverbal. So how do you think we could heat map or otherwise capture that data so it's not just lost the second they take off the headset?

BEN FISCHLER: Great question. Who wants to--

DANA WARREN: So with Yulio, we do have a heat map feature. If you send your client your project and they're wearing the headset, you can actually see the exact journey they took through the project. And you can see, essentially, how much time they spent looking at one area of your project versus another -- you can literally track their movement through the space.

So you can analyze it afterward using the heat map data. Or, if you're in a collaborative session with them, you can see what they're seeing in the headset, whether you're on a Teams meeting with them at the same time or not verbally communicating at all. You can do it either way. So those are just a couple of tools that are out there now to do that.
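For context on how a gaze heat map like that can be built, the sketch below shows the general mechanic: per-frame "where was the viewer looking" samples are bucketed into dwell time per region, which a renderer can then color as a heat map. It is a rough Python illustration, not Yulio's actual implementation; the region names and the 30-frames-per-second sample rate are assumptions.

from collections import defaultdict

def accumulate_dwell(samples, frame_dt=1 / 30):
    # samples: iterable of (timestamp, region_id) pairs, one per rendered frame.
    # Returns seconds of attention per region, ready to drive a heat map.
    dwell = defaultdict(float)
    for _timestamp, region in samples:
        dwell[region] += frame_dt
    return dict(dwell)

if __name__ == "__main__":
    # Hypothetical session: the client lingers in the lobby, glances at the stairwell.
    session = [(i / 30, "lobby") for i in range(90)]
    session += [(3 + i / 30, "stairwell") for i in range(15)]
    print(accumulate_dwell(session))  # roughly {'lobby': 3.0, 'stairwell': 0.5}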

BEN FISCHLER: Go for it.

BRIAN MELTON: OK, I'll jump out front. You can correct me if I'm wrong. I was just going to say, I think the expressive piece is a big part of meeting and interacting with people and clients. And we have a standing meeting now where we use Horizon Workrooms. I don't know if anybody has tried it -- that's Meta's workroom app, or whatever.

It's a super interesting experience, if you haven't tried it. But it gives you a little flavor of that interaction back. We were on a meeting a couple-- last week, maybe. And we were sitting next to each other. And it's got spatial audio, and we said, hey, let's high five right here.

And we didn't realize that when we actually smacked our hands, sparks would fly and you would hear a noise. That was the cool thing we learned. But it's that interaction that you get as people, and you're starting to see it recreated in these environments. And I think it's about bringing some of that back -- not just interacting in a neat workroom like that, but doing the same thing while collaborating in a model or in a workshop with a client. It brings that level of personality back to the connection.

BEN FISCHLER: And you get kind of emotional context.

NICOLAS FONTA: Can I riff off of that?

BEN FISCHLER: Yeah, please.

NICOLAS FONTA: So two things. To Brian's point, there are solutions out there, including Workrooms and including our own at Autodesk, which my team started using to meet -- not to discuss our own product, just to meet instead of meeting in Zoom. And there are reasons for that.

The main reason is that we feel the sense of social presence in these experiences, even though our avatars look cartoonish, is more accurate and more comforting than over Zoom. It's easier to talk with 3D spatial audio, for instance. And as the technology keeps evolving, I have no doubt that our ability to communicate and pick up those social cues and body-language cues you were talking about will be much better in these experiences than over Zoom or Teams, or whatever your favorite tool is. Now--

BEN FISCHLER: Unfortunately, I think we are--

NICOLAS FONTA: We're out of time, are we not?

BEN FISCHLER: We're at time. We could talk about this stuff for a while. I think we've got to close it out. Thank you so much, everyone, panelists and guests. Hopefully this was an enriching conversation, and we'll see you out in the show.

[APPLAUSE]
