Description
Key Learnings
- Get an inside view of the Porsche visualization and experience design process
- See how Porsche uses VRED to create highly immersive design reviews
- Discover the new possibilities that VRED offers for experiencing UX/UI concepts in the early virtual-car-design process
- Discover how physical hardware models can be interconnected with early digital UX/UI design and visualization content
Speakers
- Sure Bak
- Pascal Seifert: Pascal Seifert studied design at the Anhalt University of Applied Sciences from 2002 to 2007. He has been working in the automotive-design visualization and virtual-reality domain since 2008 and has developed a wide range of skills across the whole virtual-product-lifecycle process. He has expert knowledge in data handling, file conversion, and data preparation, and he presents the results in design or immersive engineering environments. Currently, he is the Technical Product Manager for Autodesk VRED and a caretaker for automotive customers around the globe, using his design and visualization experience to help during the digital design phase.
[MUSIC PLAYING]
THOMAS GABRIELSEN: So good morning, everybody. Welcome to our session. I'm Thomas Gabrielsen, visualization specialist at the Porsche Design Center in [INAUDIBLE]. And I'm really excited to talk to you about what we call the creation of the next-level immersive design experience at our design center at Porsche.
But I'm not doing this alone. I also want to introduce to you my colleague Sure Bak, who's responsible for the user interface and user experience design, and also Pascal Seifert, the technical product manager of VRED.
So before we cut to the chase, we'll start with a basic insight into our design process so that you can better understand what we're going to talk about in a few minutes. And I want to start with a very basic but very important one, which we call our Porsche design DNA. The reason this is so important for us is that, when we design cars, we always want to ensure that you can tell at first glance, when you look at the cars, that, yeah, this is a Porsche. So we're talking about brand identity and product identity here.
And I just want to explain to you what the key design features at Porsche are. You have three examples on the right side. You see, at the front of the car, the fenders are always higher than the bonnet. You will find this in every car line we have. If you go to the side view, you will always find the typical Porsche flyline, and also the dynamic DLO, the daylight-opening graphic. And if you go to the rear, you will always find those lines coming over the roof, coming together at the end of the car, and the pronounced, strong shoulders. So these are basically rules, or design features, our designers have to be aware of when they design their cars.
And of course, with the Mission E we introduced our vision of electric cars. And we therefore had to electrify the DNA. Just like in nature when we speak about DNA, you also have this kind of evolution process. And we have the same in the car industry, as well. When the environment changes, you have to adapt to that change. So therefore, we electrified the Porsche DNA.
And one of the key features we added there was aerodynamics, because it plays a big role in an electric car in getting the extended range. So it was obvious that we would feature these aerodynamic functions in our exterior design by having air intakes and showing the air going through the exterior. You have that in the rear, as well.
And also typical for our Porsche cars is the so-called light [? bow. ?] Every car now has this, so you can recognize at first glance that this is a Porsche.
But when our designers actually start designing cars, they have to reconsider the interior as well, because we now have many more screens in the interior. A really typical Porsche thing is this driver-oriented view when you are driving your cars: you are surrounded by your instruments. And we feature that in the electric cars with these curved displays.
And also, we have this kind of rising center console in the interior, because you have no gearbox down there, so you can really have it floating. Also very typical are the impact of the whiteness of the interior and the fact that you start your engine on the left side. So this is basically the setup for the interior rules as well, when we speak about the Porsche DNA.
So when our designers start a new car project, they start with ideation. They do it by making sketches, with pen and pencil or with Photoshop or other tools. Sometimes they even sketch in 3D tools as well, like Alias or other tools, which would anyway be the next step in the design process: going from the ideation to the data. And therefore, you need a very good connection between your designer and your modeler, in this case.
And here, you see which software we are using. It's mostly Alias in the design studio, but also Maya, for example, when we are going into poly modeling, which is also quite common for a certain process step at the beginning, to get a volume-based model, because you can then quickly decide proportions or get a feeling for the full volume. But the main tool is Alias right now. And of course, we are also experimenting with generative design. For that, we are using Grasshopper.
Then, when you are at the data step, you have to go physical at some point, because we make physical cars. So somewhere in the design process, you have to be able to experience the design in a physical way, as well. And therefore, we still have clay models.
I know many companies don't use that anymore. We still do, because it's important for us. But of course, we try to make fewer iterations there and use technology to save a lot of money. It's still a really important step in our design process.
And then finally, what our department is doing: visualization. We build up what we call strong VR models for every car line. In our team, everyone is responsible for at least one car line. I do the Panamera, for example. And we do that completely in VRED.
So those are, essentially, the four steps in our design process. And of course, because it's a rapid prototyping process, you are going through all these things all the time. When you look at the whole timeline of building a car, from the first idea to the final production, the design process actually comes right after the concept. And then it goes into series production and so on.
But what I really want to show you with this slide is the visualization we do, because we are doing something special here that I know other car companies are not doing: we cover the whole car process. And I just want to explain why we are doing that, because you can imagine that other companies, from my understanding, build up many, many visualization models depending on the purpose they need.
So [INAUDIBLE], for example -- design has their own, ergonomics has their own, and so on. So you have quite a lot of models, but they are not holistic. They cover only a small part. And after that, they are gone. They are dead.
We are not doing that. We try to be in the process from the start, from the first sketches in Alias, and try to cover the whole development, getting all the data in from the different process steps. And in the end, we get something like a child that has grown up. And that makes our model really powerful and holistic.
So our ambition -- and this is why we cover the whole process here -- is to be as realistic as we can get, because our management is really critical. And we have to give them the best opportunity to make decisions based on our renderings, our movies, or our VR model.
So our claim here is best in class. And we want to ensure that the original idea the designer had in mind, and tried to bring into the sketches and into the data, isn't lost somewhere on the way to the visualization. So we work very closely together with our designers to get exactly what they had in mind into the visualization, too. This is our ambition here.
Also, we want to make sure that our designers can experience the whole car in motion before even one physical model is done. We give them the opportunity to make some cool clips or movies, seeing their car design in an urban environment or a desert or whatever it takes, just to have their design in a context so they get a good impression of what it feels like when it's finally in production. And they can then quickly make changes at the very beginning of the design process.
Also, for color and trim, it's a good opportunity to see the impact a color has on a moving car, because a moving car is very different from still images when you have to decide which color you're going to develop in the process.
And the last thing is-- this is a very huge one-- because of the many displays you now have in cars and interfaces, we want to make sure that our designers can experience their whole design in a holistic way as early as possible in the design process so that really early changes can still be made and we can just make our design and the result even better. So therefore, we have a really huge job here, at this point, to give them this opportunity.
So in the end, when we talk about our VR model, we are able to do several things with it. Because we follow the whole process, we can use it at the end, for example, for press and media production, because after the design process we also add construction data to it. And we are pretty much connected with every department delivering data, so we get all of it in. Then, at the end of the process, we have a virtual car which can be used right away in VRED with other agencies for the production of press and media material and so on.
Also, we can cover the whole light development process by doing light simulations for our [? lighting ?] designers, so they can see right away what their designs will look like, especially at night, and whether they have to change something with their external partners.
So this is really a bunch of things we can do. And what I will do now is pick two of them which are important for this presentation. The first is virtual reality, of course. We are using that a lot. Since the HTC Vive really works out of the box with VRED, we use it especially in the very early stage of design, because our advanced designers can then quickly evaluate platforms, proportions, and different ideas in the VR glasses -- even for projects that will never make it to their first physical model.
The second would be this experience thing I talked about, because it's getting more and more important. In the past, we tried adding still images, and we worked pretty closely together with you guys to always get the latest concepts in there. We also tried displaying videos, like you see here, just to get this experience covered. But there we had some issues, because still images don't work very well for conveying a whole experience.
So HMI, at this point, is very important for us. And I just want to show you a quick overview in the next video.
[VIDEO PLAYBACK]
[DRAMATIC MUSIC]
SURE BAK: Hi, my name is Sure Bak. And I'm from the HMI department. As you've seen from the video, in modern cars, especially at Porsche, we have lots of screens in the interior. So having nice graphics, or a very convincing HMI design, is really important nowadays.
And we are pretty much in charge of every screen and graphic in the car. So it's not only about the graphics on the display, but everything from small iconography to the lettering that goes on the back of the car or inside the car. All those graphics are pretty much our job.
So let me briefly talk about how the process goes. As all designers do, we start from a sketch, of course. After we build up these concepts, we start to think about what kind of UI inventory or UI blocks there can be, where they go, and how they interact with humans. Then we can build up a UI structure out of it.
After we build this rough UI structure, we test it out with prototypes. In this case, we just put a huge iPad in the dashboard and do some prototyping. And after this kind of assessment, we go into production design. So this is a very rough outline of the HMI design process.
But the benefit of having HMI design as one process is that it goes on until the end of car production. Actually, it can even continue after production, because by updating the software, we can keep changing it. So unlike other designs, like exterior or interior design, we are able to change it all the time. That's one great benefit.
When it comes to tools, we use these kinds of tools. But these are not all of them; we use some other tools, as well. For production design, of course, we use sketches and Photoshop, maybe. But for the prototypes, we use Framer and InVision -- maybe some of you have heard of them before. But to be honest with you, for prototyping I use a text editor all the time. I wasn't sure which text editor I should bring up, though, so I didn't show it.
So here's a point we are struggling with, and I want to show some cases. The car you are seeing here is the Mission E Cross Turismo, which was unveiled roughly six or seven [? months ?] ago. And at that time, we really wanted to make it as realistic as possible.
This was just a concept car. But unlike other brands' concept cars -- I don't know which brands -- Porsche tried to make it really realistic, so we tried to come up with a very high-fidelity prototype. And we poured a lot of effort into it. But of course, it takes a lot of time and a lot of money, as well.
But for our UI designers and UX designers, this kind of hi-fi prototype is not always a good thing to do, because they run out of time. They have to try out many screen designs. But if they rely only on this hi-fi prototype, they cannot get through them all. So maybe for them, it's easier to do a very quick, rough, and [INAUDIBLE] prototype.
So I don't know how many of you have heard about InVision. How many of you know this InVision prototype? Yeah. It's very simple and rough -- I wouldn't say rough, but it's a screen-based click-dummy prototype. So if you have some screens, you just deploy them and define where you want the click areas. Just click, and then it goes next, next, next.
So what we wanted to achieve was this type of prototype, so that we can really speed up our process. But here is the problem. As you can see, in a car there are lots of screens. It's not one single screen: you have one for the HUD, the cluster, the center, maybe an extended one, and the middle console. We even have one screen in a door handle. So if I sum everything up, that's a huge number of screens.
So to give a better understanding of this screen [? aesthetic, ?] we have a so-called seating buck. A seating buck is a kind of interior model where you can put actual displays, so that designers can test out how far away the screens are, how big the buttons are -- things like that. But here as well, these actual displays are just connected to one computer. So it's pretty much the same as having a multi-monitor setup on your desktop.
So again, the problem we had was that with conventional tools like Keynote or PowerPoint -- I don't know, some image [? previewer -- ?] we cannot really bring our graphic design up on these displays. That's why we started to build our own tool. So today, I want to talk about the experience engine we developed over the last 10 months.
So this tool enables us to bring these graphics or screen designs up on the actual displays. And it really worked out well, and our designers are really happy. It is a kind of presentation tool.
So like any other presentation tool, on the left side you can see some slides or scenes, and you can build up your story. For example, when UX designers want to show their experience along this screen-based story, they make one slide to show a situation, and then they go to the next slides.
So if we look into the car, you can map each screen to each display. In this case, we have four screens, which map to the cluster, the center, the extended display, and the [INAUDIBLE] at the middle console. And technically, they are all individual web pages.
So this is entirely built with HTML5. And simply put, it's one browser per display. You just bring up one browser and type in a URL, and it goes onto that one screen.
And maybe this is a bit more technical: communication-wise, this is web [INAUDIBLE] based, and we use MQTT. Put more simply, this is a publish-and-subscribe [? model. ?] The experience engine publishes signals when it changes slides or pictures. Then, automatically, the clients -- I mean, the children screens -- respond. And it goes backward, too.
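To make that publish-and-subscribe setup concrete, a minimal sketch of one of those children-screen pages could look like the following, assuming an MQTT broker reachable over WebSockets and the mqtt.js client library. The broker address, topic names, and message format here are illustrative assumptions, not the actual implementation described in the talk.

```typescript
// Minimal sketch of one "children screen" page (e.g. the cluster), assuming an
// MQTT broker that speaks WebSockets and the mqtt.js client library. Broker
// address, topic names, and payload shapes are illustrative assumptions.
import mqtt from "mqtt";

const SCREEN_ID = "cluster";                             // which display this browser drives
const client = mqtt.connect("ws://broker.local:9001");   // hypothetical broker address

client.on("connect", () => {
  // The experience engine publishes here when a slide or a picture changes.
  client.subscribe("experience-engine/slide");
  client.subscribe(`experience-engine/content/${SCREEN_ID}`);
});

client.on("message", (topic, payload) => {
  const msg = JSON.parse(payload.toString());
  if (topic === "experience-engine/slide") {
    // Every screen jumps to the scene that belongs to the new slide.
    document.body.dataset.scene = String(msg.slideIndex);
  } else {
    // Only this screen swaps the asset it shows.
    const img = document.querySelector<HTMLImageElement>("#content");
    if (img) img.src = msg.assetUrl;
  }
});
```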
So now, we have this fantastic -- or very good -- combination of our screen design and a presentation tool that fills up the seating buck.
AUDIENCE: [INAUDIBLE]
SURE BAK: But here, we have one small problem. The seating buck is designed for the driver and co-passenger, actually, so it has only two seats. But sometimes we have to show our screen design or our user experience to a large audience, like today. We have 100 people here, but there is no way to show this content or design to that many people, because you cannot literally all sit in one small seating buck.
And there were also other possible problems with this combination. So at this point, we really wanted to have a virtual interior model, so that we can show our HMI design to a large audience. And if we have this virtual interior model, it also makes our lives easier, because our designers don't need to go to the seating buck to check their design. They can simply bring up the virtual interior model on their desktop and check their designs right away.
And also, by having this virtual interior model, you can keep changing your interior design and HMI content together, so that you have an up-to-date model all the time. And this was the point where VRED and HTML5 design really worked well together.
PASCAL SEIFERT: So my name is Pascal Seifert. I'm working for the VRED product management team. And two and a half years ago, we were looking for possibilities to allow our users to bring in dynamic UI content into VRED and make a connection to VRED so you can have several interactions from VRED into HTML5, and also from HTML5 into VRED. And we had a closer look into the tools that are used by our customers during the process.
And there's a huge variety of tools that are used -- the tools that Sure mentioned in the beginning, in the design process. But of course, there are also more advanced engineering tools used, like [INAUDIBLE] and Elektrobit. I think Qt has also made big progress there. Unity is also used in that process. So there is a huge variety of tools. And they all have their own uniqueness and come with pros and cons, of course.
But there's one thing used throughout the process -- it's not a tool, it's actually a programming or scripting language -- HTML5, and it is shared between the different process steps, starting from an early design click-dummy prototype up to a more advanced, fully usable UI like Porsche is using. And this is where we started the conversation with Porsche, because we knew that they were pretty advanced in doing that kind of thing.
And we talked to Thomas. And he said, hey, you know what? HTML5 is a perfect fit for us, because we already use it in our process. It would probably work out of the box. And we started to look more into the workflow -- so, what do we have to support? What kind of connections do they want to make? And so on and so forth.
And it turned out that HTML5 is a pretty good fit, because it provides excellent programming semantics. It works on multiple platforms. It supports video, audio, and 3D content via WebGL. Styling is awesome, because you can do everything with CSS. And the performance is pretty good, even on low-end machines.
So we looked into that. We talked to the developers. We made our specification. And the outcome was a [? small ?] but really good module that allows the user to make the connection to this HTML5 content. It doesn't matter whether it's local or hosted on a web server.
And the only thing they have to do is type in the size of the web page they want to display, because we render that into a texture and make a connection to whatever they want to use it for -- a material, obviously, for HMI. So you can put it on the display. And then we render the HTML5 content into our [? VRED ?] [? scene ?] using a media editor which is based on the Chromium framework.
And a goal was to make it easy to use. The visualization designers are experts, but they usually don't have time, so it has to be very, very quick to set up these kinds of things. And as you can see here -- does it work? -- there's not much to do: type in your address, or wherever the content comes from, and assign a material to that module as well. This is how you make the connection.
So it's not node based. It's a tiny little module. Choose the channel you want to have the texture in, and then you can start using your web page, in this example, inside VRED.
It also addresses some other needs we had. We also want to provide a tool that gives, let's say, non-experts the possibility to work with overlay UIs -- in that case, on the left side, a design configurator -- but also to bring up menus where you can do your settings. I know in gaming it's done with [INAUDIBLE] a lot, or tools have their own module to set those things up.
And also, menus that can be used in VR -- like you see here, it's attached to the hand. And I can have whatever I want on it, and I can make it look however I want. So styling is super flexible. And this is actually how we closed the triangle between the needs of Porsche, their workflow, and our tool.
THOMAS GABRIELSEN: Yes. This really had a huge impact for us. Even if this tool might appear small and originally meant for other purposes, for us it was the perfect opportunity to bring together those three really important things at the early design stage at our company, because, as you see now, with this kind of triangle we can do multiple things. Like here, for example, you see how we can now control our VR model right away with the super smart experience engine, without changing processes or anything like that.
And this is the really important point here -- we connected two existing processes just by this small feature, the media editor. So we could right away use what the HMI or UX guys were already doing with the seating buck anyway. This really enabled us to make the whole system portable, as you can imagine.
And also -- which is the best part, I think, in this whole scenario -- you can now use all the technology from VRED in this kind of combination with the user experience process. So you can, of course, use the HTC Vive and the seating buck, control it with the experience engine, and have the full holistic experience of your design at the earliest point of the design process. And this is really, for us in the car industry, a huge leap forward, because the struggle of getting to a very good UI or UX concept from the very beginning is very high -- you always have to consider that the environment, like the interior design, keeps changing while you change the UX.
And so this really enabled us to make this whole iteration process shorter and shorter, to save a lot of money and get better results out of it. And we are using this kind of setup a lot. So whenever we have the chance to build up the seating buck and the whole triangle, we also use it a lot for presentations with our management.
But we also, of course, have the case where you bring it, just like here, to any presentation to show the user experience or user interface. You can imagine, we have lots of presentations outside the design studio. We also do lots of comparison drive-experience sessions in different countries -- the US, as well.
And what we can do now is just grab two of the elements. Like here, we have the laptop, with a VRED scene running on it and the experience engine hosted on it. Then you have this portable setup. And [INAUDIBLE] management show right away, with the HTC Vive, the latest version of your interior design in combination with your user interface. And this is really, really great, because you're not always able to bring the seating [INAUDIBLE] to the US and other countries.
So I pointed out just the advantages, like portability, here. But of course, because it's HTML5, you also have the great advantage of being multiplatform. When you have this portable setup, you can easily grab any device which has a browser, connect to the experience engine hosted on the laptop, and control your whole VRED scene through your device. It could be a mobile phone. It could be a tablet. It only needs a browser.
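A sketch of that remote-control idea, under the same assumptions as before (mqtt.js, a WebSocket-capable broker on the laptop, hypothetical topic names), could be as small as this:

```typescript
// Any device with a browser can act as the presenter remote by publishing
// commands to the experience engine. Broker address and topics are assumptions.
import mqtt from "mqtt";

const client = mqtt.connect("ws://laptop.local:9001");   // the laptop hosting the engine

// Two on-page buttons are enough to drive the whole presentation, and with it
// the VRED scene, from a phone or tablet.
document.querySelector("#next")?.addEventListener("click", () => {
  client.publish("experience-engine/command", JSON.stringify({ action: "nextSlide" }));
});
document.querySelector("#prev")?.addEventListener("click", () => {
  client.publish("experience-engine/command", JSON.stringify({ action: "prevSlide" }));
});
```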
And of course, with that, we always deliver the latest data we need. We just update the VRED model, update the HMI content on the laptop, and then we're ready to go. So this gives us a whole new level of flexibility. And the most important thing -- we can make decisions really early in the design process.
[VIDEO PLAYBACK]
THOMAS GABRIELSEN: So this was just to round things up, to show you what was then used in our latest show car, the Cross Turismo. We had a whole bunch of things in there, like implementing drone interfaces and so on. And for that, we already used this kind of new platform we now had, with the triangle we showed you before.
So we're pretty much done with our presentation for the first part. Then what we are going to show you now is just a quick live demo of this portable use case we just spoke about. So we can switch right away onto the desktop, showing the VRED scene and the experience engine we had in the presentation. And Sure will just go through some scenarios, in this case.
And I'd like to ask you, when you have questions, you can now ask them right away when we do our live demo.
SURE BAK: A live demo is always risky. But--
[LAUGHTER]
--let me try it out. So on the right side, you see VRED. And on the left side, this is the so-called experience engine. Let me briefly bring it to the front. It's pretty much the same as some graphics tools, maybe, or some presentation tools.
So you can go through this presentation as you wish. It has video or still pictures. But this is really, in real time, connected to VRED. So let me briefly go through it.
So on the left side, you see this graphic cluster. It's shown directly in VRED. And also, if you look closely here, the camera is already mapped. So here, I set [INAUDIBLE] as a camera position, and it is already mapped to VRED. VRED has defined cameras, and they are controlled by the experience engine.
So now, we go back to center position. Then we go to driver's position. [INAUDIBLE] again. We changed some contents.
So for example, here, let's say, I don't like this picture. Then I just enter some asset folder. So in this case, this is [INAUDIBLE] So let me go through the [INAUDIBLE]. So I can pick up some other images. I like this image. Then I drop it. Then it changes.
So, next.
THOMAS GABRIELSEN: And you can imagine, at a management presentation, like outside the studio, you would just bring an iPad or a tablet, or just use a phone, to have the full-screen presentation running. And then you can control it from the phone. So it's a really handy tool.
SURE BAK: Because it is designed in a responsive way -- can I -- yeah. Let me simply show you how it is reduced.
So if you open this up on a mobile phone, it becomes a kind of presentation remote. It doesn't have any editing features; it's simply just a presenter. So you can go back and forth.
And you can even turn on the show mode, which means it changes its slides automatically at a given interval. So in case we have a show like this, we just leave the experience engine turned on all the time.
THOMAS GABRIELSEN: For that, we can just go quickly into full screen.
SURE BAK: Yeah. So you don't need to touch anything. But along the story line, it changes automatically.
THOMAS GABRIELSEN: And this is one of the great things you can do with this kind of experience engine now. You can predefine your whole presentation externally, but you are still able to change it right away. Because I know -- maybe there are others from the automotive industry here -- we always have changes at the last second. So even during the presentations, we have to change something. And with this kind of workflow, we are now able to do that very quickly.
SURE BAK: In addition to that, as Thomas said, our head of department or our C-levels -- they want to make some last-minute changes. So we allow them to change this by reordering slides. Or sometimes, we hide some slides so that they can just skip them in a presentation.
THOMAS GABRIELSEN: And another great feature is that the connection between these three things we saw works both ways. So we can also interact in the VRED scene, and then we get the changes back in the experience engine. When you click on an interface, you get the response on the different screens, as well. So it really feels like already having the user interface concept, even at a really early stage.
SURE BAK: Yeah. I think live demo is pretty much done. So we can jump into the Q&A session, maybe.
THOMAS GABRIELSEN: So thank you, in advance, at this point--
SURE BAK: Yeah. Thank you.
THOMAS GABRIELSEN: --for listening. Yeah.
AUDIENCE: Which program was used to create the user interface video [INAUDIBLE]?
SURE BAK: User interface video? Ah, you mean this kind of moving around?
AUDIENCE: [INAUDIBLE]
SURE BAK: Yeah, we use After Effects. Question? Yeah.
AUDIENCE: [INAUDIBLE]
SURE BAK: Sorry?
AUDIENCE: [INAUDIBLE]
THOMAS GABRIELSEN: Sorry, I didn't understand it.
AUDIENCE: [INAUDIBLE]
THOMAS GABRIELSEN: Autodesk what?
AUDIENCE: Shotgun.
THOMAS GABRIELSEN: Shotgun.
SURE BAK: Shotgun. Shotgun.
THOMAS GABRIELSEN: No, not yet. But I'm looking into that, honestly.
SURE BAK: Other questions?
THOMAS GABRIELSEN: There was another one.
SURE BAK: Yeah.
AUDIENCE: Could you talk a bit about [INAUDIBLE] when you actually implement this in reality? [INAUDIBLE] the engineering [INAUDIBLE]? And what can they do to prevent this? And how do you quality control that?
SURE BAK: Quality control. You mean for the HMI design?
AUDIENCE: Between the design and the fabrication.
SURE BAK: Yeah. We have an external company which builds our HMI design into the production version. So after we finish our design, we stay in contact with them, and they give us feedback. And we also get a chance to see what happens on an actual display.
So actually, this part is a little bit different from our seating buck. The seating buck has actual displays, but they are not exactly the same as in production. Still, the resolution and all the dimensions are the same. But the external company -- they build the real production hardware.
Then, from time to time, we visit them, or they come to us and change the display module, so that we can see how the color correction is and how the animations work. So we check it every now and then.
THOMAS GABRIELSEN: And I think the goal at this point is that, with these tools, we can now show the other department -- the electronics department -- what we have in mind when we speak about the whole car design with the interface. They visit us, or we visit them and show them what we are designing right away. And then they have a really good basis for talking with the external agencies.
So you don't have that huge cut between your original design and the series-production version. We want to push the process a little bit forward in the production direction, so that you have already specified what you need -- the concept is ready to go straight into the production version.
SURE BAK: I think [? in the ?] [? lab, ?] we might have missed one thing. As I told you, this is a kind of publish-and-subscribe model. But it's not only one direction -- it goes in both directions. So in VRED -- actually, this is an HTML page. If I click on it, it goes to the next scene. For example, here, I can click on this part and it comes up like this. You can just click again. So in virtual reality, if you have the HTC Vive, you can maybe click on some screens, and they react. That's a really cool thing.
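As a rough sketch of that two-way connection, the same HTML page that VRED renders onto the display texture can also publish when it is clicked, so a tap made in VR shows up on the physical screens as well. Again, the broker address, topic names, and payloads below are illustrative assumptions, not the actual Porsche implementation.

```typescript
// Sketch of the "both directions" idea: the HMI page publishes its own clicks
// and still subscribes to slide changes. Topics and payloads are assumptions.
import mqtt from "mqtt";

const SCREEN_ID = "center";
const client = mqtt.connect("ws://laptop.local:9001");

// A click on this page -- with the mouse on the seating buck, or via the Vive
// controller when the page is rendered as a texture inside VRED -- goes back
// to the experience engine, which can forward it to the other screens.
document.addEventListener("click", (e) => {
  client.publish(
    `screens/${SCREEN_ID}/tap`,
    JSON.stringify({ x: e.clientX, y: e.clientY })
  );
});

// And the page keeps listening, so taps made elsewhere update this screen too.
client.on("connect", () => client.subscribe("experience-engine/slide"));
client.on("message", (_topic, payload) => {
  const { slideIndex } = JSON.parse(payload.toString());
  document.body.dataset.scene = String(slideIndex);
});
```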
THOMAS GABRIELSEN: Yes?
AUDIENCE: [INAUDIBLE] so who wrote the experience engine -- who wrote that software?
[LAUGHTER]
[INAUDIBLE]
SURE BAK: Yeah. Yeah.
AUDIENCE: That's impressive. I know you could do a lot of that [INAUDIBLE].
SURE BAK: Thanks.
[LAUGHTER]
AUDIENCE: Yeah. [INAUDIBLE] So [INAUDIBLE] VRED. Nobody does that. [INAUDIBLE]
THOMAS GABRIELSEN: Yes. I think the software alone -- it's handy for this kind of usage, but it has a whole range of other potential use cases, because you have the storytelling and it's easy to handle. Every designer can do that. And you can use it for other purposes as well, wherever you have multiple screens and have to synchronize them.
SURE BAK: We sometimes say, as a half-joke -- in our canteen, they have lots of screens to show their menus, but they really have no idea how to get content onto those canteen displays. So maybe that's one opportunity where we can sell this product.
THOMAS GABRIELSEN: So right now, they use USB sticks.
[LAUGHTER]
For every screen.
Any other questions? No?
So, thank you. Thank you very much for your attention. I hope you had some fun and got some mileage out of it. It was really a pleasure being here. Have a nice day, and enjoy the other sessions.
SURE BAK: Thank you.
PASCAL SEIFERT: Thank you.
[APPLAUSE]