Key Learnings
- Learn about the 3D-printing processes used to construct the MX3D and Dar Smart Bridges.
- Learn about options for instrumenting smart infrastructure with sensor technology and time-series databases.
- See how Autodesk Platform Services can be used to build digital twins displaying sensor data in the context of 3D geometry.
- Learn about opportunities and challenges afforded from building and adopting smart infrastructure.
Speakers
- Kean Walmsley is the director of Systems Design/Architecture Engineering, focused on the research area of human-centric building design. He has previously worked on projects exploring the integration of IoT data with BIM (digital twins) using Autodesk Platform Services, as well as generative design in the AEC space. He has worked in various roles, and in various countries, during his career at Autodesk, including building and managing teams of software developers in Europe, the Americas, and Asia/Pacific. Kean engages regularly with Autodesk's developer and computational design communities, providing technical content and insights into technology evolution.
- Peter Storey is a Principal Research Engineer in Autodesk Research's Industrialized Construction team, and a Chartered Engineer with a Master's degree in Mechanical Engineering from The University of Nottingham, UK. He joined Autodesk in 2015 as a Graduate Applications Engineer. For the following three years, he worked in the Birmingham Technology Centre, using PowerMill, Fusion 360, and other tools on unique manufacturing projects involving Kuka robot arms. Moving to Autodesk Research in 2019, his role focuses on delivering collaborative projects, working with customers and across organisations to apply technologies such as industrial robotics, generative design, and additive manufacturing at construction scale.
KEAN WALMSLEY: Welcome to this session on monitoring and visualizing data from the MX3D and Dar smart bridges. I'm Kean Walmsley, a senior manager and software architect at Autodesk Research, and I'm joined by my colleague Pete Storey, who's a principal research engineer with Autodesk Research. So let's take a look at the agenda for this session.
We're going to introduce ourselves and Autodesk Research in a bit more detail, and then talk about a project that's been instrumental in our work on these bridges, Project Dasher. Then we'll go into the individual bridge projects before concluding at the end.
So, with the introductions: my name is Kean Walmsley. I work for Autodesk Research, based in Switzerland. I'm leading an effort called Human-Centric Building Design that Pete will tell you a little more about, and I'm available at @keanw on LinkedIn, as well as on X (formerly Twitter). I also have a blog at keanw.com where I share a fair amount of information.
I've been with Autodesk for quite a long time, since the mid-90s. I've worked in a number of different roles and a number of different countries, but I've been in Research since 2016. And over to Pete.
PETER STOREY: Yeah, hi, folks. I'm Peter Storey, a research engineer at Autodesk Research, based in the Netherlands. My background is actually in manufacturing; I have a master's degree in mechanical engineering. I joined Autodesk right out of university as a graduate back in 2015, working in our Birmingham Technology Centre in the UK, using our CNC machines, 3D printers, industrial robots, and so on, on various manufacturing tasks.
I then moved to our research organization in 2019, and in recent years, I've been applying my background and knowledge in manufacturing, particularly additive manufacturing, to construction-scale applications. So Kean, myself, and our whole organization ground our research in eight research themes.
Kean's work centers around human-centric building design. This theme is about realizing a future where designers have the tools to predict how their design decisions will impact the wellness and productivity of a building's occupants, just as they can predict the appearance or structural performance of buildings today.
Industrialization of construction is primarily where my focus is, and the future we imagine is one where buildings are constructed more efficiently by bringing in manufacturing methods such as prefabrication, automation, and robotics and applying them to the built environment. So hopefully, you'll see the link between these two themes as we go through this presentation.
So before we get into the presentation proper, let's take a moment to ask why we would want and need smart infrastructure. The first three points on the screen are probably familiar. Everyone wants a safer, better-maintained, more efficient built environment, and sensors can help detect problems early, which helps prevent catastrophic failure and also warns us of other risks to public health.
Then there's predictive maintenance: sensors can monitor structural issues, but also usage statistics and climate conditions, all of which help in predicting whether we should expect accelerated wear and tear, let's say. Maintenance cycles can then shift based on how much, or indeed how little, a structure is being used. And that, of course, relates to the efficiency of material and labor, but sensors can also help us operate our infrastructure more efficiently.
So in a building, for instance, lights can be turned off and AC turned down when fewer people are using the space. But then, what do we do with all the data we've collected? This is the innovation piece.
So ideally, we learn from it and apply it in optimizing our designs and fabrication for future infrastructure. This could mean more efficient designs that use less material, sure, but it could also be about understanding how people use the space and creating designs that optimize for their well-being and happiness. And that's very much linked to our human-centric building design research theme. Speaking of which, I think that's a good time to hand back to you, Kean.
KEAN WALMSLEY: Thanks, Pete. So let's talk a little bit about Project Dasher. This is a project that started back in late 2009, the beginning of 2010, and it was really based on the question of what the possibilities were around taking data coming from sensors, IoT sensors or sensors inside buildings, and integrating it into a 3D context. What are the benefits of doing so?
So this is an image of a floor inside a building where we're seeing data coming from sensors inside the space. You can see a number of different plots; there are two showing CO2 and temperature levels. We have some tooltips showing data coming from a particular sensor inside this space.
We have 3D heatmaps showing the presence of people inside the space, and there's also a panel that allows us to choose between different sensor types, along with the levels diagram there. So this is a quick overview of some of the features. We have a timeline at the bottom that allows you to select a particular historical time period.
Now, when we started this project back in 2009, it was a desktop application, a complicated system with its own graphics system, its own installer, et cetera. When I joined Research in 2016, I was part of a small team that decided to take this research from the desktop to the cloud. We started using what was at the time the Forge platform, now called Autodesk Platform Services, to effectively build out something similar. Using the same back-end database, we were able to visualize the same data inside a web browser.
Moving forward a few years, we decided to instrument one of our own buildings in San Francisco. This is our Pier 9 office on the San Francisco waterfront. We chose this particular office because, within it, we have a raised pedestrian walkway linking two parts of the building. On this walkway, we placed a number of sensors: strain gauges and accelerometers. We also have cameras set up at either end.
Now, one of the reasons we wanted to instrument this particular bridge was that we wanted to test whether our back-end systems could cope with ingesting the volumes of data that matter for infrastructure. Prior to this, we'd been working with buildings, which would have sensor readings every five or 15 minutes, whereas here, we were potentially getting up to several thousand readings per second, depending on the sampling frequency.
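As an aside for developers: ingesting data at that rate usually means batching. Here is a minimal Python sketch of buffering high-frequency readings and flushing them in batches; the `client.write_points` call is a hypothetical stand-in, since Data 360 is an internal system whose API isn't public.

```python
import time
from collections import deque

class SensorBatcher:
    """Buffer high-frequency readings and flush them in batches, so the
    time-series backend sees a few large writes per second instead of
    thousands of tiny ones."""

    def __init__(self, client, batch_size=1000, max_latency_s=0.5):
        self.client = client          # hypothetical time-series client
        self.batch_size = batch_size
        self.max_latency_s = max_latency_s
        self.buffer = deque()
        self.last_flush = time.monotonic()

    def add(self, sensor_id, value, timestamp=None):
        self.buffer.append({
            "sensor": sensor_id,
            "value": value,
            "ts": timestamp or time.time(),
        })
        # Flush when the batch is full or the oldest point is getting stale.
        if (len(self.buffer) >= self.batch_size
                or time.monotonic() - self.last_flush > self.max_latency_s):
            self.flush()

    def flush(self):
        if self.buffer:
            points = list(self.buffer)
            self.buffer.clear()
            self.client.write_points(points)  # hypothetical ingestion API
        self.last_flush = time.monotonic()
```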
One thing we did was to correlate the data across different views, so we have the timeline at the bottom, and we're scrubbing through data for a particular accelerometer. That also drives the video feed showing what's happening on the bridge at that time. This really allows us to contextualize the data and understand the outside factors that might be impacting the bridge and why it's performing the way it is.
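For those curious how the timeline-to-video correlation works, it reduces to simple arithmetic once you know the recording's start time and frame rate. A minimal sketch (the timestamps below are made up):

```python
from datetime import datetime, timezone

def frame_for_timestamp(ts, video_start, fps=30.0):
    """Return the video frame index matching a sensor timestamp.

    ts, video_start: timezone-aware datetimes; fps: video frame rate.
    """
    offset_s = (ts - video_start).total_seconds()
    if offset_s < 0:
        raise ValueError("timestamp precedes start of recording")
    return int(offset_s * fps)

# Example: a reading taken 12.4 s into a 30 fps recording maps to frame 372.
start = datetime(2018, 10, 20, 14, 0, 0, tzinfo=timezone.utc)
reading_ts = datetime(2018, 10, 20, 14, 0, 12, 400000, tzinfo=timezone.utc)
print(frame_for_timestamp(reading_ts, start))  # 372
```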
Now, this is clearly interesting and useful, but there were some downsides to showing video data in that way. We had to limit the number of people who could access the system because of privacy concerns, of course. So that's when we embarked on some research into how we might anonymize the views of people in the video footage; we started with blurring faces and pulling out bounding boxes.
We then moved to extracting skeletons and mapping them into a 3D environment so that, ultimately, we could visualize those skeletons back in the 3D context. We could really see where people were on the bridge without necessarily knowing who those people were.
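Mapping 2D keypoints from a camera onto a planar surface like a bridge deck is classically done with a homography. Here is a minimal NumPy sketch, assuming a calibrated 3x3 matrix H (the values shown are placeholders, not the project's calibration); note that a sign error in such a matrix produces exactly the kind of mirrored skeletons described later in this session.

```python
import numpy as np

def image_to_deck(points_px, H):
    """Map 2D image keypoints (N x 2, in pixels) onto the deck plane
    using a calibrated 3x3 homography matrix H."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])  # homogeneous
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # divide out the scale factor

# Placeholder homography; a real one comes from camera calibration.
H = np.array([[0.002, 0.0,   -1.2],
              [0.0,   0.002, -0.8],
              [0.0,   0.0,    1.0]])

ankle_keypoints_px = np.array([[640.0, 512.0], [700.0, 520.0]])
print(image_to_deck(ankle_keypoints_px, H))  # deck coordinates in meters
```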
And this was really in support of our first main infrastructure project outside of the Pier 9 office, which was the MX3D bridge that was being built at that time in the Netherlands, ready to be installed across a canal in Amsterdam. Now, the original vision for this particular bridge was that there'd be welding robots printing their way across the canal, essentially building the bridge in situ.
Now, this wasn't realistic, of course; it was really a vision. Having unattended welding robots in a public space is not a good idea, for lots of reasons. So the concept shifted to printing inside a warehouse, in the NDSM Wharf area of Amsterdam.
Again, this is still a conceptual view here. At the time, there was still a desire to use generative design for this particular bridge. In the end, given that it morphed into more of an art project, a decision was made to go with a more aesthetic design.
The bridge was constructed inside that particular warehouse; you can see a robot welding. In most cases, I think it's more representative to think of the bridge as having been printed in sections, with the robot welding horizontally rather than on a vertical plane, but this gives an idea of how the bridge was effectively constructed.
Now, the bridge was being made of a brand-new construction material. Steel itself is not new, but the way it was being deposited, using wire arc additive manufacturing, meant that we didn't really understand the structural properties of this material. We had a number of consortium members working with us, and [INAUDIBLE] Imperial College in the UK did a lot of structural testing on the bridge. This is us jumping up and down. There were also some folks from [INAUDIBLE] who were obviously involved in the engineering work.
We were, in this case, jumping up and down on the bridge having put several tons of load on it beforehand, so we didn't have any real concern about it collapsing. This was just a bit of fun more than anything, and it was interesting to see that it would displace maybe two centimeters if we jumped at the same time.
Now, given that this is effectively a brand-new construction material, the decision was made to integrate a system of sensors into the bridge, and this is the network diagram for the sensors that were integrated. We have load cells at each corner, plus inclinometers, accelerometers, and strain gauges, as well as a number of additional sensors for the ambient temperature, to understand what the weather was like, et cetera.
Now, this system was actually integrated somewhat as an afterthought, in the sense that it was not designed in initially. What that meant is that we had to spend quite a lot of time as a project team, along with other members of the consortium, essentially running cables beneath the bridge, attaching sensors, and soldering sensor kits. There was a lot of low-level work needed to make this happen.
This is Alex [? Teissier, ?] one of the main people leading the project over the years. So everybody rolled their sleeves up. You can see, above Alex's head, an HBM module, which is effectively a data acquisition module where the sensors would connect. The cabling from the sensors connected to the HBM module, and the data from there would be sent onwards; we'll talk a little bit about that in the next slide.
So this is effectively the system architecture for the bridge, really the bridge operating system if you want to think of it like that, and it was a similar architecture for all three of our bridge projects: initially MX3D, and then the two Dar bridge projects that Pete will talk about shortly.
First of all, we have a number of types of sensors, shown on the left: load cells, strain gauges, accelerometers, inclinometers, et cetera. We had a broader set of sensors for the MX3D project, but they were all feeding through an HBM data acquisition module, as I mentioned. These would then go to a local PC, and effectively, the data would get sent to the cloud, to our time-series database. We called this, internally, Data 360.
Now, at the time we started this project, there was no commercial cloud-based time-series database that met our needs, especially at the frequency of data we were collecting, which was a significant amount. So we ended up building our own (several of them, in fact), and the latest iteration is the one that was used.
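Conceptually, the local PC in this architecture is a small gateway loop: read a frame of readings from the acquisition hardware, tag it, and forward it to the time-series service. A minimal sketch, where `daq.read_frame` and `cloud.send` are hypothetical interfaces standing in for the HBM driver and the Data 360 ingestion API:

```python
import time

def gateway_loop(daq, cloud, bridge_id="mx3d", poll_interval_s=0.01):
    """Forward readings from a local DAQ to a cloud time-series store.

    daq.read_frame() and cloud.send() are hypothetical interfaces;
    the real HBM drivers and Data 360 APIs are project-specific.
    """
    while True:
        frame = daq.read_frame()     # e.g. {"ch_strain_03": 412.7, ...}
        if frame:
            cloud.send({
                "bridge": bridge_id,
                "ts": time.time(),
                "readings": frame,
            })
        # Pace the loop; real code would block on the driver instead.
        time.sleep(poll_interval_s)
```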
Some other notes about the data acquisition: as I mentioned, for MX3D we used HBM. For the Dar projects, we also used FiSens, and again, Pete will talk more about this. For some of the strain gauges, we were even using a Raspberry Pi, so it really depended on the type of sensor.
But ultimately, the data was all being fed into the cloud, with some downstream visualization of that data, either using Dasher or, ultimately, it being analyzed in order to understand the behavior of the bridge; that's less visualization and more analysis.
So let's take a look at some of the types of data we were getting off the bridge and how they were visualized. This was an early debugging effort, if you like, to understand the camera data, which we're anonymizing via our computer vision pipeline, pulling out these anonymous skeletons. If you look closely on the left-hand side of the video, this is Caspar, an employee at MX3D and the project manager.
This is him being mapped pretty well into an anonymous skeleton, but if we flip the view of the bridge, looking at the 3D view from the other angle, and hit Play on the timeline, we can see that, for example, this person sitting on the handrail is actually flipped relative to their skeleton. So we had an issue with the homography, the transformation matrix, for that particular camera. The individual skeletons were in the right locations, but they were mirrored, so that's something we had to figure out.
But this was just an early attempt. This was data collected during Dutch Design Week in Eindhoven in 2018, which is when the bridge was effectively unveiled to the public for the first time.
The goal of all this, ultimately, is to be able to visualize skeleton data alongside other types of data, putting it in context: using animated heatmaps to understand how the movement of people across the bridge impacts its structural performance. How is the bridge behaving when it's loaded with people?
Again, this was an early, pre-calibration visualization of the data, where we were still trying to figure out what we were seeing. But this is, once again, data from Dutch Design Week.
Moving on to how the bridge looks overall inside Dasher: this is the visualization of the bridge inside the Forge Viewer. We can bring up the sensors for the bridge, both as sensor dots and in the sensor list, where we can filter them based on certain criteria. In this case, we look for sensors that have "Channel" in their name.
As we hover over them, we see a tooltip showing the type of sensor as well as the latest value that's come from that particular sensor. We can turn on surface shading so that a particular data type gets shaded onto the surface of the bridge, whether it's acceleration, strain, or temperature; I think those are the main ones.
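Surface shading of this kind boils down to normalizing each reading into a 0-to-1 range and looking up a color. A minimal sketch of such a mapping, independent of any particular viewer API:

```python
def value_to_color(value, lo, hi):
    """Map a sensor reading to a simple blue -> green -> red heatmap color.

    lo/hi are the expected range for the sensor type (e.g. strain limits);
    readings outside the range are clamped. Returns an (r, g, b) tuple.
    """
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    if t < 0.5:                      # blue (cold) fading to green
        f = t / 0.5
        return (0, int(255 * f), int(255 * (1 - f)))
    f = (t - 0.5) / 0.5              # green fading to red (hot)
    return (int(255 * f), int(255 * (1 - f)), 0)

print(value_to_color(75.0, 0.0, 100.0))  # (127, 127, 0), between green and red
```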
So this is the visualization. Then, as we hit Play, we can also turn on the skeleton view. We hover over tooltips to start with; you can see these get animated. If we had graphs of data being shown at the same time, those would be animated as well, so we can see how the data evolves over time. And then we turn on the skeletons and, effectively, watch people on the bridge as the timestamp moves on.
So that's effectively the latest state of the visualization system for Dasher. A little more about the project: the bridge was installed in the summer of 2021, across a canal in the very oldest part of Amsterdam, bang in the red-light district.
It was opened by Queen Maxima of the Netherlands. I would like to say she cut the ribbon, but actually, she pressed a button on a robot to have it cut the ribbon, which is somewhat appropriate considering the use of robotics on the project in general. I was actually the only member of the project team lucky enough to be there, because I was, and am, based in Europe, so this is the back of my head, for whatever that's worth.
It's also worth noting that the bridge is being decommissioned at the moment. We struggled to get the permissions to ultimately install the cameras; there would have been two cameras installed here and there, and there were long, ongoing discussions with the municipality.
But now the bridge is being decommissioned and moved to another location in the Netherlands. We're actually hoping, especially if it ends up on a university campus, that the project can effectively continue and the cameras can be installed, so that kind of data can be used to understand the bridge's performance. And with that, I will hand it back over to Pete.
PETER STOREY: Thanks, again. So yeah, that leads into the Dar bridge projects. For those who aren't aware, Dar is a global consulting organization in the AEC industry. We worked together with Dar architects and engineers for a few years, starting close to the end of the MX3D bridge project, to explore the use of generative design, robot-based additive manufacturing, sensor technology, and machine learning to imagine new ways of designing and manufacturing civil infrastructure. So you can see the overlap between these projects.
In this presentation, I'm going to focus on the manufacturing, so the process of seamlessly installing sensors, and specifically on the strain sensors we used, because they are the most critical ones to place in the right locations. Our collaboration with Dar was showcased with two physical artifacts, the first of which was the two meter bridge.
So here is that completed two meter bridge from 2021. Now, you might argue that's a little small to be a true bridge, but the intent with this stage of the project was to test a range of concepts that we would then apply at a larger scale. There are 88 strain sensors in this photo, and if we did things right, I'm hoping you'll find it impossible to point to any of them. I think the only hint that this is a smart bridge is the little blue cable in the top left-hand corner. So let's talk about how we designed and installed these sensors.
The structure was created using Fusion 360 generative design, with manufacturing constraints that consider the orientation for printing. Here, we're printing this part upside down, so the geometry was created to respect the overhang angles achievable through additive manufacturing. If you've worked in 3D printing, you'll know it's always nice to reduce the amount of support structure you use, and this ensures the entire structure is manufacturable in one go. But that presents some challenges for installing sensors.
With the design created, we first asked where to place the strain sensors to extract the most relevant and interesting data. On screen is a top-down view of the structural simulation results overlaid on the model: the top of the deck experiences mostly compression, whereas the bottom of the deck experiences, I guess, mostly tension. So we wanted sensors close to both the top and bottom surfaces to capture this data.
Our structural engineer at Dar selected the locations they wanted to monitor, and these were not necessarily the areas where the highest strain was expected; they also included areas that might indicate fatigue starting to occur. This is a lot of sensors. If we were to apply strain gauges to each of these locations, each needing at least a four-core cable running to it, that would be a lot of wires, which would kind of ruin the generative design aesthetic. So we decided to use what are called Fiber Bragg Grating, or FBG, sensors.
These are just a glass fiber, roughly the width of a human hair, with small gratings etched at our chosen locations along its length. As a very short explanation: we shine broadband light down the fiber, and each grating reflects a particular wavelength of light that can then be correlated to the strain the fiber is experiencing at that location. This allows for 20 or more sensor locations on a single glass fiber. If you can't tell, I think these things are brilliant. I love them.
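For the technically curious, the wavelength-to-strain relationship is commonly approximated as Δλ/λ ≈ k·ε, where λ is the grating's unstrained Bragg wavelength and k is a gauge factor of roughly 0.78 for silica fiber. A minimal sketch of the conversion, ignoring the temperature compensation a real deployment needs:

```python
def fbg_strain_microstrain(measured_nm, nominal_nm, k=0.78):
    """Convert an FBG wavelength shift to strain.

    measured_nm: reflected Bragg wavelength read by the interrogator
    nominal_nm:  grating's unstrained Bragg wavelength
    k:           photo-elastic gauge factor (~0.78 for silica fiber)

    Ignores temperature effects, which real systems must compensate for.
    """
    delta = measured_nm - nominal_nm
    return (delta / (nominal_nm * k)) * 1e6  # microstrain

# A grating written at 1550 nm that reads 1550.121 nm is at roughly
# 100 microstrain: 0.121 / (1550 * 0.78) * 1e6 ~= 100.1
print(fbg_strain_microstrain(1550.121, 1550.0))
```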
So now we can go from this map of sensor locations and reach all of them with one or two FBG fibers on each layer. You can see we created a path that ensures the fiber aligns with the direction we expected the principal strain to be in; especially on the bottom of the deck, in the middle of the bridge, you can see a wiggly line that keeps the fiber in the plane of principal strain. And of course, these are custom-made FBG sensors, with the grating spacing we needed for this design.
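That alignment can be checked numerically: for a plane strain state (eps_x, eps_y, gamma_xy), the principal direction is theta_p = 0.5 * atan2(gamma_xy, eps_x - eps_y), and the fiber's tangent should stay close to that angle. A minimal sketch with made-up strain values:

```python
import math

def principal_strain_angle(eps_x, eps_y, gamma_xy):
    """Angle (radians) of the principal strain direction for a
    plane strain state, measured from the x axis."""
    return 0.5 * math.atan2(gamma_xy, eps_x - eps_y)

def misalignment_deg(fiber_angle_rad, eps_x, eps_y, gamma_xy):
    """How far the fiber's tangent deviates from the principal direction."""
    theta_p = principal_strain_angle(eps_x, eps_y, gamma_xy)
    diff = abs(fiber_angle_rad - theta_p)
    return math.degrees(min(diff, math.pi - diff))  # direction, not sense

# Made-up strain state at one deck location: mostly tension along x with
# a little shear; a fiber laid at 5 degrees is under a degree off-axis.
print(misalignment_deg(math.radians(5.0),
                       eps_x=800e-6, eps_y=-200e-6, gamma_xy=150e-6))
```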
This is the full view of the assembly. When it came to fabrication, we printed the first few layers, machined a slot into the printed part, laid the sensors in with some adhesive, and then printed over the top, repeating this process until we had a completed bridge. I'll show you some videos now of us going about that process.
This bridge was made in our Birmingham Technology Centre in the UK, and this is the manufacturing process. We printed the first few layers and then used one of our robots to machine a slot for the sensor fiber to sit in. You can actually see in this image that this is the second set of sensors: the blue cables are from a previous layer where we'd already inserted sensors.
We then laid the fibers, the sensors, into the channel, glued them in place, and printed over the top. You can see myself and my colleague James going around and gluing them in place. This process was pretty fiddly, so we really simplified this job for the five meter bridge in the next stage. And as you can see, we continued printing pretty quickly afterwards.
And here's a short time lapse of the remainder of the print. I'll also point out that, in this video, you can see a black unit attached to the robot head, to the right of the red extruder. We were using sensors there as well, monitoring the temperature of each layer of the print and using that to adjust and pause the print accordingly.
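The interlayer control logic behind that is simple: if the previous layer is still too hot, pause deposition until it cools. A minimal sketch, where `read_temp_c` is a hypothetical callable wrapping the thermal unit and the threshold is illustrative rather than the value used on the project:

```python
import time

def wait_for_layer_cool(read_temp_c, max_temp_c=140.0,
                        poll_s=2.0, timeout_s=600.0):
    """Pause between layers until the surface cools below max_temp_c.

    read_temp_c: hypothetical callable wrapping the thermal sensor on
    the print head. The 140 C threshold is illustrative only.
    """
    start = time.monotonic()
    while read_temp_c() > max_temp_c:
        if time.monotonic() - start > timeout_s:
            raise TimeoutError("layer never cooled; check sensor/ambient")
        time.sleep(poll_s)

# Usage between layers of a print job:
#   wait_for_layer_cool(thermal_unit.read)  # blocks until safe to continue
```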
After printing, we machined slots in the branches as well and installed sensors in the same way. On the left, you can see we used some masking tape just to hold the fiber in place while we went in with epoxy to fix the grating in place, and on the right, you can see the fully sealed channel. At this point, all the sensors are installed, so it's time to hand over to Kean to talk about the monitoring process.
KEAN WALMSLEY: Thanks, Pete. So here is the bridge inside Dasher. Effectively, we're loading in a period of several seconds; I guess it's nearly a minute's worth of data. You can see the sensor locations, the 88 locations Pete mentioned.
One of the things we did for this particular project was to modify the color of the dots where the sensors are located to indicate the latest value. And when we turn on the heatmap, where we can see a volumetric heat map representing the forces or the stresses at those particular locations, we can see those colors mapped into the heat map as well.
This was quite interesting. You can see there's maybe a little bit of a rotational effect going on there. Anyway, this was the visualization of the data we were able to capture from the two meter bridge. And I'll hand back to Pete for the five meter bridge.
PETER STOREY: So yeah, on to the second stage of our project, the five meter bridge. Our ultimate vision was to achieve this story of what if a robot could print its way across a river without any human intervention: a bridge one click away. Now, this is an aspirational goal, as Kean explained, one that MX3D also shared, but it's a goal we were creeping closer towards as we developed our understanding of large-scale 3D printing. And at the end of the day, we learn a lot by setting out with these ambitious visions. That's what we're here to do in research.
So now we wanted to manufacture this as if we were printing in situ across a river. This visualization is one of the generative design results (I really enjoy seeing these views myself), this time designed to be printed from left to right. On the screen, the left-hand side is the base plate that it prints on.
Again, we performed structural simulation to find locations of interest for installing sensors, and for this bridge, we placed 60 strain sensors. In the branches, we put sensors where we expected the peak stress to be, and in the bridge deck, we distributed the sensors more evenly across the surface, which would give us quite a nice visualization of where people were standing and walking.
It's super easy to create channels in the deck by using a robot to mill them, but much more difficult to define a toolpath to create a channel in the branches. So we needed to switch things up a little: in the deck, we used three FBG fibers, each with, I think, 15 sensors on it, and individual strain gauges in the branches. Each of those branch locations was its own strain gauge, needing its own cable to reach it.
We experimented with how we would hide the cables for the strain gauges because, yeah, there were going to be a lot of them. What we did was print a few layers and then machine a slot. In the top right-hand image, you can see where we machined the slot to allow the cables to enter the internal structure, and we then pulled the cables inside. The plan was to incrementally pull the cables through the internal structure and have an exit point close to each strain gauge location.
We knew this was a tall order, and as you can see, we ended up pulling all the cables out at pretty much the first opportunity so that we could continue printing in earnest. So yeah, we printed the rest of the bridge. This is a time lapse of that whole process; we actually had some milling processes interleaved. And there you go: there are our colleagues [INAUDIBLE] and James looking very nervous about the printing of the bridge. This was printed at the Boston Technology Center and made possible by some of our amazing Research colleagues there.
The five meter bridge was completed in August last year. We then installed all the strain gauges and FBG sensors. The Boston Tech Center fabricated this awesome frame for shipping, shown in the top right-hand corner, and then we sent the bridge to AU 2022. In the bottom right, you can see Haason Zane from Dar on the bridge, with a view of the dashboard showing live sensor data, which means, yeah, it's another good time to hand back over to you, Kean.
KEAN WALMSLEY: Thanks, Pete. So this is the bridge on display at Autodesk University 2022 in New Orleans. You can see me walking across it. There were some interesting challenges that we hit when installing the bridge and getting everything working.
One of the things there: we used a different approach for sending the data live to that particular browser. We used WebSockets; rather than sending the data to the cloud, it was being transmitted locally. And we had some interesting issues around the fact that we had data coming from two different types of sensors and, effectively, two different subsystems. We were able to make it work, but there were some interesting challenges along the way.
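As a flavor of what local WebSocket streaming can look like, here is a minimal sketch using the Python `websockets` library (version 10 or later): one task tracks connected dashboard clients while another broadcasts readings a few times a second. This is a generic pattern, not the project's actual implementation; `random.gauss` stands in for real values from the two acquisition subsystems.

```python
import asyncio
import json
import random  # stands in for real DAQ readings

import websockets  # pip install websockets

CLIENTS = set()

async def handler(ws):
    """Track each connected dashboard client until it disconnects."""
    CLIENTS.add(ws)
    try:
        await ws.wait_closed()
    finally:
        CLIENTS.discard(ws)

async def broadcast_readings():
    """Push a reading to every connected client a few times a second."""
    while True:
        msg = json.dumps({"sensor": "strain_01",
                          "value": random.gauss(0.0, 5.0)})
        websockets.broadcast(CLIENTS, msg)  # fire-and-forget send
        await asyncio.sleep(0.2)

async def main():
    async with websockets.serve(handler, "localhost", 8765):
        await broadcast_readings()

asyncio.run(main())
```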
And overall, it was very popular. We had a lot of people come by the exhibit hall and take a look at the bridge. For health and safety reasons, only project members were able to walk on the bridge, so Pete and I spent a lot of time on it during the course of AU last year.
So in conclusion, we've talked today about three smart bridge projects. It's worth noting, as Pete has mentioned, that two of these bridges were printed at technology centers, one in the UK and one in Boston in the US. I've also mentioned Pier 9, another of our technology centers, in San Francisco.
So a lot of the research and work we've done has really been enabled by the infrastructure Autodesk has in place with these worldwide technology centers. But in particular, across these three smart bridge projects, there was a strong focus on generative design, both in the early days of the MX3D project and then all the way through the Dar bridge projects, and of course on additive manufacturing, whether printing in steel using wire arc additive manufacturing or using the plastic-based materials we used for the later bridges.
There was also a very strong focus on data capture, storage, and then analysis of that data. We were just one member of the project consortium on the MX3D bridge, as I've mentioned. Our focus was very much on helping install the sensor infrastructure, but mostly on the data storage and then the visualization of that data.
But we had academic partners involved in taking that data and analyzing it for different purposes. One was looking at the harmonic frequency of the bridge: is that something they can tell from the strain data, et cetera?
Another, very interesting one, was taking the load cell data, just those four sensors placed at the corners of the bridge, and trying to understand, based on that data, where people were on the bridge, which was very challenging and very interesting. Ultimately, the fact that we didn't have cameras drove a certain innovation in trying to see whether we could get that understanding from other types of sensor, and it was very successful.
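The underlying signal can be illustrated with a classic center-of-pressure calculation: given the four corner loads and the corner coordinates, the load-weighted centroid gives a single equivalent load position. The academic work went well beyond this (separating multiple pedestrians, for instance), but this sketch with made-up numbers shows the idea:

```python
def center_of_pressure(loads_kg, corners_m):
    """Estimate where a load acts on the deck from four corner load cells.

    loads_kg:  readings at each corner (dead load already subtracted)
    corners_m: (x, y) positions of the corresponding load cells
    """
    total = sum(loads_kg)
    if total <= 0:
        return None  # nobody on the bridge
    x = sum(l * c[0] for l, c in zip(loads_kg, corners_m)) / total
    y = sum(l * c[1] for l, c in zip(loads_kg, corners_m)) / total
    return (x, y)

# A hypothetical 12 m x 2.5 m deck; a person standing nearer one end
# shows up as an uneven split across the four cells.
corners = [(0.0, 0.0), (12.0, 0.0), (0.0, 2.5), (12.0, 2.5)]
print(center_of_pressure([30.0, 10.0, 30.0, 10.0], corners))  # (3.0, 1.25)
```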
So what is next? Well, we do expect to see artificial intelligence and machine learning playing a greater role in these projects. They have played some role already, but certainly, the amount of data being captured definitely lends itself to being fed into some kind of AI system.
We have ongoing projects around digital twins inside Autodesk Research. One of the research themes Pete showed earlier that we didn't talk about is living product design and manufacture, which is looking at feedback loops that take data from the use of a particular product, not necessarily a bridge, and feed it back into the design phase in order to iterate and create better designs over time.
This is a vision Autodesk Research has had for many years, going back to a project called Primordial. They're also looking at capturing data during the manufacturing process, as Pete mentioned earlier: using sensors on the nozzles of the extruders to look for when issues might occur. That data may also be fed back into the design phase, to understand how we might adjust a design to make it more manufacturable in the future.
So that is it for our presentation today. Just to wrap up, if you'd like to connect with the work being done at Autodesk Research, you can visit research.autodesk.com, follow @adeskresearch on X (formerly Twitter), and connect with Autodesk Research on LinkedIn as well.
My email address is there along with Pete's should you want to follow up with any questions. We'd be happy to hear from you. And thank you very much, and we hope you've enjoyed this presentation.