Shaping a Path to Offroad Autonomy Using MATLAB - MATLAB & Simulink
      Video length is 42:26

      Shaping a Path to Offroad Autonomy Using MATLAB

      Overview

      Recent technological progress is making it easier for heavy equipment makers to embrace autonomy. However, creating autonomous offroad vehicles remains complex. They need to operate reliably in rugged environments and perform challenging tasks. Collaboration among many teams with diverse skills, such as perception and motion planning, is essential. Testing hardware in harsh conditions is costly and risky, making simulations crucial for reducing risks, refining algorithms, and enhancing performance.

      This webinar offers an in-depth exploration into the realm of autonomous off-road systems, from construction sites to open-pit mines, showcasing how MathWorks' MATLAB and Simulink facilitate the journey toward autonomy. Through detailed explorations of physical modeling, 3D simulations, offroad navigation, and excavator motion planning, attendees will learn how an integrated simulation workflow can enhance cross-disciplinary collaboration and streamline the design, simulation, testing, and deployment of off-road autonomy applications. We will highlight MathWorks' latest tools and features, empowering attendees with the knowledge to efficiently tackle the challenges of offroad vehicle autonomy. Join us to discover how these advancements can transform your projects and lead to more efficient and innovative solutions in the field of autonomous off-road vehicles.

      Highlights

Through two key workflows – earth moving and transport – we will cover:

      • Physical modeling and dynamic simulation of offroad vehicles.
      • Creating 3D photorealistic scenario simulation for virtual testing of autonomous offroad vehicles.
      • Advanced manipulation of offroad heavy equipment with collision avoidance.
      • Using CAD models as a single source of truth for the above tasks.
      • Path planning on uneven terrain.

      We will also highlight various reference examples from construction sites to open-pit mines.

      About the Presenters 

      YJ Lim is the principal manager of robotics and autonomous systems products at MathWorks. With over 25 years of experience in robotics and autonomous systems, YJ's responsibilities at MathWorks include long-term strategy development and product management of robotics and autonomous systems. YJ previously held roles as a senior project manager at Vecna Robotics and Chief Innovation & Operating Officer at Hstar Technologies, and contributed to key software technologies at Energid, SimQuest, and the GM R&D Center. YJ earned his Ph.D. in Mechanical Engineering from Rensselaer Polytechnic Institute and his Master's from the Korea Advanced Institute of Science & Technology.

      Christoph Kammer is a senior application engineer at MathWorks in Switzerland. He supports customers in the robotics and autonomous systems domain in the areas of control and optimization, virtual scenario simulation and digital twinning. Christoph has a master’s degree in Mechanical Engineering from ETH Zürich and a Ph.D. in Electrical Engineering from EPFL, where he specialized in control design and the control and modelling of electromechanical systems and power systems.

      Cameron Stabile is a senior developer for the Navigation Toolbox working out of the US Lakeside office. He has supported and authored features such as the 2D and 3D map objects, collision-checking, highway planning tools, and search/sample/control-based planners. More recently he has been focused on offroad navigation applications, resulting in the open-pit mine reference application. Cameron has a master’s degree in Mechanical Engineering from Carnegie Mellon where he focused on navigation and motion planning for tracked AGVs & robotic manipulators for the purpose of autonomous shipbuilding.

      Recorded: 15 May 2024

      Hello, everyone. Welcome to our webinar and thank you for joining us today. My name is YJ Lim. I am a technical product manager at MathWorks. In industries like construction, mining, and agriculture, autonomy and intelligence are increasingly crucial for ensuring safety on the job and boosting productivity. However, creating autonomous systems is a complex challenge that demands a wide range of skills, from physical modeling and sensing to perception and control.

      Today, our topic is shaping a path to offroad autonomy using MATLAB. Joining me in this presentation will be my colleagues, Christoph and Cameron. Here is a quick introduction of us. I am the Principal Technical Product Manager of Robotics and Autonomous Systems at MathWorks, located in Natick, Massachusetts. I have dedicated over 20 years to the development of robotics and autonomous systems. Before my time at MathWorks, I was involved in various robotics projects at companies such as Vecna Robotics, Hstar, Energid, and the GM R&D Center.

      My main areas of expertise are cobots, specifically those utilizing compliant actuators, and autonomous mobile robots. I have been with MathWorks for 6 years, overseeing a few robotics products. Now, I will hand it over to Christoph for his introduction.

      So, yes, thank you very much, YJ. So my name is Christoph Kammer. I'm an Application Engineer at The MathWorks in Switzerland for five years now and I've been focusing a lot on simulation frameworks for autonomous systems, co-simulation with other tools and topics like this, and I also have a PhD in controls. So that's another big hobby of mine. On to Cameron.

      Thank you, Christoph. My name is Cameron Stabile. I'm a Senior Developer working in the Navigation Toolbox. As a background, I have a focus in autonomous mobile robots. I've worked on tracked robots and autonomous welding in grad school. And since coming to MathWorks, I've been focused on motion-- path and motion planners as well as environment representations used for those problems. And I will later talk about how to build an autonomous navigation stack for a haul truck in a pit mine.

      Thank you, Cameron. All right, here is what we want to discuss today. I will kick things off by discussing some of the current industry trends and challenges. Following that, Christoph and Cameron will take you through designing, simulating, and testing off-road vehicles.

      They will highlight two specific use cases: first, the earthmoving workflow for an excavator, and second, the navigation workflow for a pit mining truck. Finally, I will wrap up our webinar with some calls to action. So let's get started with an introduction.

      In this webinar, when we talk about off-road vehicles, we are referring to those used in the construction, mining, and agriculture sectors. These industries are undergoing a significant transformation driven by the increasing adoption of autonomy and intelligence. Let's look at some real-world examples from our customers.

      First up, Sumitomo Heavy Industries uses modular design with Simulink to develop an embedded MPC for hydraulic actuators. Caterpillar developed a big data infrastructure, which is a database for searching and retrieving labeled ground truth from construction sites. And CNH Industrial has developed an intelligent harvester using 3D cameras and automated control systems.

      So these examples show how today's smart off-road vehicles are integrating autonomy and intelligence to optimize processes, ensure safety, and enhance sustainability. They are integrating advanced machinery and technology. Furthermore, the integration of AI, big data, and IoT is transforming off-road vehicle operation.

      Additionally, there is a growing trend of increasing connectivity between components and users. Another significant aspect is the adoption of digital transformation at every stage of research and development and operations. This slide summarizes the trends previously discussed. Imagine autonomous off-road vehicles enhancing efficiency and safety on a construction site.

      IoT's role is crucial, enabling real-time monitoring and predictive maintenance through extensive data collection. Cloud computing, AI, and big data work together to improve processes and quality. Finally, insights from AI and big data inform decision making and adjustments. In this webinar today, our focus is on autonomous systems.

      Let's quickly look into why these industries are all leaning toward smarter machines. First up, the workforce. It is getting tough to find skilled workers, and many are nearing retirement. Secondly, many tasks carry a high risk of fatalities, and some require meticulous precision.

      Technology-wise, the automotive industry has shown many successful implementations of autonomy. Lastly, economics. Despite the high initial investment, the return on investment is compelling. Lower labor costs, fewer errors, and staying ahead of the competition are significant advantages. So those are the four main drivers behind the shift toward smarter machines.

      For off-road vehicle operation, activity can be broadly categorized into three tasks: survey, transportation, and earth moving and handling. To optimize and improve these tasks, technology provides us with powerful tools in automation and autonomy, enabled by three technology elements: perception, planning, and control.

      These now form the core elements of autonomous off-road vehicles. These technologies enable off-road machinery to intelligently perceive their surroundings, track dynamic objects, and strategically navigate their courses. Around these core elements, we have the vehicle platform.

      Next, it is crucial to deploy your implementation to the platform. Finally, through system-level modeling and simulation, we can ensure that our off-road machinery aligns with customer requirements and expectations, delivering unparalleled performance and reliability.

      But we will also encounter a variety of challenges along the way, such as: how do I accurately model the actuators for my excavator? What is the best algorithm to optimize the operation of my pit mining truck? How good do my sensors need to be to accurately localize and plan the motion for my backhoe?

      In order to address these challenges and validate your system, simulation is a powerful tool for analyzing and optimizing your design, especially when your system is complex. For many workflows, the physical system is replaced with a digital twin for simulation. Simulation plays a crucial role through all stages of the development process and the entire product lifecycle, each requiring different levels of fidelity.

      Simulation at different levels of fidelity includes multibody dynamics modeling and simulation, kinematic models and motion simulation, and high-level task scenario simulation. MATLAB is a centralized development environment with Model-Based Design. It enables you to progressively simulate your system, from low-level specific algorithms to high-fidelity system integration, minimizing risk and development time. This process can be iterated as early and as often as possible to validate and verify your system design using a digital twin.

      Christoph and Cameron will talk about the details of these for digital modeling, autonomous algorithm development and testing, and validating tasks in scenario simulation. They will also showcase both earthmoving and transport use cases. Now, I would like to turn it over to Christoph to take you through the details of designing, simulating, and testing the vehicles.

      Thank you very much, YJ. Yes, so let's move on to the hands-on examples here. We have two case studies. We have an excavator, an autonomous excavator, and we have a pit mining truck, which tries to autonomously navigate. And we want to guide you through all the steps of these examples.

      So in the following section, we will cover three overarching topics of off-road autonomy. The first topic is physical modeling and dynamic simulation of our machines. For example, mechanical or hydraulic models of an excavator, modeling contacts between wheels and tracks and the ground, modeling dynamics of electric motors or batteries, and so on.

      The second topic is kinematic motion simulation. For example, trajectory planning for the excavator arm, or terrain-aware off-road path planning for the pit mine, which mostly relies on more abstract representations like Ackermann models for steering or task-space motion models. And finally, we're going to co-simulate our physical and kinematic simulations with a 3D environment. We're going to use it to model sensors and to do high-level task and scenario simulation.

      And when we put all of these things together, it's going to look a bit like this. So the final product will be the path planner to plan the trajectory for our excavator arm. And then, we close the loop. We have a physical model, which does the mechanical and hydraulic dynamic simulation, and we connect it to our 3D simulation environment to have a visualization of what it looks like in reality.

      So first part, let's go step by step, multi-body dynamic simulation. I want to first give a shout out to a bunch of really advanced, really cool reference applications which we've been developing for this off-road space for heavy machinery. All of these examples are available to the public. So you can just download them.

      They form a really good basis to start your own modeling. They provide very good reference examples for an excavator, wheel loader, and tracked vehicles. They also go beyond simply the physical modeling; they also offer tools for optimization and design. For example, the wheel loader example also has functionality to optimize the continuously variable transmission. And we'll see, for the excavator, what else you can do beyond modeling and simulating the dynamics in the next few slides.

      So first question is, how do we get to our model in Simulink? We start from a CAD model. You see on the left here, this is from SOLIDWORKS. We have import workflows for pretty much any CAD tool out there. For SOLIDWORKS, there's a simple plugin which you can call and it's going to transfer your CAD model into a Simscape multibody model. So it's going to port over all your parts. It's going to grab the meshes. It's going to use the mates in the CAD model to create the joints.

      And it saves a lot of work, essentially. It's not quite a one-step process. You still need to do some manual processing. But it saves a lot of work compared to starting from a blank canvas. If you want to learn more about how this workflow actually goes, here's a small QR code and a video by some colleagues of ours. They demonstrate really nicely how these import workflows function.
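As a minimal sketch of this import step (the file name is a placeholder, assuming the Simscape Multibody Link plugin has already exported the CAD assembly to XML):

```matlab
% Import a SOLIDWORKS assembly exported via the Simscape Multibody Link
% plugin. 'excavatorAssembly.xml' is a hypothetical file name.
smimport('excavatorAssembly.xml');   % generates a Simscape Multibody model
% The importer creates bodies from the CAD parts, grabs the meshes, and
% turns the CAD mates into joints; some manual cleanup usually follows.
```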

      And once you have this model imported, you can do a bunch of things. You can test integrated hydraulic, mechanical, electrical designs. You can review the machine behavior. So there's a small 3D representation of our excavator down here. And of course, you can verify the machine performance. So here, for example, we're looking at the various forces acting on our swing, boom, and stick.

      We don't only have the physical model, though. We can also extend it with the controller and our operator inputs, and we can create a simulation model that really matches the real system architecture. The physical model contains the mechanical parts, which we've gotten from CAD. It also incorporates hydraulic actuators, so we have the pumps and the valves, which are used to activate the cylinders.

      And it can be used, for example, to calculate performance metrics following ISO standards. That's what's being done in our reference example. There is an international standard that specifies how to measure digging forces, for example. It enables easier comparison between different excavators; these forces are usually reported on the data sheets, so that's where we can get them from.

      So we created this nice app here, which helps the user to calculate and report those digging forces so you can play with the geometry of your excavator arm. You can change the location of the pins. And then, it's going to run this physical dynamics simulation. It's going to calculate the necessary forces according to the standard and display them in this nice overview.

      Another application of the physical model would be efficiency calculations. For example, gravity can be used to essentially save energy when the excavator arm is moving down. So there are periods, for example, here of negative energy, which we see in this graph on the right hand side. This shows where gravity helps to lower the arm and we don't need to expend extra energy for that. And we can do some detailed calculations which show that the efficiency is, indeed, higher if we use regeneration valves. And it also shows us by how much so we can perform trade-off studies or optimize the design again.

      Another very relevant topic for off-road machinery is simulation of ground contact. And here, we show one example of how to do that by co-simulating with discrete element methods. On the right-hand side, you see this is done with the ThreeParticle simulation tool, which is software that allows you to do high-fidelity simulation of contact between our shovel and rocks, gravel, sand, and things like this.

      Now, discrete element method calculations usually take a very long time. So here, this was computed offline, and then essentially a playback was co-simulated with Simulink to enable a real-time simulation of ground contact. So that's it for physical modeling. And now, let's move on to the second part, which is kinematic motion simulation.

      Again, the first question is, how do we get a representation of our excavator which we can use for motion simulation? Here, what we leveraged is a URDF exporter from SOLIDWORKS. It's a small open-source tool which allows us to export the kinematic motion model. On the MATLAB side, we call the importrobot function, and we get this mathematical representation alongside, again, the meshes of our parts, so we can also visualize everything.
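A minimal sketch of that import on the MATLAB side, assuming the exporter produced a file named excavator.urdf (hypothetical):

```matlab
% Import the URDF kinematic model into a rigidBodyTree object.
excavator = importrobot('excavator.urdf');
excavator.DataFormat = 'row';   % joint configurations as row vectors
showdetails(excavator)          % list bodies, joints, and parent links
show(excavator)                 % visualize with the exported meshes
```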

      Now, in order to do motion planning, first, of course, we need a map of our environment. Otherwise, we don't know what we might collide with. So here we have LiDAR data. I'll go into detail a bit later on how we acquired this LiDAR data. But you can see we have four LiDARs, and we use MATLAB to process those point cloud measurements. We end up with some houses around us, some dumpsters, and the excavator itself, because the LiDAR also picked that up.

      So the first step was to use the CAD mesh from the import to segment the excavator out of our point cloud. Then, we convert this point cloud to a 3D occupancy map. And again, we can now visually insert our mesh and confirm that everything lines up. And then, we can plan and visualize a collision-free trajectory for our excavator that doesn't hit anything in our environment.
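The point-cloud-to-occupancy-map step might look roughly like this; synthetic points stand in for the segmented LiDAR data, and the resolution and sensor pose are illustrative:

```matlab
% Build a 3D occupancy map from (already segmented) LiDAR points.
map = occupancyMap3D(10);                    % 10 cells per meter (0.1 m voxels)
pts = [20*rand(1000,2), 5*rand(1000,1)];     % stand-in xyz points (meters)
sensorPose = [0 0 2 1 0 0 0];                % [x y z qw qx qy qz] of the LiDAR
insertPointCloud(map, sensorPose, pts, 50);  % ray-cast insert, 50 m max range
show(map)
```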

      So for motion planning, there are a few main concepts here. It involves inverse kinematics solvers, collision modeling, and then collision-free motion planners. And in MATLAB, there are tools for all of these. The heavy lifting here is done, I think, by the collision-free motion planners. And again, there are two main approaches. There are sampling-based planners; we call out here the specific function manipulatorRRT, just for your reference. And there are optimization-based planners to perform optimization-based trajectory planning.

      And you can also customize the planners for more complex systems. So if you have a more specific application, you can insert your own state space and state validator functions as well. And yes, we used manipulatorRRT plus manipulatorCHOMP to generate this collision-free trajectory. And now, we move on to the third part, which is the high-level scenario simulation.
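A hedged sketch of the sampling-based planning step; a shipped manipulator model stands in for the excavator arm, and the obstacle is a toy box, not the webinar's actual scene:

```matlab
% Collision-free joint-space planning with manipulatorRRT.
robot = loadrobot("kinovaGen3", DataFormat="row");  % stand-in for the arm
obstacle = collisionBox(0.4, 0.4, 0.4);             % toy environment object
obstacle.Pose = trvec2tform([0.6 0.2 0.3]);
rrt = manipulatorRRT(robot, {obstacle});
startConfig = homeConfiguration(robot);
goalConfig = startConfig;
goalConfig(1) = pi/3;                               % illustrative goal pose
path = plan(rrt, startConfig, goalConfig);          % matrix of waypoints
path = interpolate(rrt, path);                      % densify for playback
```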

      So why do we actually do 3D scenario simulation? There are a bunch of reasons from our point of view. Visuals are universal; that's the first point. It's easy to debug issues by looking at how it moves. And it can really help to sell your concept or your product. So that's a point that's not to be overlooked.

      Second, 3D environments are very data rich. So you can simulate advanced sensors, cameras, radar, LiDAR. Or ray tracing, in general, enables other sensor simulations as well. And without the 3D environment, this is very, very hard to reproduce. And finally, it allows you to do a simulation of complex scenarios. So you can create scenes with multiple actors which are moving. And you can test and validate the complete system, including the sensors, including the autonomous tech in simulation.

      As far as MATLAB goes, we have a direct link to Unreal Engine. So we support closed loop deterministic simulations where MATLAB Simulink usually takes the physics of the main actors as well as all the logic, perception, planning, controls. And the Unreal Engine does the rendering, the physics of not so important actors, and collision detection and sends back information to MATLAB.

      Now, this is a lock step co-simulation. That's a big advantage here. Solvers take turns. So it's deterministic. If you rerun the simulation, you will always get the same results and you don't run into funny issues with communication delays and things like this.

      However, if you have another environment that is not Unreal, we also support other simulators. You can use ROS, ROS 2, or TCP/UDP connectors to connect to pretty much anything that's out there: Unity, CARLA, NVIDIA Isaac Gym. And you also have a pretty broad range of reference examples that help you get started if you're interested in connecting to any of these environments.

      Now, for our excavator here, how do we do this? Again, we start from a CAD model because we really wanted to have the same ground truth for all of our different applications. We export this CAD model. There is an exporter from SOLIDWORKS that exports directly to Unreal called Datasmith. And again, this is a pretty seamless process as long as your coordinate systems line up. So we could get this 3D model into Unreal.

      And then, the first thing-- first proof of concept was to just co-simulate our physical model with our 3D simulation model. So the physics are all done in MATLAB Simulink and we just impose the positions, the angles of our excavator on our 3D model. And you can see on the left hand side, bottom left, this is a more abstract mesh representation and how it lines up with the model in the 3D environment.

      Now, for motion planning, we need LiDAR data. And this is how I acquired the LiDAR data. I placed four LiDAR sensors in my virtual environment and I just got the point clouds into MATLAB. And you see on the left hand side, there's a small list of other types of sensors which you can simulate using our tools.

      And again, putting everything together, we get this complete system simulation where you have a path planner that takes LiDAR data from our virtual 3D scene. It then does a physical model dynamic simulation. There's controllers in the loop and back to the 3D representation. And something we didn't do yet, but of course you could do that as well, you could now also incorporate the LiDAR data in the loop to do dynamic obstacle avoidance, for example, and many more very advanced algorithms.

      And that's the end of my part. So I've shown you how to autonomously move the earth into the dumpster, and Cameron is going to show you how to move the dumpster from A to B. So Cameron, show us where A and B are in your example, please.

      So thank you, Christoph. As I mentioned, what we'll now be focusing on is the autonomous navigation task for hauling material between multiple locations within that mine. So at a high level, what we're trying to do is pretty simple. We're trying to get from point A to point B. So you have-- you've recently loaded up your haul truck or you've recently dropped off and now you need to go to the counterpart destination.

      And as you can see in a scene like this, that path might not be simple. It's not a straight line. So where do we start? What subsystems and tasks do we need? That might be the best way to start breaking this problem down. So in the reference application that we constructed, we broke this down into four pieces.

      The first thing we wanted to do is we wanted to determine what was safe versus dangerous terrain and to classify this so that downstream planners could use this to safely navigate from point A to point B. Once we had this representation, we needed to develop planners to actually find those feasible paths between those two points.

      And then, we need to create controllers for following the path both safely by avoiding obstacles and by taking into account the kinematics of the vehicle. And lastly, a framework for tying it all together. So we'll start with the terrain classification.

      In a lot of autonomous applications, you have road networks which largely dictate where you can or can't go. When you are using Google Maps to get from point A to point B, you're using roads that exist; you're not really going off-road. The challenge when going to these off-road terrains is that you don't necessarily have those road networks, although they may exist.

      So what we want to do is first start with point cloud data taken from USGS, United States Geological Survey. So we have raw point cloud data that's basically generated by aerial LiDAR. We can stitch it together. And then, from there, we want to extract or we want to infer information from this raw format.

      So what we do is we first convert that point cloud to an elevation map. And we also fill in any holes that are missing using interpolants. From there, we can use a gradient and a threshold to determine what is too steep to traverse. And by doing this, you can see the beginnings of a road network start to emerge. Once done, we do some morphological operations on this binary mask to basically clean up the image and further solidify those boundaries of safe versus dangerous.

      And then, from there, some additional morphological operations help us to identify, basically, the end points or branch points within this network or within this space. So basically, we shrink that free space image down to a bunch of line segments. And now, this is starting to look much more like something you might see out of Google Maps.

      Lastly, we run an algorithm which extracts the edges from this image. So now we have a big set of edges, or a big set of roads, in our space. But it's not connected; right now, it's just a bunch of loose paths. Next, we need to find a way to actually use that information to go from point A to point B. And so we will focus now on the planners.
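The terrain-classification steps above can be sketched roughly as follows; the elevation data, slope threshold, and structuring-element sizes are illustrative assumptions, not the reference application's actual values:

```matlab
% Elevation map -> slope mask -> cleaned free space -> road skeleton.
elev = 10*peaks(200);                     % stand-in elevation map (meters)
cellSize = 1;                             % 1 m grid spacing
[dx, dy] = gradient(elev, cellSize);
slopeDeg = atand(hypot(dx, dy));          % terrain slope in degrees
safe = slopeDeg < 15;                     % traversable below 15 degrees
safe = bwareaopen(safe, 50);              % morphological cleanup: drop specks
safe = imclose(safe, strel("disk", 3));   % close small gaps in the mask
skeleton = bwskel(safe);                  % shrink free space to centerlines
branchPts = bwmorph(skeleton, "branchpoints");  % junctions of the network
endPts = bwmorph(skeleton, "endpoints");        % endpoints of loose paths
```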

      One difficulty that you may encounter when you start working in autonomous motion planning is that there's quite a few options. There's quite a few ways to tackle this problem. So where do you start? Well, first you sort of have to choose what style of planner you want. And they can fall into a bunch of different categories, as Christoph started to mention earlier.

      There might be different constraints you have on the planner types, such as geometric planner versus one that takes into account vehicle kinematics versus one that takes into account dynamics. You also can break this into different types of planners, how they operate. So there are sample-based planners, search-based planners, optimization-based planners, to name a few.

      Additionally, you could have planners that only need to work offline, coverage planners. Or you need online planners, which need to be working in real-time repeatedly or at a moment's notice. And then, there's a question of determinism. So do you need a planner that gives you the same result every time, or does a probabilistic solution suffice?

      So one thing that we can do is look at the doc. In the Navigation Toolbox documentation, we have a table that highlights the differences between multiple planners that we ship, breaking down the type of search, what it's good for, how you can customize it, and plenty more information. So this is a good place to get started.

      So when it comes to our problem, what are our requirements? If we had our ideal planner, what would make it ideal? We might want it to be performant. We might want it to take into account the kinematics of our vehicle. We might need it to be optimal, or maybe it doesn't need to be optimal.

      And then, is this planner supposed to operate on a global problem, or is this just a local problem? So why don't we propose three separate planners here and take a look at how they operate. plannerAStar is a graph-based planner. You construct a graph that is known a priori, and you're just trying to find a route from point A to point B within that graph.

      plannerHybridAStar falls into the search-based planning category. It's basically using pre-canned motion primitives to explore space, and that state space is commonly SE(2). And then, you have controllerTEB, which is a local planner. So if we look at plannerAStar, in terms of performance, it's very fast. Graph-based planners are basically looking through a predefined set of edges, so it's very quick, very lightweight.

      It doesn't take into account the kinematics of your vehicle. And you could say that it's optimal. It's optimal within the graph that you feed it, but it's not universally optimal or it's not optimal in the sense of an MPC planner or an optimization problem that maybe you're familiar with.

      If you compare the performance between plannerAStar, a graph-based planner, and plannerHybridAStar, the performance isn't going to be as good. But it does adhere to the kinematics of a vehicle. Again, it's not necessarily an optimization planner; it will be optimal on the set of primitives that are generated. And typically, that's good enough.

      Then lastly, we have controllerTEB, which is fast. It respects the kinematics of your vehicle, and it's a true optimization-based planner. It's actually going to tweak a bunch of planning variables and give you an optimal result. So the obvious answer here is controllerTEB, right? Not necessarily.

      If we go back to the global versus local distinction, we see that plannerAStar and plannerHybridAStar are global planners. They don't need to be initialized with a guess, and they're really meant for giving you a high-level route between two points. Whereas controllerTEB needs a reference path, and it's really only trying to find the best solution along that reference path within the vicinity of the vehicle.

      So that brings us to the question of which one is best, right? And the answer is all of them; you use each one where it fits. So in our navigation stack, what we decided on is an efficient high-level planner, which is our road network planner. This result is then appended by a terrain-aware on-ramp planner for getting onto and off of that road network, since the road network doesn't cover everything, as we saw earlier.

And then we have a robust local planner, which essentially loops between generating a local reference path along the global reference path and then following that reference path with a fast MPC-based controller. Lastly, once we reach the end of our road network route, we use our terrain-aware off-ramp planner to plan a path between the end point of the road network and the desired true destination.

So here we can see our graph-based route planner. Again, you can see the sparse nature of the graph within our space. But the goal here is basically just to provide a very fast initial route for going from point A to point B. Here, the mouse marks the start location, and we're trying to navigate all the way to the bottom of the mine.
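A minimal sketch of this kind of graph-based route query, assuming the navGraph/plannerAStar interface from recent Navigation Toolbox releases (the node positions and links below are made up for illustration):

```matlab
% A sparse road network as a graph: each node is an [x y] waypoint on
% the mine's road network, each link connects two node indices.
states = [0 0; 50 10; 120 40; 160 90];      % illustrative node positions [m]
links  = [1 2; 2 3; 3 4];                   % edges between node indices
roadNetwork = navGraph(states, links);

% A* over the graph: very fast, because it only searches the
% predefined edges rather than free space.
routePlanner = plannerAStar(roadNetwork);
route = plan(routePlanner, 1, 4);           % start at node 1, goal at node 4
```

Because the search is restricted to the graph, the route is optimal within that graph but says nothing about the terrain between nodes, which is why the on-ramp and off-ramp planners are still needed.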

As we need to get onto or off of that path generated by the graph planner, we use our terrain-aware free-space planner. Basically, this is plannerHybridAStar with an additional customized cost function, which takes in that slope mask that we showed earlier and can penalize trajectories that go along steep terrain. We use that to guide the path and give a more optimal solution.
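As a rough sketch of how a slope mask can be derived from elevation data (the elevation grid, grid spacing, and the 25-degree limit here are illustrative stand-ins; the webinar's actual implementation feeds the mask into a custom cost function rather than the hard mask shown):

```matlab
% Compute per-cell terrain slope from an elevation grid, then mark
% cells above a slope limit as untraversable.
Z = 10 * peaks(200);                        % stand-in elevation grid [m]
h = 1;                                      % grid spacing [m]
[gx, gy] = gradient(Z, h);                  % elevation gradients
slopeDeg = atand(hypot(gx, gy));            % slope angle per cell [deg]

steep = slopeDeg > 25;                      % illustrative traversability limit
slopeMap = binaryOccupancyMap(steep, 1/h);  % steep cells treated as occupied
```

Marking steep cells as occupied makes the free-space planner avoid them outright; a soft cost penalty, as described above, instead discourages steep terrain while still allowing it when no better route exists.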

So now we have our global path, but we need to be able to follow it. That brings us to our local planner and controller. Our local planner is responsible for creating an optimal path in the local vicinity of the vehicle, and then we follow that path using a nonlinear MPC controller. In the video above, you can see how we have a fairly jagged reference path coming from our global planner, and controllerTEB is providing a smooth reference path in the vicinity of the vehicle for MPC to generate control signals along.

And MPC here is used to ensure we get a smooth path and to minimize any drift that we may pick up due to the real-world interaction of our vehicle with the terrain.
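As a sketch of what a nonlinear MPC tracker looks like with the Model Predictive Control Toolbox (the unicycle model, horizons, and weights below are illustrative assumptions, not the webinar's exact formulation):

```matlab
% Nonlinear MPC for path following: states [x y theta], inputs [v omega].
nx = 3; ny = 3; nu = 2;
nlobj = nlmpc(nx, ny, nu);
nlobj.Ts = 0.1;                             % controller sample time [s]
nlobj.PredictionHorizon = 20;
nlobj.ControlHorizon = 5;

% Simple continuous-time unicycle model as the prediction model.
nlobj.Model.StateFcn = @(x, u) [u(1)*cos(x(3)); u(1)*sin(x(3)); u(2)];
nlobj.Model.IsContinuousTime = true;
nlobj.Weights.OutputVariables = [1 1 0.1];  % track x, y tightly; heading loosely

% One control step: track the next pose on the smoothed local path.
x0 = [0; 0; 0];                             % current vehicle pose
lastMV = [0; 0];                            % previous [v; omega]
ref = [1 0 0];                              % next reference pose from controllerTEB
[mv, info] = nlmpcmove(nlobj, x0, lastMV, ref);
```

In the closed loop described here, `ref` would be refreshed each cycle from the smooth local path, and `mv` becomes the velocity command sent to the vehicle.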

So now that we have all the different pieces of our navigation stack, we need a way to integrate them together. The last step is tying these together, and what we use here is Simulink and Stateflow. In the reference application, if you open up the last example, this will open a Simulink model where we have essentially separated the different planning pieces into subsystems.

We use a Stateflow chart to transition between all of the different states. In the latest release, we also added support for Unreal. So here, we'll see a demo where we change the simulation mode to Unreal. This launches our open-pit mine scene, which uses all of the navigation stack elements that we described earlier to plan a path.
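The supervisory logic lives in a Stateflow chart in the reference application; as a rough stand-in, the same transition structure can be sketched as plain MATLAB (the state names mirror the navigation stack above, and the counter is a placeholder for the real "reached end of route" condition):

```matlab
% Minimal state machine: on-ramp -> follow route -> off-ramp -> done.
state = "OnRamp";
ticks = 0;
while state ~= "Done"
    switch state
        case "OnRamp"       % terrain-aware free-space plan onto the road network
            state = "FollowRoute";
        case "FollowRoute"  % loop: TEB local path + MPC tracking
            ticks = ticks + 1;
            if ticks >= 5   % stand-in for reaching the end of the graph route
                state = "OffRamp";
            end
        case "OffRamp"      % terrain-aware free-space plan to the true destination
            state = "Done";
    end
end
```

In the actual model, each case corresponds to a Simulink subsystem, and the chart arbitrates which planner is active at any time.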

On the left-hand side, we can see a signed distance map of the region surrounding the vehicle: the farther you get from obstacles, the larger that distance value. And you can see how controllerTEB essentially repulses the vehicle away from local obstacles while also generating a path that adheres to the nonholonomic constraints of our vehicle.
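As a small illustrative sketch of such a signed distance field, built here with `bwdist` from the Image Processing Toolbox (the Navigation Toolbox also offers a dedicated signedDistanceMap object; the grid and obstacle below are made up):

```matlab
% Signed distance from an occupancy grid: positive in free space
% (distance to the nearest obstacle), negative inside obstacles.
occ = false(50, 50);
occ(20:30, 20:30) = true;                   % one illustrative block obstacle
sdf = bwdist(occ) - bwdist(~occ);           % signed distance in cells
```

A local planner can read this field cheaply: large positive values mean plenty of clearance, and the gradient points away from obstacles, which is effectively how the repulsion behavior described above works.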

From there, the path that's generated is followed using the MPC controller, and the vehicle commands are sent to an agent in Unreal, which uses them to update the vehicle using Unreal's actual physics. Lastly, we retrieve the current pose of that vehicle and feed it back into our planning stack to close the loop. And here, we can see the vehicle operating using actual physics.

So lastly, I'll hand off to YJ to give a final summary and call to action. Thank you.

Thank you, Cameron. So we have reached the end of our webinar today. Let me quickly recap the key points that we have covered and leave you with some actions you can take moving forward.

I trust you have gained a good understanding of how MATLAB and Simulink offer a cohesive environment for developing autonomous off-road vehicle applications. In particular, simulation is crucial in the development and testing of autonomous vehicle systems: for designing physical systems, for developing and testing algorithms, and for validating tasks through scenario simulation.

We offer a comprehensive suite of tools for each aspect of development. While today's discussion didn't cover AI, it is important to note that MATLAB and Simulink provide a complete array of solutions for deep learning and computer vision. We also offer the RoadRunner product family, which lets you create 3D scenes for simulation and testing.

In this webinar today, we focused largely on modeling and simulation, but I want to highlight the essential need to create a functional safety design for the vehicles and their control systems. This design is crucial for ensuring reliability, not just under normal conditions but also in the face of faults. To achieve this goal, your models need to be rigorously verified against system requirements, which includes conducting coverage tests.

Additionally, upon deployment, the generated software code must be verified to ensure traceability back to the original model. In the final step, the entire system must be verified upon integration. Model-based design provides an extensive range of verification and validation methods to comply with various standards.

Engineers use model-based design with MATLAB and Simulink to design complex embedded systems and generate production-quality code. MathWorks tools enhance model-based design through simulation, testing, and analysis specifically tailored for functional safety design.

We offer a detailed seminar series on each topic discussed today. If you are interested in diving deeper, please let us know. We provide support through multiple channels, such as training sessions, consulting services, workshops, and technical assistance. If you have any questions or need help with your particular project, please feel free to contact us. We are here to help and always glad to support you.

Thank you for your attention. Now, please post your questions in the Q&A panel. We will take a few moments to review them and then return to address your questions.
