Design Lidar-Based SLAM Using Unreal Engine Simulation Environment - MATLAB
    Design Lidar-Based SLAM Using Unreal Engine Simulation Environment

Learn how to design a lidar SLAM (Simultaneous Localization and Mapping) algorithm using synthetic lidar data recorded from a 3D environment. You can integrate with the photorealistic visualization capabilities of Unreal Engine® by dragging and dropping out-of-the-box 3D Simulation blocks in Simulink. Discover how to visualize the recorded data, develop registration and mapping algorithms for perception, correct for drift using pose graph optimization, and achieve a cleaner, more accurate point cloud map. Interested in lidar processing? Explore MathWorks' Lidar Toolbox for comprehensive lidar data analysis.

    Key takeaways:

    • How to design a lidar SLAM algorithm using synthetic lidar data
    • How to integrate Simulink and Unreal Engine for 3D simulation
    • Techniques to visualize, process, and optimize lidar data for accurate mapping and localization

    Published: 11 Jan 2021

    Hey, everyone. This is Pitambar from MathWorks. In this video, we'll go over how to design a lidar SLAM algorithm using synthetic lidar data that's recorded from a 3D environment. We'll be using Unreal Engine, and I'm going to base this off the docs example that you see on the screen right now. One other thing: you're going to need these toolboxes if you want to follow along. Let's jump into it.

    First, we're going to start by loading in a reference path from a recorded drive. And after loading this in, we'll visualize this path in a figure and overlay the X and Y points that make up this reference path. So let's run it so we can see it. And if we zoom into this blue line in the figure, we can see these reference path points more closely. We're going to be using Simulink to perform our simulation. So let's run this next section to open up our model, and once it opens up, I'll walk you through it.
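As a rough sketch, the load-and-plot step might look like this in MATLAB (the file and variable names here are assumptions for illustration, not necessarily the exact ones from the example):

```matlab
% Load a recorded reference path (file/variable names are assumed).
data = load('refPathPoses.mat');   % recorded drive with X, Y, and yaw
x = data.refPosesX(:,2);           % column 1 is time, column 2 is position
y = data.refPosesY(:,2);

% Visualize the path and overlay the individual reference points.
figure
plot(x, y, 'b-', 'LineWidth', 1.5)
hold on
scatter(x, y, 10, 'filled')
xlabel('X (m)'), ylabel('Y (m)')
title('Reference Path from Recorded Drive')
```

Zooming in on the plotted line shows the individual X-Y points that make up the path, as in the video.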

    I'm going to break this down into the four groups that we see on the screen and explain each one. So we'll start here with our Simulation 3D Scene Configuration block. This configures the scene to the US city block that we saw earlier. Let's move to the bottom, where we have our Simulation 3D Vehicle with Ground Following block. This places a vehicle in the scene and moves it through the trajectory from the X, Y, and yaw data that we loaded in at the very beginning.

    Up on top, we have the Simulation 3D Lidar block. This places a lidar sensor on top of our vehicle. And underneath it, we have an INS measurement block. This is a From Workspace block that loads simulated measurements from an inertial navigation sensor. The INS data is saved in a MAT-file in our current folder. Finally, we have a subsystem on the right, which is responsible for recording and visualizing the synthetic lidar data. So with that in mind, let's go ahead and run our model.

    So when we finally run the model, we get a visualization of the car following the waypoints in Unreal Engine. We also get the visualized outputs of our record and visualize subsystem. There is the INS sensor display on top and the recorded point cloud data on the bottom. Now, it'll take some time to travel through the entire trajectory, so let's cut back to our script for the next step, where we'll use the recorded data to develop our perception algorithm.

    All right, we're back in MATLAB, and I'm going to take a minute to explain the flow of this code section. So first, we're going to run this function, which does a couple of things. It pre-processes each incoming point cloud to remove the ground plane and the ego vehicle. It then registers an incoming point cloud with the previous one. Then it uses the estimated transformation to transform this point cloud to the frame of reference of the map, and then it updates the pcviewset, which is an object that manages point cloud data in MATLAB.
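A minimal sketch of one iteration of that flow, using standard Computer Vision Toolbox functions (the helper name, grid step, and variable names are assumptions, not the example's exact code):

```matlab
% One registration-and-mapping step (illustrative only).
ptCloud = helperPreprocess(ptCloudIn);            % assumed helper: remove ground plane and ego vehicle
tform   = pcregisterndt(ptCloud, prevPtCloud, 1); % register against the previous cloud (1 m grid step)
absTform = rigidtform3d(absTform.A * tform.A);    % accumulate into the map's frame of reference
ptCloudMap = pctransform(ptCloud, absTform);      % transform into the map frame

% Track the data in a pcviewset, which manages point clouds and poses.
vSet = addView(vSet, viewId, absTform, 'PointCloud', ptCloud);
vSet = addConnection(vSet, viewId-1, viewId, tform);
```

Recording both the absolute pose per view and the relative transform per connection is what later lets the pose graph be built and optimized.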

    Next, we'll configure our map builder to detect loop closures. Now let me scroll down to this last part. Finally, we're going to loop through our INS data, which was recorded from the simulated INS measurement. We estimate a transform, update our lidar map with this new frame, and then update our top-view display.
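Sketched in code, that loop over the INS data might look like the following (the map-builder object and helper names here are hypothetical stand-ins for the example's actual helpers):

```matlab
% Hypothetical mapping loop over the recorded INS measurements.
for n = 1 : height(insData)
    % Use the INS reading to estimate an initial transform for registration.
    initTform = helperEstimateTformFromINS(insData(n,:));  % assumed helper

    % Register the new frame and add it to the lidar map.
    updateMap(mapBuilder, ptClouds{n}, initTform);

    % Refresh the top-view display of the accumulated map.
    updateDisplay(mapBuilder);
end
```

Seeding registration with the INS estimate gives the registration step a good initial guess, which speeds convergence and reduces misregistration.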

    OK, let's go ahead and run it so you can see what it looks like. We're now looking at the top-view display of the point cloud map. I'm going to fast-forward to the end, and you'll notice that we've accumulated some drift in our data. So let's go to the next section of our MATLAB script and correct for this using pose graph optimization. I'll go ahead and run this section and we can take a look at the results. The new pose graph, shown in orange, has corrected for the drift.
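For reference, the optimization step can be sketched with pcviewset's pose-graph support (variable names are assumptions; optimizePoseGraph requires Navigation Toolbox):

```matlab
% Build a pose graph from the view set, including loop-closure edges.
G = createPoseGraph(vSet);

% Optimize the graph to correct accumulated drift (Navigation Toolbox).
optimG = optimizePoseGraph(G);

% Push the optimized poses back into the view set and rebuild the map.
vSetOptim  = updateView(vSet, optimG.Nodes);
ptCloudMap = pccat(vSetOptim.Views.PointCloud);
```

The loop-closure connections are what anchor the graph: without them, optimization has nothing to pull the drifted trajectory back toward.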

    So finally, with this correction in place, let's look again at our updated point cloud map. It's much cleaner, and it resembles the initial loop that we saw in the overhead view. And that's where I'm going to wrap up this video. So hopefully, you have a better understanding of how to design a lidar SLAM algorithm using synthetic data from Unreal Engine. Now, if lidar processing is something you're interested in, take a look at Lidar Toolbox. This toolbox has built-in algorithms for processing, visualizing, and analyzing lidar data. Otherwise, thanks for watching.
