How can I do centralized tracking with multiple sensors?

Aatif Sulaimani on 31 Jul 2023
Commented: Aatif Sulaimani on 1 Aug 2023
Hi, I would like to know how I can implement centralized tracking with multiple sensors. I have a camera and a radar, and I want to fuse them to create one track per target using trackerGNN, trackerJPDA, or trackerMHT. My track's state will be in world coordinates. Should the detection input to the tracker always be in the same frame as the track state (here, world coordinates)? I am converting my measurements from the camera and radar to world coordinates inside the measurement function of my trackingEKF, using a converted measurement model, so the measurements themselves stay in their respective sensor frames. Is it possible to achieve centralized tracking this way?
Thank you in advance,
AS

Answers (1)

recent works on 31 Jul 2023
Centralized tracking with multiple sensors involves fusing the measurements from different sensors into a common state representation and updating a single track for the target. Here's a general outline of how you can achieve centralized tracking using trackerGNN, trackerJPDA, and trackerMHT with multi-sensor data:
  1. Sensor Measurements and Frame Conversion:
  • Obtain measurements from your camera and radar sensors. These measurements will be in their respective sensor frames.
  • To express them in a common world coordinate frame, apply a coordinate transformation using each sensor's mounting pose (position and orientation) relative to the world frame. Note that this transformation is distinct from data association.
  • Data association is the separate step of establishing correspondences between measurements and targets. Techniques such as nearest neighbor, global nearest neighbor (e.g., via the Hungarian algorithm), or probabilistic methods such as Joint Probabilistic Data Association (JPDA) are used for this purpose.
  2. Centralized Tracking:
  • Run your selected centralized tracker (e.g., trackerGNN, trackerJPDA, trackerMHT) on the detections from all sensors.
  • These trackers update the target track's state and perform data association as necessary to handle sensor uncertainties and ambiguous assignments.
  3. Measurement Function and Conversion:
  • In your trackingEKF, the measurement function maps the predicted state to a measurement in the reporting sensor's frame.
  • Since your track state is in the world coordinate frame, the measurement function must project that world-frame state into each sensor's measurement space. This is exactly what a converted measurement model does, so the raw measurements can remain in their respective sensor frames.
  • The residuals computed this way are then used for data association and track updates in the centralized tracker.
  4. State and Measurement Covariance:
  • When fusing measurements from multiple sensors, it's crucial to account for the covariance of both the state and the measurements.
  • The covariances encode the uncertainty of the sensor measurements and of the track state, and they drive the normalized distance used in data association.
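As a sketch, the steps above can be combined in a single measurement function that serves both sensors by reading the frame information carried with each detection. The function name, state layout, and parameter fields here are illustrative assumptions, not fixed toolbox requirements:

```matlab
% Illustrative sketch: one measurement function for a world-frame
% constant-velocity state [x; vx; y; vy; z; vz]. The sensor's frame and
% (assumed) mounting pose arrive with each detection in its
% MeasurementParameters, so the same filter can update from camera or radar.
function z = worldToSensorMeas(state, params)
    pos = state([1 3 5]);                         % world-frame position
    % Rotate/translate into the sensor frame (pose fields are assumptions)
    rel = params.Orientation * (pos - params.OriginPosition);
    switch params.Frame
        case 'rectangular'                        % e.g. camera after 3-D lift
            z = rel;
        case 'spherical'                          % e.g. radar: [az; el; range]
            [az, el, r] = cart2sph(rel(1), rel(2), rel(3));
            z = [rad2deg(az); rad2deg(el); r];    % angles in degrees
    end
end
```

Note that the built-in cvmeas measurement model implements this same idea and already understands a measurement-parameters struct, so in practice a custom function may not be needed at all.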
In summary, centralized tracking with multi-sensor data requires a common world-frame state: either convert the measurements to the world frame before feeding the tracker, or, as in your approach, keep the measurements in their sensor frames and let each filter's measurement function project the world-frame state into the corresponding measurement space. Both routes let trackerGNN, trackerJPDA, or trackerMHT perform data association and track updates against a single world-frame track.
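For concreteness, here is a minimal end-to-end sketch with trackerGNN, assuming the Sensor Fusion and Tracking Toolbox. The time stamps, measurement values, and noise matrices are made up for illustration:

```matlab
% Minimal centralized-fusion sketch: one tracker, two sensors, one target.
tracker = trackerGNN( ...
    'FilterInitializationFcn', @initcvekf, ...   % world-frame EKF (constvel/cvmeas)
    'AssignmentThreshold', 30);

% Camera detection: 3-D position already expressed in the world frame
camDet = objectDetection(0.1, [10; 3; 0], ...
    'SensorIndex', 1, ...
    'MeasurementNoise', 0.5*eye(3));

% Radar detection of the same target, left in the radar's spherical frame;
% MeasurementParameters tell the filter how to interpret it
radarParams = struct('Frame', 'spherical', 'HasVelocity', false);
radarDet = objectDetection(0.1, [16.7; 0; 10.44], ...  % [az deg; el deg; range m]
    'SensorIndex', 2, ...
    'MeasurementNoise', diag([1 1 1]), ...
    'MeasurementParameters', radarParams);

% One call associates and fuses both detections into world-frame tracks
tracks = tracker({camDet, radarDet}, 0.1);
```

Each sensor gets a unique SensorIndex, and the tracker maintains a single world-frame track list regardless of which sensor produced each detection.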
3 comments
recent works on 31 Jul 2023
Your approach is generally on the right track for centralized multi-sensor tracking.
Aatif Sulaimani on 1 Aug 2023
That's great to know, but I still have a couple of questions:
1) Suppose I have only camera measurements at this time instant. The tracker will predict the state using the Kalman filter model for the camera, per the sensor index I specified. It will then look for detections to assign, and if there is a radar detection that satisfies the assignment cost, it should assign the track to this radar detection. For the next update step, will it switch to the radar Kalman filter model specified by the sensor index, since it should now update with radar measurements and compute the residual from this equation:
y = Z - h(x)
where:
y is the residual,
Z is the measurement,
h() is the measurement function that maps my state to measurement space,
x is my track state in world coordinates.
2) The normalized distance for assignment is calculated from the residual, so my residual calculation happens in measurement space, but I would like to track in state space, which is world coordinates. How can I achieve this?
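To illustrate what I mean with made-up numbers:

```matlab
% Made-up numbers only, to show where the residual lives.
% World-frame constant-velocity state [x; vx; y; vy; z; vz]:
x = [10; 1; 3; 0; 0; 0];

% h() maps the world-frame state into the radar's [az; el; range] space
h = @(s) [atan2d(s(3), s(1)); 0; norm(s([1 3 5]))];

Z = [17.0; 0; 10.5];   % actual radar measurement (made up)
y = Z - h(x);          % residual, expressed in measurement space
% The Kalman gain then maps y back into the state, so the track update
% itself still happens in world coordinates.
```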
