Sensor fusion algorithms improve the quality of position, orientation, and pose estimates by combining the outputs of multiple sensors. By fusing data from several sensors, the strengths of each sensor modality can compensate for the shortcomings of the others. In addition, systems that perform sensing and perception must track a very large number of objects to maintain complete situational awareness.
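As a simple illustration of the idea (a sketch, not material from the webinar itself), two independent noisy measurements of the same quantity can be fused by inverse-variance weighting, which always yields a lower-variance estimate than either sensor alone. The sensor names and noise values below are hypothetical:

```python
import numpy as np

def fuse(measurements, variances):
    """Fuse independent noisy measurements of the same quantity
    using inverse-variance weighting (the minimum-variance
    unbiased linear combination)."""
    w = 1.0 / np.asarray(variances, dtype=float)       # weight = 1 / variance
    fused = np.sum(w * np.asarray(measurements)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)                        # smaller than every input variance
    return fused, fused_var

# Hypothetical example: altitude from a noisy GPS fix and a
# less-noisy barometric altimeter, fused into one estimate.
gps_alt, gps_var = 103.0, 25.0      # metres, variance in m^2
baro_alt, baro_var = 100.5, 4.0

alt, var = fuse([gps_alt, baro_alt], [gps_var, baro_var])
```

The fused estimate lands between the two readings, weighted toward the more trustworthy barometer, and its variance is below both inputs. The same principle underlies Kalman-filter-style fusion, where the weights are updated recursively as new measurements arrive.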
In this webinar, you will learn about algorithms and tools to design, simulate, and analyze systems that fuse data from multiple sensors to estimate position and orientation and maintain situational awareness. The webinar will include several reference examples that provide a starting point for airborne, ground-based, shipborne, and underwater sensing systems.