How to ensure "BeginOverlapEvent" triggers only once or resets when updating episodes in a quadcopter reinforcement learning task?
I am working on reinforcement learning for quadcopter collision avoidance in an Unreal Engine executable. I am using the Reinforcement Learning Toolbox and the RL Agent block for this task, together with the quadcopter model from the following example:
https://jp.mathworks.com/help/sl3d/simulate-a-quadcoptor.html
To detect collisions with obstacles, I use the body's "BeginOverlapEvent", and I intend to end the episode and move on to the next one once a collision is detected. In practice, however, once a collision is detected, "BeginOverlapEvent" keeps triggering. Even after the episode is updated and the quadcopter restarts from its initial position, collisions continue to be registered, so the remaining episodes terminate immediately as well.
I would like to know how to ensure that "BeginOverlapEvent" triggers only once, or how to reset this event when the episode is updated.
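In other words, what I think I need is a latch that goes true on the first overlap and stays true until an episode-reset signal clears it. A rough sketch of that idea in a MATLAB Function block (the signal names are placeholders I made up, not signals from the shipped example):
function isDone = collisionLatch(overlapEvent, episodeReset)
% Latch the first overlap and hold it until the episode reset clears it.
persistent collided
if isempty(collided)
    collided = false;
end
if episodeReset
    collided = false;   % clear the latch when a new episode starts
end
if overlapEvent
    collided = true;    % latch the first overlap and ignore repeated triggers
end
isDone = collided;      % feed this to the isdone port of the RL Agent block
end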
Answers (1)
Nishan Nekoo
on 17 Sep 2024
Edited: Nishan Nekoo on 17 Sep 2024
Hello! There is a known issue with Overlap Events when the objects are both moving in the same direction. Is this the case for your simulation?
Nonetheless, your observation that collisions are still registered even after resetting to the initial position is interesting. How many additional collisions are being registered? Is it possible that the previous object is not being deleted and a new instance of the object is being created each time? It is a little difficult to tell what your setup is and how you reset to the initial position. Could you provide a minimal reproduction model that we can look at to investigate and suggest some workarounds?
If you are unable to provide that here, please reach out to support@mathworks.com and they will be able to assist you with this.
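In the meantime, a common pattern for clearing per-episode state is to do it in the environment's reset function, which runs before every episode. A minimal sketch, assuming an environment created with rlSimulinkEnv; the model name, agent block path, and variable names are placeholders you would replace with your own:
% Sketch only: replace the model name, block path, observation/action
% specs, and variable names with the ones from your model.
env = rlSimulinkEnv("quadcopterRL", "quadcopterRL/RL Agent", obsInfo, actInfo);
env.ResetFcn = @localReset;

function in = localReset(in)
    % Runs before each episode: restore the quadcopter's initial pose and
    % re-initialize any flags your collision logic reads, so no collision
    % state carries over from the previous episode.
    in = setVariable(in, "initialPosition", [0 0 1]);   % placeholder pose
    in = setVariable(in, "collisionDetected", false);   % placeholder flag
end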
Nishan