ExperienceBuffer has Length 0 when I load a saved agent and continue reinforcement learning training

Hi all,
I'm trying to train a saved agent further. In the training option of this saved agent, the SaveExperienceBufferWithAgent is set to true. But when I load the saved_agent and open the property ExperienceBuffer I noticed the Length is 0. I tried to look in the documentation of such property but the there is no information on it. If I stop a training and directly check the property "Length" of the agent in the workspace, it has some value.
My question is: what does this "Length" mean? If it's 0 and I continue training a saved agent as in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT, does training really continue with the saved agent and the saved experience buffer?
Yours

Accepted Answer

A Length of 0 means there are no experiences in the buffer, so the experience buffer was likely not saved due to this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
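A minimal sketch of the save-and-resume workflow (assuming a DQN-style agent and R2020b option names; `env` and `trainOpts` stand in for your existing environment and rlTrainingOptions):

```matlab
% Before saving: make sure the buffer is written out with the agent.
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save("savedAgent.mat", "agent");

% Later: reload the agent and continue training with the saved buffer.
s = load("savedAgent.mat");
agent = s.agent;

% Keep the restored experiences instead of clearing them at the start
% of the next training run.
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;

% Resume training; env and trainOpts are the same as in the first run.
trainingStats = train(agent, env, trainOpts);
```

After loading, `agent.ExperienceBuffer.Length` should be nonzero if the buffer was saved; checking it before calling train is a quick way to verify the workaround took effect.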

2 comments

Yikai
Yikai on 20 Apr 2021
Edited: Yikai on 20 Apr 2021
Thanks for answering, I will try this.
Can I ask: are the network weights saved when the agent is saved between simulations?


More Answers (0)

Categories

Find more on Reinforcement Learning Toolbox in Help Center and File Exchange.

Version: R2020b

Asked: 5 Apr 2021
Commented: 12 Jan 2023
