ExperienceBuffer has 0 Length when i load a saved agent and continue training in reinforcement training
Yikai
on 5 Apr 2021
Commented: Dmitriy Ogureckiy
on 12 Jan 2023
Hi all,
I'm trying to continue training a saved agent. In this agent's training options, SaveExperienceBufferWithAgent is set to true. But when I load the saved_agent and open its ExperienceBuffer property, I noticed that Length is 0. I looked for documentation on this property, but there is no information on it. If I stop a training run and directly check the Length property of the agent in the workspace, it has a nonzero value.
My question is: what does this Length mean? If it is 0 and I continue training with a saved agent, as in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT , does training really continue with the saved agent and the saved experience buffer?
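For context, this is roughly what I do to inspect the buffer (the .mat file name is only an example; saved_agent is the variable stored in the training checkpoint):

data = load("savedAgent.mat");        % checkpoint written during training (example file name)
agent = data.saved_agent;             % agent variable saved with the checkpoint
disp(agent.ExperienceBuffer.Length)   % shows 0, even though SaveExperienceBufferWithAgent was true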

Yours
Accepted Answer
Takeshi Takahashi
on 20 Apr 2021
A Length of 0 means there isn't any experience in the buffer. I think the experience buffer wasn't saved because of this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
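A rough sketch of that workaround, plus reusing the buffer in the next session (file and variable names are only examples, and ResetExperienceBufferBeforeTraining is the agent option available in releases around R2021a):

% Set the option right before saving so the buffer is written into the MAT file.
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save("trainedAgent.mat", "agent");

% In a later session: reload the agent and keep the stored experiences.
s = load("trainedAgent.mat");
agent = s.agent;
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;  % do not clear the saved buffer
% trainingStats = train(agent, env, trainOpts);                  % continue training as usual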
2 comments
Dmitriy Ogureckiy
on 12 Jan 2023
May I ask, are the network weights saved when the agent is saved between simulations?