rlDiscreteCategoricalActor not accepting a mix of rlNumericSpec and rlFiniteSetSpec objects - observation for a RL environment
Pruthwiraj Santhosh
on 11 May 2023
Commented: Pruthwiraj Santhosh
on 15 Feb 2024
I am looking for an example that implements a mix of rlNumericSpec and rlFiniteSetSpec objects in an RL environment (as mentioned here). Some of my observations are numerical/continuous, whereas others are finite/discrete.
I created a set of observations that is a mixture of rlNumericSpec and rlFiniteSetSpec objects using the following code:
obsInfo_numeric = rlNumericSpec([4 1]);
obsInfo_finite = rlFiniteSetSpec([1 1]);
obsInfo = [obsInfo_numeric,obsInfo_finite];
and a set of actions using:
actInfo = rlFiniteSetSpec([1 2 3 4 5]);
I also created a network called 'actnet' with 4 inputs and 1 output.
But when I try to create an actor from these observation and action specifications, I get an error:
actor = rlDiscreteCategoricalActor(actnet,obsInfo,actInfo);
Answers (1)
Narvik
on 25 Aug 2023
Hi,
I understand that you faced an issue when using a combination of discrete ('rlFiniteSetSpec') and continuous ('rlNumericSpec') observation data specifications. The 'rlDiscreteCategoricalActor' function does accept a combination of discrete and continuous observation specifications; an example is available in the documentation linked below.
I advise you to check your neural network and action space, and make sure that the network's input layers match the number (and dimensions) of the observation channels. A helpful documentation link is provided below:
https://in.mathworks.com/help/reinforcement-learning/ref/rl.function.rldiscretecategoricalactor.html
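As an illustrative sketch (not from the original thread): with two observation channels, the actor network needs one input layer per channel, and the input layers can be matched to the channels explicitly via the "ObservationInputNames" option. The layer names, sizes, and the finite set [0 1] below are assumptions for illustration; note also that rlFiniteSetSpec expects the set of admissible values, so its elements should presumably be distinct (the [1 1] in the question looks like a duplicate).

```matlab
% Sketch: mixed continuous/discrete observations with one input layer per channel.
obsInfo = [rlNumericSpec([4 1]), rlFiniteSetSpec([0 1])];  % illustrative specs
actInfo = rlFiniteSetSpec([1 2 3 4 5]);

% One input path per observation channel, merged before the output.
lg = layerGraph();
lg = addLayers(lg, [
    featureInputLayer(4, "Name", "obsContinuous")
    fullyConnectedLayer(16, "Name", "fc1")]);
lg = addLayers(lg, [
    featureInputLayer(1, "Name", "obsDiscrete")
    fullyConnectedLayer(16, "Name", "fc2")]);
lg = addLayers(lg, [
    concatenationLayer(1, 2, "Name", "concat")
    reluLayer("Name", "relu")
    fullyConnectedLayer(numel(actInfo.Elements), "Name", "out")]);  % one output per action
lg = connectLayers(lg, "fc1", "concat/in1");
lg = connectLayers(lg, "fc2", "concat/in2");
net = dlnetwork(lg);

% Map each input layer to its observation channel by name.
actor = rlDiscreteCategoricalActor(net, obsInfo, actInfo, ...
    "ObservationInputNames", ["obsContinuous", "obsDiscrete"]);
```

The key point is that the network has as many input layers as there are entries in obsInfo, with sizes matching each channel's dimensions; the error in the question typically arises when a single-input network is paired with a multi-channel observation specification.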
Hope this helps!