
Problem with bus input of RL agent

3 views (last 30 days)
ali farid on 28 Feb 2024
Answered: Namnendra on 24 Jul 2024
I used a block diagram of an RL agent in Simulink that was taken from a MATLAB example, but I modified the inputs of the RL agent and combined all the data on a bus. I see the following error:
Error using rl.train.marl.MultiAgentTrainer/run
Error in 'rlAreaCoverage3fff2024/Agent A (Red)': Failed to evaluate mask initialization commands.
Error in rl.train.TrainingManager/train (line 429)
run(trainer);
Error in rl.train.TrainingManager/run (line 218)
train(this);
Error in rl.agent.AbstractAgent/train (line 83)
trainingResult = run(trainMgr,checkpoint);
Caused by:
Error using rl.env.internal.reportSimulinkSimError
Observation specification must be scalar if not created by bus2RLSpec.
How can I solve this error? My MATLAB version is R2022a. Here is a screenshot of the Simulink environment.

Answers (1)

Namnendra on 24 Jul 2024
Hello Ali,
The error you are encountering indicates that the observation specification for your reinforcement learning (RL) agent must be scalar unless it is created using the `bus2RLSpec` function. This function converts a Simulink bus object into an array of RL observation specifications, one per bus element, which is the form the environment expects when the observation signal is a bus.
Here's a step-by-step guide to address this issue:
Step 1: Define the Bus Object in Simulink
First, ensure that the bus object you are using for your RL agent's inputs is properly defined in Simulink.
1. Create a Bus Object: Define the bus object in the MATLAB workspace or in the Simulink model using the Bus Editor.
2. Add Signals to the Bus: Make sure all the signals you want to include in the bus are properly added.
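The bus object can also be defined programmatically instead of through the Bus Editor. A minimal sketch (the element names `pos_x` and `pos_y` and the bus name `obsBus` are placeholders, not names from your model):

```matlab
% Define one bus element per observation signal (names are examples)
elems(1) = Simulink.BusElement;
elems(1).Name = 'pos_x';
elems(1).Dimensions = 1;

elems(2) = Simulink.BusElement;
elems(2).Name = 'pos_y';
elems(2).Dimensions = 1;

% Assemble the bus object and place it in the base workspace,
% where Simulink and bus2RLSpec can find it by name
obsBus = Simulink.Bus;
obsBus.Elements = elems;
assignin('base', 'obsBus', obsBus);
```

With the bus object in the base workspace, its name ('obsBus' here) is what you later pass to `bus2RLSpec`.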
Step 2: Convert Bus to RL Specification
Use the `bus2RLSpec` function to convert the bus object to an RL observation specification.
1. Get the Bus Object: create a bus object from the Bus Creator block that feeds the agent. Note that the second argument to `Simulink.Bus.createObject` is a block path, not a bus name.
busInfo = Simulink.Bus.createObject('your_model_name', 'your_model_name/your_bus_creator_block');
busObject = evalin('base', busInfo.busName);
2. Convert to RL Specification: `bus2RLSpec` expects the name of a bus object in the MATLAB workspace (as a character vector or string), not the bus object itself.
observationSpec = bus2RLSpec(busInfo.busName);
Step 3: Update the RL Agent
Update your RL agent to use the new observation specification.
1. Create or Update the RL Agent: When creating or updating your RL agent, use the `observationSpec` obtained from `bus2RLSpec`.
% Assuming you have a DDPG agent as an example
obsInfo = observationSpec;
actInfo = rlNumericSpec([1 1]); % Define your action specification as needed
env = rlSimulinkEnv('your_model_name', 'your_agent_block', obsInfo, actInfo);
% Create the agent
agentOptions = rlDDPGAgentOptions('SampleTime', 0.1);
agent = rlDDPGAgent(obsInfo, actInfo, agentOptions);
Step 4: Train the RL Agent
Train the RL agent using the updated observation specification.
% Define training options
trainOpts = rlTrainingOptions(...
'MaxEpisodes', 1000, ...
'MaxStepsPerEpisode', 500, ...
'StopTrainingCriteria', 'AverageReward', ...
'StopTrainingValue', 500, ...
'ScoreAveragingWindowLength', 10);
% Train the agent
trainingResult = train(agent, env, trainOpts);
Example Workflow
Here’s a complete example workflow assuming you have a Simulink model named `your_model_name` and a bus named `your_bus_name`.
1. Define the Bus Object:
busInfo = Simulink.Bus.createObject('your_model_name', 'your_model_name/your_bus_creator_block'); % second argument is a block path, not a bus name
busObject = evalin('base', busInfo.busName);
2. Convert Bus to RL Specification:
observationSpec = bus2RLSpec(busInfo.busName); % bus2RLSpec takes the bus object's name, not the object itself
3. Create the RL Environment and Agent:
obsInfo = observationSpec;
actInfo = rlNumericSpec([1 1]); % Define your action specification as needed
env = rlSimulinkEnv('your_model_name', 'your_agent_block', obsInfo, actInfo);
agentOptions = rlDDPGAgentOptions('SampleTime', 0.1);
agent = rlDDPGAgent(obsInfo, actInfo, agentOptions);
4. Train the Agent:
trainOpts = rlTrainingOptions(...
'MaxEpisodes', 1000, ...
'MaxStepsPerEpisode', 500, ...
'StopTrainingCriteria', 'AverageReward', ...
'StopTrainingValue', 500, ...
'ScoreAveragingWindowLength', 10);
trainingResult = train(agent, env, trainOpts);
The information and code above outline an approach to resolving the error and properly using a bus object as the observation input for your RL agent in Simulink.
Thank you.

Categories

More information about Environments in Help Center and File Exchange.

Version

R2022a
