Problem with bus input of RL agent
7 views (last 30 days)
I used an RL agent block diagram in Simulink taken from a MATLAB example, but I modified the RL agent's inputs and combined all the data on a bus. I get the following error:
Error using rl.train.marl.MultiAgentTrainer/run
Error in 'rlAreaCoverage3fff2024/Agent A (Red)': Failed to evaluate mask initialization commands.
Error in rl.train.TrainingManager/train (line 429)
run(trainer);
Error in rl.train.TrainingManager/run (line 218)
train(this);
Error in rl.agent.AbstractAgent/train (line 83)
trainingResult = run(trainMgr,checkpoint);
Caused by:
Error using rl.env.internal.reportSimulinkSimError
Observation specification must be scalar if not created by bus2RLSpec.
How can I solve this error? My MATLAB version is R2022a. Here is a screenshot of the Simulink environment.
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1630376/image.png)
0 comments
Answers (1)
Namnendra
on 24 Jul 2024
Hello Ali,
The error you are encountering indicates that the observation specification for your reinforcement learning (RL) agent must be a scalar unless it is created using the `bus2RLSpec` function. This function is used to convert Simulink bus objects to RL observation and action specifications.
Here's a step-by-step guide to address this issue:
Step 1: Define the Bus Object in Simulink
First, ensure that the bus object you are using for your RL agent's inputs is properly defined in Simulink.
1. Create a Bus Object: Define the bus object in the MATLAB workspace or in the Simulink model using the Bus Editor.
2. Add Signals to the Bus: Make sure all the signals you want to include in the bus are properly added.
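For instance, a bus object can be defined programmatically in the base workspace instead of through the Bus Editor (a minimal sketch; the names obsBus, Position, and Velocity and the signal sizes are placeholders to replace with your own signals):

```matlab
% Define a bus object for the agent's observations in the base workspace.
posElem = Simulink.BusElement;
posElem.Name = 'Position';
posElem.Dimensions = [2 1];   % e.g. a 2-D position signal

velElem = Simulink.BusElement;
velElem.Name = 'Velocity';
velElem.Dimensions = [2 1];   % e.g. a 2-D velocity signal

obsBus = Simulink.Bus;
obsBus.Elements = [posElem; velElem];
assignin('base', 'obsBus', obsBus);  % Simulink resolves bus objects from the base workspace
```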
Step 2: Convert Bus to RL Specification
Use the `bus2RLSpec` function to convert the bus object to an RL observation specification.
1. Get the Bus Object: Retrieve the bus object from the Simulink model or workspace.
busInfo = Simulink.Bus.createObject('your_model_name', 'your_model_name/your_bus_creator_block'); % second argument is a block path, e.g. the Bus Creator feeding the agent
busObject = evalin('base', busInfo.busName);
2. Convert to RL Specification: Use `bus2RLSpec` to convert the bus object to an RL specification.
observationSpec = bus2RLSpec(busInfo.busName); % bus2RLSpec expects the name of the bus object, not the object itself
Step 3: Update the RL Agent
Update your RL agent to use the new observation specification.
1. Create or Update the RL Agent: When creating or updating your RL agent, use the `observationSpec` obtained from `bus2RLSpec`.
% Assuming you have a DDPG agent as an example
obsInfo = observationSpec;
actInfo = rlNumericSpec([1 1]); % Define your action specification as needed
env = rlSimulinkEnv('your_model_name', 'your_agent_block', obsInfo, actInfo);
% Create the agent
agentOptions = rlDDPGAgentOptions('SampleTime', 0.1);
agent = rlDDPGAgent(obsInfo, actInfo, agentOptions);
Step 4: Train the RL Agent
Train the RL agent using the updated observation specification.
% Define training options
trainOpts = rlTrainingOptions(...
'MaxEpisodes', 1000, ...
'MaxStepsPerEpisode', 500, ...
'StopTrainingCriteria', 'AverageReward', ...
'StopTrainingValue', 500, ...
'ScoreAveragingWindowLength', 10);
% Train the agent
trainingResult = train(agent, env, trainOpts);
The information and code above outline an approach to resolving the error and properly using a bus object as the observation input for your RL agent in Simulink.
Thank you.
1 comment
feng qi
on 23 Dec 2024
Hello! I have tried your instructions, but at step 1 there is an error:
Error using slbus_get_struct
Error due to multiple causes.
Error in sl_feval
Error in Simulink.Bus.createObject>createObjectFromBlks (line 173)
busInfo = sl('slbus_get_struct', model, blks, false);
Error in Simulink.Bus.createObject (line 153)
busInfo = createObjectFromBlks(model, blks, fileName, format, dataAccessor);
Caused by:
Error using slbus_get_struct
Error evaluating parameter 'Agent' in 'rlMultiAgentPFC/RL Agent1'
Error using slbus_get_struct
Unrecognized function or variable 'agent1'.
Error using slbus_get_struct
Variable 'agent1' does not exist.
Suggested actions:
• Load a file into base workspace.
• Create a new variable.
Error using slbus_get_struct
Error evaluating parameter 'Agent' in 'rlMultiAgentPFC/RL Agent2'
Error using slbus_get_struct
Unrecognized function or variable 'agent2'.
Error using slbus_get_struct
Variable 'agent2' does not exist.
Suggested actions:
• Load a file into base workspace.
• Create a new variable.
It seems that the RL agent needs to be created before the bus object can be defined, but I can't do that, because the observation specification needs to be defined based on the bus signal.
How can I deal with this problem? Please help me.
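One way to break this chicken-and-egg dependency (a sketch not taken from the original thread; obsBus, the element names, and the signal sizes are placeholder assumptions) is to define the bus object by hand in MATLAB rather than deriving it from the compiled model with Simulink.Bus.createObject. Then bus2RLSpec and the agents can be created before the model is ever compiled:

```matlab
% 1) Define the observation bus by hand -- no model compilation needed,
%    so the agent variables (agent1, agent2) do not have to exist yet.
elem = Simulink.BusElement;
elem.Name = 'Measurements';      % placeholder element name
elem.Dimensions = [4 1];         % placeholder signal size

obsBus = Simulink.Bus;
obsBus.Elements = elem;
assignin('base', 'obsBus', obsBus);

% 2) Create the specs from the hand-made bus, then the agents.
obsInfo = bus2RLSpec('obsBus');          % works before the model is compiled
actInfo = rlNumericSpec([1 1]);          % placeholder action spec
agent1  = rlDDPGAgent(obsInfo, actInfo); % now 'agent1' exists ...
agent2  = rlDDPGAgent(obsInfo, actInfo); % ... and 'agent2' too

% 3) Only now open and compile the model: the RL Agent blocks can resolve
%    'agent1' and 'agent2', and the bus object is already defined.
```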