Observation specification must be scalar if not created by bus2RLSpec.
I am using an RL system that was initially designed for a single type of observation, an image. Recently I added two scalar observations in addition to that image, and everything related to my modifications works fine. The only issue is that when I try to run the model in Simulink, I get a new error. How can I resolve it?
Here is the Simulink model:
Here is the error:
Error using rl.train.marl.MultiAgentTrainer/run
Error in 'rlAreaCoverage32024/Agent A (Red)': Failed to evaluate mask initialization commands.

Error in rl.train.TrainingManager/train (line 429)
    run(trainer);

Error in rl.train.TrainingManager/run (line 218)
    train(this);

Error in rl.agent.AbstractAgent/train (line 83)
    trainingResult = run(trainMgr,checkpoint);

Caused by:
    Error using rl.env.internal.reportSimulinkSimError
    Observation specification must be scalar if not created by bus2RLSpec.
Answers (1)
Poorna
on 22 Apr 2024
Hi ali farid,
The error you are getting is related to the observation that you are passing to Agent A.
The cause of the error is indicated as: "Observation specification must be scalar if not created by bus2RLSpec".
This means that the observation specification must either be scalar or, if the observations are grouped into a bus, the "bus2RLSpec" function must be used to convert the bus object into RL specification objects. Since you added two scalar observations alongside the image, I suppose you used a Bus Creator block to combine them. If that is the case, try using the "bus2RLSpec" function to convert the bus object into RL observation specifications before passing the observation information to the agents.
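For illustration, here is a minimal sketch of how that conversion might look. The bus name "obsBus", the element names and dimensions, the model name, the agent block path, and the action specification are placeholders; adjust them to match your model.

% Describe the observation bus (skip this if the Simulink.Bus object already exists)
obsBus = Simulink.Bus;
obsBus.Elements(1) = Simulink.BusElement;
obsBus.Elements(1).Name = 'image';
obsBus.Elements(1).Dimensions = [50 50 1];   % image observation size (placeholder)
obsBus.Elements(2) = Simulink.BusElement;
obsBus.Elements(2).Name = 'scalar1';         % first scalar observation
obsBus.Elements(3) = Simulink.BusElement;
obsBus.Elements(3).Name = 'scalar2';         % second scalar observation
assignin('base','obsBus',obsBus);            % bus2RLSpec looks the bus up by name

% Convert the bus object into an array of RL observation specifications
obsInfo = bus2RLSpec('obsBus');

% Pass the specifications when creating the Simulink environment
% (model name, block path, and action specification are placeholders)
actInfo = rlFiniteSetSpec(1:4);
env = rlSimulinkEnv('rlAreaCoverage32024', ...
    'rlAreaCoverage32024/Agent A (Red)', obsInfo, actInfo);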
To know more, refer to the "bus2RLSpec" documentation.
Hope this helps!