An error occurred while simulating model with the reinforcement learning agent

I'm using the Reinforcement Learning Toolbox to train the energy management controller of a hybrid electric vehicle (HEV) model. The RL agent takes 5 continuous state variables as input and outputs 3 continuous action variables, so I chose a DDPG agent, since DDPG can handle both continuous states and continuous actions. My environment consists of the vehicle plant model, a reference drive speed profile, and an immediate cost calculation.
For reference, I followed the "rlwatertank" example from the RL Toolbox documentation, since that example also uses a DDPG agent.
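For context, this is roughly how the specifications and environment are set up (the block path and variable names below are illustrative, not copied from my actual script):

obsInfo = rlNumericSpec([5 1]);   % 5 continuous observations
actInfo = rlNumericSpec([3 1]);   % 3 continuous actions
% 'QLearningEMSdiscreteDelay/RL Agent' stands in for the path to the RL Agent block
env = rlSimulinkEnv('QLearningEMSdiscreteDelay', ...
    'QLearningEMSdiscreteDelay/RL Agent', obsInfo, actInfo);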
Before training my agent, I made sure the whole model (agent and environment) runs without any error or warning with the command sim('QLearningEMSdiscreteDelay'). It ran smoothly. But when I try to train the agent through the training options, MATLAB gives the following error:
Error using rl.train.seriesTrain (line 16)
An error occurred while simulating "QLearningEMSdiscreteDelay" with the agent "agent".
Error in rl.train.TrainingManager/train (line 244)
rl.train.seriesTrain(this);
Error in rl.train.TrainingManager/run (line 150)
train(this);
Error in rl.agent.AbstractAgent/train (line 54)
TrainingStatistics = run(trainMgr);
The only differences between my model and the "rlwatertank" example are the following:
  1. My environment tracks a time series of reference speeds, rather than the single reference water level used in the "rlwatertank" example
  2. This reference time series is not randomized at the start of each episode
  3. The sample time of my agent and environment is 0.1 seconds instead of the 1 second used in the "rlwatertank" example
Why can't the agent start the simulation during training, even though the model simulates without problems when I run it manually?
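To make the contrast concrete (agentEMS, env, and trainOpts are the objects defined in my setup script, shown in my answer below):

% This runs without any error or warning:
sim('QLearningEMSdiscreteDelay');
% This is the call that throws the error above:
trainingStats = train(agentEMS, env, trainOpts);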
Any help will be appreciated.
  2 Comments
Emmanouil Tzorakoleftherakis
Can you share the full error message? The script where you set everything up would be helpful too.
Thanks!
Hajar hammouti on 5 Nov 2019
Wondering if you have solved this error. I get the same error when training my qAgent and I don't know where it comes from.


Answers (2)

Atriya Biswas on 5 Nov 2019
Edited: Walter Roberson on 31 Jan 2022
Now my agent is able to train the Simulink model, but training stops after only one episode. My trainingOptions are the same as in the "rlwatertank" example, yet training stops after a single episode. Can anybody help with this?
Here is the code for the agent options and training options, respectively:
% Agent options (Ts is the agent sample time, defined earlier in the script)
agentOpts = rlDDPGAgentOptions(...
    'SampleTime',Ts,...
    'TargetSmoothFactor',1e-3,...
    'DiscountFactor',1.0, ...
    'MiniBatchSize',64, ...
    'ExperienceBufferLength',1e6);
agentOpts.NoiseOptions.Variance = 0.3;
agentOpts.NoiseOptions.VarianceDecayRate = 1e-5;
agentEMS = rlDDPGAgent(actor,critic,agentOpts);

% Training options (Tf is the simulation stop time, defined earlier)
maxepisodes = 20;
maxsteps = ceil(Tf/Ts);
trainOpts = rlTrainingOptions(...
    'MaxEpisodes',maxepisodes, ...
    'MaxStepsPerEpisode',maxsteps, ...
    'ScoreAveragingWindowLength',20,...
    'Verbose', false, ...
    'Plots','training-progress',...
    'StopTrainingCriteria','AverageReward',...
    'StopTrainingValue',1000);
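For completeness, training is then launched with the following call (env is the rlSimulinkEnv object for the model):

trainingStats = train(agentEMS, env, trainOpts);
% Inspecting trainingStats.EpisodeSteps afterwards shows how many steps
% each episode actually ran, which helps confirm that training stops early.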
  1 Comment
Thoriq Fauzan on 17 Aug 2020
Excuse me, has anyone got any ideas yet about the cause of this? I have an almost identical problem.



Atriya Biswas on 5 Nov 2019
Actually, I contacted MATLAB technical support and sent them the whole model so they could reproduce the error and perform a root-cause analysis. After that analysis, they found the source of the unexpected error.
According to MATLAB, "The error is due to a bug in Simulink R2019a Configuration Parameters which do not recognize variables for Simulation start time". In my Simulink model, the variable 'sim_start' was used as the simulation start time, and that was the source of the error. MATLAB suggested that I use the numerical value "0" as the start time instead of the 'sim_start' variable.
If anyone is using MATLAB R2019a, it is advised not to use a variable for the simulation start time.
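In other words, the workaround looks like this (model name taken from my question; the start time is set to a literal value rather than a workspace variable):

% Replace the variable-based start time with a literal value
set_param('QLearningEMSdiscreteDelay', 'StartTime', '0');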
