I am getting an error when trying to train an RL agent in MATLAB. I am using MATLAB R2024a.
I am trying to run the following example in MATLAB:
openExample('rl/TrainTD3AgentForPMSMControlExample')
It gives me the following error:
Error in 'mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement': Failed to evaluate mask initialization commands.
out = nestedRunEpisode(policy);
result = run_internal_(this);
result = run_(this);
trainResult = run(trainer);
result = run_(this);
trainingResult = run(tm);
Caused by:
Cannot change property 'Enabled' of 'mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed/Speed Measurement' while simulation is running
Answers (1)
Sreeram on 15 Nov 2024 at 6:27
This looks like a bug to me. However, here is a workaround to unblock this:
Replace the “Speed Measurement” block in ‘mcb_pmsm_foc_sim_RL/Current Control/Input Scaling/ Calculate Position and Speed’ with the “Speed Measurement” block from ‘Motor Control Blockset HDL Support/Sensor Decoders’.
Make sure to set all the parameters of the new block to exactly match those of the original “Speed Measurement” block before commenting the original out (see the sketch below for a scripted version).
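Here is a minimal MATLAB sketch of that swap, assuming the model is already on the path; the library path 'mcbhdlsensordecoderslib/Speed Measurement' and the new block name are assumptions, so confirm the actual library path in the Library Browser before running, and reconnect the signal lines to the new block afterwards in the Simulink editor.
```matlab
% Sketch of the workaround: add the HDL-support "Speed Measurement" block,
% copy the dialog parameters of the original, then comment the original out.
mdl    = 'mcb_pmsm_foc_sim_RL';
sys    = [mdl '/Current Control/Input Scaling/ Calculate Position and Speed'];
oldBlk = [sys '/Speed Measurement'];
newBlk = [sys '/Speed Measurement HDL'];               % name for the replacement (assumed)
srcBlk = 'mcbhdlsensordecoderslib/Speed Measurement';  % assumed library path -- verify it

load_system(mdl);

% Place the new block just below the original one.
pos = get_param(oldBlk, 'Position');
add_block(srcBlk, newBlk, 'Position', pos + [0 80 0 80]);

% Copy every dialog parameter the two blocks have in common.
params = fieldnames(get_param(oldBlk, 'DialogParameters'));
for k = 1:numel(params)
    try
        set_param(newBlk, params{k}, get_param(oldBlk, params{k}));
    catch
        % Parameter does not exist on the new block; skip it.
    end
end

% Comment out the original block so it is excluded from simulation.
set_param(oldBlk, 'Commented', 'on');

% Note: the new block is not wired up; reconnect its ports manually
% (or with add_line) before simulating or training.
```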
I hope this helps!