Logging needed information while training a reinforcement learning agent

2 views (last 30 days)
Hello everyone,
Currently I am using RL to develop an algorithm to optimize the 3D movement of a UAV using certain models and calculations. Some of the data are used for the reward and some are not, but they are important for my research. I found that I could save those in LoggedSignals, but as it grows larger it slows down the training after a while, even though it is useful within one episode or one simulation. Is there a better way to log or save this unused yet important data?
Thank you
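
For context, a typical way this kind of auxiliary data ends up in LoggedSignals is inside the step function of a MATLAB-based environment. The following is only a minimal sketch, assuming an rlFunctionEnv-style step function with the documented [NextObs,Reward,IsDone,LoggedSignals] signature; the fields Position and Energy are placeholder names, not from the original question:
function [NextObs, Reward, IsDone, LoggedSignals] = myStepFunction(Action, LoggedSignals)
% ... environment dynamics, observation, and reward computation ...
NextObs = zeros(4,1);   % placeholder observation
Reward = 0;             % placeholder reward
IsDone = false;
% Data not needed for the reward but useful for later analysis;
% the fields are assumed to be initialized (e.g. as []) in the reset function.
% Appending every step is what makes LoggedSignals grow over training.
LoggedSignals.Position = [LoggedSignals.Position, NextObs(1:3)];
LoggedSignals.Energy(end+1) = 0;   % placeholder quantity
end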

Accepted Answer

Ari Biswas on 1 Mar 2024
Unfortunately, there is no straightforward way to do this currently, but we may have a solution in an upcoming release (stay tuned).
For now, you can add the following code to your environment reset function. It saves the logged signals to disk and removes them from memory to improve performance.
currentDir = pwd;
env.ResetFcn = @(in) myResetFcn(in, currentDir);
function in = myResetFcn(in, currentDir)
% your reset code...
% create a unique name for each episode.
% For parallel training use uuid to avoid incorrect episode indices.
% s = matlab.lang.internal.uuid;
s = string(datetime("now"),"yyyyMMddHHmmss");
filename = fullfile(currentDir, "loggedData_" + s + ".mat");
% process the logged data post episode simulation
in = in.setPostSimFcn(@(x) myPostSim(x, filename));
end
function new_out = myPostSim(out, filename)
% save the logged data to disk
% "logsout" is the name specified in your model settings > Data Import/Export > Signal logging
loggedData = out.logsout;
save(filename, "loggedData");
% remove logged data from out
new_out = out;
new_out.logsout = [];
new_out.tout = [];
end
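
After training, the per-episode MAT-files can be read back for analysis. The following is a minimal sketch, assuming the files were written with the naming scheme above, that each file contains the Dataset variable loggedData, and that a signal named "position" is logged in the model (the signal name is only a placeholder):
files = dir(fullfile(currentDir, "loggedData_*.mat"));
for k = 1:numel(files)
S = load(fullfile(files(k).folder, files(k).name), "loggedData");
% loggedData is a Simulink.SimulationData.Dataset; pull one signal by name
sig = getElement(S.loggedData, "position");   % placeholder signal name
t = sig.Values.Time;
y = sig.Values.Data;
% ... per-episode analysis here ...
end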

More Answers (0)

Version

R2023b
