Load data into experience buffer: DDPG agent
I am using RL Toolbox version 1.1 with MATLAB R2019b and the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?
Answers (2)
JiaZheng Yan
on 31 Mar 2020
1 vote
I found a way to view the Memory of the experience buffer.
You can open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file, you can see the property attributes of the variable Memory. For example:

Then you set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training, you can access the data in "agent.ExperienceBuffer.Memory".
This also means that you can modify and use the training data.
I hope this method works for you : )
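For context, here is a minimal sketch of how these two options fit into the usual agent setup (assuming Memory has been made visible as described above; actor, critic, env, and trainOpts stand for your own objects and are not shown):

```matlab
% Keep the experience buffer with the agent so it can be inspected
% and reused after training (both options exist in R2019b-era releases).
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;

% Create and train the agent with these options:
% agent = rlDDPGAgent(actor, critic, agentOpts);
% trainingStats = train(agent, env, trainOpts);

% Afterwards the stored transitions are available as:
% mem = agent.ExperienceBuffer.Memory;
```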
8 comments
Ao Liu
on 3 Jul 2020
Hello, I am using the 2020a toolbox and ran into the same problem. I made the changes just as you described, but it doesn't seem to work. Is there any solution? Thanks!

JiaZheng Yan
on 4 Jul 2020

I tried it; setting the Memory property in either of the two locations shown makes it visible and accessible.
Remember to add the following lines to the code where you preset the agent's training options:
agentOpts.SaveExperienceBufferWithAgent = true; % (this line is the key)
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training finishes, you should be able to see the Memory element.


The length of Memory may affect how the data are displayed; you can see more detail by sampling and assigning an element:
a = agent.ExperienceBuffer.Memory{1} % take one element to inspect

The fields represent (state, action, reward, next state, is_done).
Save this agent and load it the next time you train; this lets you train the agent repeatedly.
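To make the tuple layout concrete, here is a short sketch of pulling transitions out of the buffer after training. The exact container layout of Memory can differ between releases, so treat the indexing below as an assumption to verify in your version:

```matlab
% Inspect one stored transition; on the releases discussed, each entry
% holds the tuple (state, action, reward, next state, is_done).
exp1 = agent.ExperienceBuffer.Memory{1};
disp(exp1)

% Copy all stored transitions out, e.g. to keep them for a later run:
allExp = agent.ExperienceBuffer.Memory;
save('experiences.mat', 'allExp')
```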
Fabian Hart
on 21 Jul 2020
Thanks for your answer!
Unfortunately, I have problems writing to MATLAB system files. When I try to do that, the following message appears:
"Error writing ExperienceBuffer.m .... Access denied"
Could you please tell me how you managed that? (Windows 10)
JiaZheng Yan
on 23 Jul 2020
Sorry, I hope you can provide a more detailed description or a screenshot of the error, because I have never run into that error myself.
(I guess it's a file path problem)
zhou jianhao
on 25 Oct 2021
Hi, Jiazheng.
Your method is working, thanks a lot!
One thing still bothers me: is there any way to access the memory buffer during training, for example if I want to use prioritized experience replay?
Hope to hear from you soon.
Thanks!
Regards!
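On the prioritized-experience-replay question: those releases did not expose a supported hook into the buffer during training, so one workaround is to run your own training loop with your own replay memory. Below is a minimal, toolbox-independent sketch of just the proportional-priority sampling step; all numbers and names are illustrative, not toolbox API:

```matlab
% Proportional prioritized sampling (sketch).
priorities = [0.5 2.0 1.0 4.0];     % e.g. |TD error| of 4 stored transitions
alpha = 0.6;                        % prioritization exponent
p = priorities.^alpha;
p = p / sum(p);                     % sampling probabilities

% Draw one index by inverse-CDF sampling (no extra toolboxes needed):
idx = find(rand() <= cumsum(p), 1, 'first');

beta = 0.4;                         % importance-sampling exponent
w = (numel(p) * p(idx))^(-beta);    % weight for the sampled transition
```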
zhou jianhao
on 25 Oct 2021
I have to say that, for the time being, the RL toolbox in MATLAB is easy to use but hard to get satisfactory performance from; it is still far behind the Python platforms.
zimeng Wang
on 1 Apr 2022
Hello, I can use this method to view the data in the buffer, but how can I modify or delete the data in the buffer?
Arman Ali
on 1 Aug 2022
Have you found the answer? If yes, please guide me.
Priyanshu Mishra
on 26 Feb 2020
0 votes
Hi Daksh,