How to visualize episode behaviour with the reinforcement learning toolbox?
9 views (last 30 days)
Jan de Priester
on 5 Jun 2019
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
How can I create a visualization for a custom environment that shows the behaviour of the system during a training episode? I cannot find code examples anywhere on MathWorks, or any clarification of code, that visualizes a system's behaviour during training episodes. I would like to achieve a visualization that looks something like the cart-pole visualizer shown on this page: https://nl.mathworks.com/help/reinforcement-learning/ug/train-pg-agent-to-balance-cart-pole-system.html?searchHighlight=cart%20pole&s_tid=doc_srchtitle.
PS: I am trying to solve the continuous mountain car problem with a DDPG agent and the Reinforcement Learning Toolbox.
0 comments
Accepted Answer
Emmanouil Tzorakoleftherakis
on 7 Jun 2019
Hello,
To create a custom MATLAB environment, use the template that pops up after running
rlCreateEnvTemplate('myenv')
This template contains two methods that can be used for visualization: "plot" and "envUpdatedCallback" (the latter is called from within "plot"). Use "plot" to create the basic stationary parts of your visualization, and "envUpdatedCallback" to update the coordinates of the moving parts based on your states.
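For the mountain car case the original poster mentions, a minimal sketch of those two methods could look like the following. Note this is illustrative, not verbatim template code: the property names `Figure` and `State`, the hill profile y = sin(3x), and the plotting details are all assumptions layered on top of the generated template.

```matlab
function plot(this)
    % Create the figure and the stationary scenery once
    this.Figure = figure('Visible', 'on', 'HandleVisibility', 'off');
    ha = gca(this.Figure);
    hold(ha, 'on');
    % Stationary part: the hill profile of the mountain car problem
    % (assumed here to be y = sin(3x) on the classic position range)
    x = linspace(-1.2, 0.6, 200);
    plot(ha, x, sin(3*x), 'k');
    % Draw the moving part (the car) at its initial state
    envUpdatedCallback(this);
end

function envUpdatedCallback(this)
    % Called whenever the environment state changes; redraw the car
    if ~isempty(this.Figure) && isvalid(this.Figure)
        ha = gca(this.Figure);
        % Remove the previous car marker, if any
        delete(findobj(ha, 'Tag', 'car'));
        % this.State(1) is assumed to hold the car position
        pos = this.State(1);
        plot(ha, pos, sin(3*pos), 'ro', 'MarkerSize', 10, ...
            'MarkerFaceColor', 'r', 'Tag', 'car');
        drawnow();
    end
end
```

Tagging the car marker and deleting it each update keeps the stationary hill intact while only the moving part is redrawn.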
5 comments
Prashanth Chivkula
on 15 Sep 2020
And another question: where do I define a reward function in the template?
Emmanouil Tzorakoleftherakis
on 15 Sep 2020
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
The error sounds self-explanatory - make sure whatever you are plotting makes sense.
In this template there is no separate function for rewards; the reward is computed inside 'step', as you will see if you go through the generated code. You could also factor it out into a separate function if you want.
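To make that concrete, here is a rough sketch of how the reward can sit inside the template's 'step' method. The signature and the `notifyEnvUpdated` call follow the generated template; the dynamics are elided, and the particular reward terms and the goal position 0.45 are illustrative assumptions for a continuous mountain car setup, not template code.

```matlab
function [Observation, Reward, IsDone, LoggedSignals] = step(this, Action)
    % ... environment dynamics update this.State here (elided) ...
    Observation = this.State;

    % Reward logic lives directly in step: e.g. penalize control effort
    % and add a bonus on reaching the goal (illustrative values only)
    Reward = -0.1 * Action^2;
    IsDone = this.State(1) >= 0.45;
    if IsDone
        Reward = Reward + 100;
    end
    LoggedSignals = [];

    % Notify listeners that the state changed, which in turn
    % triggers envUpdatedCallback and refreshes the visualization
    notifyEnvUpdated(this);
end
```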
In the future please create a separate question if it's not related to the original one. Thanks!
More Answers (0)