rlDiscreteCategoricalActor not accepting a mix of rlNumericSpec and rlFiniteSetSpec objects as observations for an RL environment
I am looking for an example that implements a mix of rlNumericSpec and rlFiniteSetSpec objects in an RL environment (as mentioned here). Some of my observations are numerical/continuous, whereas others are finite/discrete.
I created a set of observations that is a mixture of rlNumericSpec and rlFiniteSetSpec objects using the following code:
obsInfo_numeric = rlNumericSpec([4 1]);
obsInfo_finite = rlFiniteSetSpec([1 1]);
obsInfo = [obsInfo_numeric,obsInfo_finite];
and a set of actions using:
actInfo = rlFiniteSetSpec([1 2 3 4 5]);
I also created a network called 'actnet' with 4 inputs and 1 output. But when I try to create an actor from these observation and action specifications, I get an error:
actor = rlDiscreteCategoricalActor(actnet,obsInfo,actInfo);
![Screenshot of the error message](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1380784/image.png)
Answers (1)
Narvik
on 25 Aug 2023
Hi,
I understand that you faced an issue when using a combination of discrete ('rlFiniteSetSpec') and continuous ('rlNumericSpec') observation data specifications. The function 'rlDiscreteCategoricalActor' does accept a combination of discrete and continuous observation specifications. I advise you to check your neural network and action space and make sure that the network's input layers match the number and dimensions of the observation channels. Please find a helpful documentation link with an example below:
https://in.mathworks.com/help/reinforcement-learning/ref/rl.function.rldiscretecategoricalactor.html
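As a rough sketch of the advice above (the layer names, hidden-layer sizes, and the second channel's element set are illustrative assumptions, not taken from the original post), a network with one input layer per observation channel might look like this. Note that rlFiniteSetSpec takes the set of valid values, not a size vector:

```matlab
% Mixed observation specs: a 4-element continuous channel and a
% discrete channel whose valid values are 0 and 1 (assumed here).
obsInfo = [rlNumericSpec([4 1]), rlFiniteSetSpec([0 1])];
actInfo = rlFiniteSetSpec([1 2 3 4 5]);

% One featureInputLayer per observation channel, merged with a
% concatenation layer before the common trunk.
net = layerGraph();
net = addLayers(net, featureInputLayer(4, Name="obs1"));
net = addLayers(net, featureInputLayer(1, Name="obs2"));
net = addLayers(net, [ ...
    concatenationLayer(1, 2, Name="concat")
    fullyConnectedLayer(32)
    reluLayer
    fullyConnectedLayer(numel(actInfo.Elements)) % one logit per action
    softmaxLayer]);
net = connectLayers(net, "obs1", "concat/in1");
net = connectLayers(net, "obs2", "concat/in2");

% Map each input layer to its observation channel explicitly.
actor = rlDiscreteCategoricalActor(net, obsInfo, actInfo, ...
    ObservationInputNames=["obs1" "obs2"]);
```

The key point is that the number of network inputs must equal the number of observation channels, and ObservationInputNames ties each input layer to a channel in obsInfo, in order.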
Hope this helps!