Load a pretrained neural network object in rlNeuralNetworkEnvironment
Hi,
I want to train an RL MBPO agent that samples from a model. The model is a deep learning network object that I have already trained in MATLAB. I am wondering how I can load its weights inside the environment object. The examples for rlNeuralNetworkEnvironment show how to define a network structure, but how can I use my pretrained weights instead?
Best Regards,
Vasu
Answers (1)
  Emmanouil Tzorakoleftherakis
    
on 21 Dec 2023
        Hi Vasu,
You can use a pretrained environment model with MBPO agent as follows:
1) Create an rlContinuousDeterministicTransitionFunction with the trained dlnetwork if it is deterministic, or an rlContinuousGaussianTransitionFunction if it is stochastic (mean and standard-deviation heads).
2) Create an rlNeuralNetworkEnvironment using the transition function from step 1.
3) Create the MBPO agent.
4) Set LearnRate = 0 in the TransitionOptimizerOptions of rlMBPOAgentOptions so that the pretrained model is not updated during training.
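A minimal sketch of these steps is below (untested). The file name pretrainedModel.mat, the variable dlnet, the layer names, the observation/action dimensions, and the reward/is-done helpers myRewardFcn and myIsDoneFcn are placeholders you would replace with the ones from your own model and task:
% Load the pretrained dlnetwork that predicts the next observation
load("pretrainedModel.mat","dlnet")    % assumes the saved variable is named dlnet
% Observation and action specifications of the real environment (example sizes)
obsInfo = rlNumericSpec([4 1]);
actInfo = rlNumericSpec([1 1]);
% 1) Wrap the pretrained network in a transition function approximator.
%    The input/output names must match the layer names in your dlnet.
trsFcn = rlContinuousDeterministicTransitionFunction(dlnet,obsInfo,actInfo, ...
    ObservationInputNames="obsIn", ...
    ActionInputNames="actIn", ...
    NextObservationOutputNames="nextObsOut");
% 2) Build the neural network environment around the transition model.
%    Ground-truth reward and is-done function handles are used here;
%    you can also pass reward/is-done function approximator objects.
env = rlNeuralNetworkEnvironment(obsInfo,actInfo,trsFcn,@myRewardFcn,@myIsDoneFcn);
% 3) Create the MBPO agent around a base off-policy agent (default SAC here).
baseAgent = rlSACAgent(obsInfo,actInfo);
% 4) Freeze the transition model by setting its learning rate to zero.
agentOpts = rlMBPOAgentOptions;
agentOpts.TransitionOptimizerOptions = rlOptimizerOptions(LearnRate=0);
agent = rlMBPOAgent(baseAgent,env,agentOpts);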
Hope this helps