Shallow Neural Networks trainbr trainlm alternatives

Brent on 8 May 2021
Commented: Brent McWatters on 16 May 2021
I've found the results from trainbr to be much better than trainlm for my particular use, but trainbr takes a couple of hours to train and can't take advantage of a GPU.
Can you recommend any alternatives to trainbr that might provide similar quality results yet run faster?
Thank you!

Answers (1)

Divya Gaddipati on 13 May 2021
Because the trainbr function involves a Jacobian operation, it is currently not possible to run it on a GPU.
Several other training algorithms are supported on the GPU. You can find the list with this command:
help nntrain
Neural Network Training Functions.

To change a neural network's training algorithm, set the net.trainFcn property to the name of the corresponding function. For example, to use the scaled conjugate gradient backpropagation training algorithm:

net.trainFcn = 'trainscg';

Backpropagation training functions that use Jacobian derivatives. These algorithms can be faster but require more memory than gradient backpropagation. They are also not supported on GPU hardware.

trainlm - Levenberg-Marquardt backpropagation.
trainbr - Bayesian Regularization backpropagation.

Backpropagation training functions that use gradient derivatives. These algorithms may not be as fast as Jacobian backpropagation. They are supported on GPU hardware with the Parallel Computing Toolbox.

trainbfg - BFGS quasi-Newton backpropagation.
traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.
traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.
traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.
traingd - Gradient descent backpropagation.
traingda - Gradient descent with adaptive lr backpropagation.
traingdm - Gradient descent with momentum.
traingdx - Gradient descent w/momentum & adaptive lr backpropagation.
trainoss - One step secant backpropagation.
trainrp - RPROP backpropagation.
trainscg - Scaled conjugate gradient backpropagation.

Supervised weight/bias training functions

trainb - Batch training with weight & bias learning rules.
trainc - Cyclical order weight/bias training.
trainr - Random order weight/bias training.
trains - Sequential order weight/bias training.

Unsupervised weight/bias training functions

trainbu - Unsupervised batch training with weight & bias learning rules.
trainru - Unsupervised random order weight/bias training.
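For example, switching a shallow fitting network to a GPU-supported algorithm such as trainscg looks like this (a minimal sketch: it assumes the Parallel Computing Toolbox and a supported GPU, and uses the toolbox sample dataset simplefit_dataset as a placeholder for your own data):

% Train a shallow network with a GPU-supported algorithm (trainscg)
[x, t] = simplefit_dataset;                      % placeholder data; substitute your own
net = fitnet(10);                                % shallow fitting network, 10 hidden neurons
net.trainFcn = 'trainscg';                       % scaled conjugate gradient (GPU-supported)
[net, tr] = train(net, x, t, 'useGPU', 'yes');   % 'useGPU' requires Parallel Computing Toolbox
y = net(x);
perform(net, t, y)                               % mean squared error of the trained network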
  1 Comment
Brent McWatters on 16 May 2021
Thank you for the list of alternative algorithms.
My question was: can you recommend any alternatives to trainbr that might provide similar quality results yet run faster?
Is there any information available, or that you can provide, as to which of the few dozen approaches I should try?
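One hypothetical way to narrow the choice (a rough sketch, not a recommendation made in the thread) is to benchmark a few GPU-supported trainers on your own data and compare validation performance; the fixed regularization value below is a manual stand-in for the regularization that trainbr tunes automatically, and 0.1 is illustrative only:

[x, t] = simplefit_dataset;                      % placeholder data; substitute your own
candidates = {'trainscg', 'trainrp', 'traincgb', 'trainbfg'};
for k = 1:numel(candidates)
    net = fitnet(10);                            % same shallow architecture for each trainer
    net.trainFcn = candidates{k};
    net.performParam.regularization = 0.1;       % manual weight regularization (illustrative)
    net.trainParam.showWindow = false;           % suppress the training GUI
    [net, tr] = train(net, x, t, 'useGPU', 'yes');
    fprintf('%-10s  best validation perf: %g\n', candidates{k}, tr.best_vperf);
end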

