Objective function in the Bayesian optimization algorithm used by fitrsvm and fitrgp
Dimitri on 13 May 2019
Commented: antlhem on 29 May 2021
Hello,
What is the mathematical objective function in the Bayesian optimization algorithm? The documentation says that functions like fitrsvm try to minimize log(1 + cross-validation loss), but what is the actual mathematical formula?
Is it possible to change the objective function to just the MSE?
Thank you!
Dimitri
0 comments
Accepted Answer
Don Mathis on 13 May 2019
This page says that the loss defaults to MSE. So that's the loss that's used in the log(1+cvloss) formula. Cross-validated loss is the loss computed on all the held-out validation sets combined. The default when using optimization is 5-fold cross-validation.
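In symbols (a paraphrase of the above; theta stands for a candidate set of hyperparameters and L_CV(theta) for the 5-fold cross-validated MSE obtained with them):
objective(theta) = log(1 + L_CV(theta))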
There's not an option to change the hyperparameter optimization objective function from log(1+cvloss). You would need to edit the source code to do that. The source file is matlab\toolbox\stats\classreg\+classreg\+learning\+paramoptim\createObjFcn.m. Look for the call to the log1p function.
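A minimal sketch of how you could compute that same quantity yourself for one candidate hyperparameter setting (the toy data, the fixed rng seed, and the choice of BoxConstraint as the tuned parameter are assumptions for illustration only):
% Reproduce log(1 + cvloss) for one candidate hyperparameter value
rng(0);                                     % reproducible CV partition (illustrative)
X = randn(200,3);                           % toy predictors
y = X*[1; -2; 0.5] + 0.1*randn(200,1);      % toy response
boxCandidate = 1;                           % one candidate value of BoxConstraint
cvMdl  = fitrsvm(X, y, 'BoxConstraint', boxCandidate, 'KFold', 5);
cvLoss = kfoldLoss(cvMdl);                  % 5-fold cross-validated MSE (MSE is the default regression loss)
obj    = log1p(cvLoss)                      % log(1 + cvloss), the value the optimizer minimizes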
3 comments
Don Mathis on 14 May 2019
Edited: Don Mathis on 14 May 2019
Because loss(Mdl,X,Y) is the loss of the final model on the full dataset, while the MinObjective is the log of 1 plus the out-of-sample cross-validated loss. See the kfoldLoss method for documentation of that. If you used 5-fold cross-validation, the kfoldLoss combines the losses of 5 different models, each evaluated on the 1/5 of the dataset it was not trained on. It is not the loss of the final model on the full dataset.
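A small sketch to make that contrast concrete (toy data again; the numbers are not meaningful, only the distinction between the two quantities):
rng(0);
X = randn(200,3);
y = X*[1; -2; 0.5] + 0.1*randn(200,1);
finalMdl = fitrsvm(X, y);                   % one model trained on the full dataset
fullLoss = loss(finalMdl, X, y);            % what loss(Mdl,X,Y) measures: MSE of that model on the same data
cvMdl  = crossval(finalMdl, 'KFold', 5);    % 5 models, each trained on 4/5 of the data
cvLoss = kfoldLoss(cvMdl);                  % MSE on the held-out fifths only
minObj = log1p(cvLoss)                      % the same kind of log(1 + cvloss) value that MinObjective reports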
antlhem on 29 May 2021
Hi, could you take a look at my question? https://uk.mathworks.com/matlabcentral/answers/842800-why-matlab-svr-is-not-working-for-exponential-data-and-works-well-with-data-that-fluctuates?s_tid=prof_contriblnk
More Answers (0)