gradient descent with noisy data
Hello. I am trying to fit a model to experimental data. The problem is that I am using a generative model, i.e. I simulate predictions for every set of parameters. It is very slow because every iteration takes about 20 seconds. Moreover, the predictions are a bit noisy, and MATLAB's gradient descent algorithms (fminsearch and fmincon) seem to have difficulty converging. Is there an algorithm known to be more robust (less sensitive to noise) than the others? Thanks. Baptiste
Answers (2)
Mohammad Abouali on 16 Jan 2017
Edited: Mohammad Abouali on 16 Jan 2017
Try one of the optimization methods in the Global Optimization Toolbox, such as Particle Swarm (particleswarm) or the Genetic Algorithm (ga).
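For illustration, a minimal sketch of what such calls might look like; the objective simulateAndScore, the data variable, the number of parameters, and the bounds are placeholder assumptions, not from the original post:
nvars = 3;                                    % number of model parameters (assumed)
lb = [0 0 0];  ub = [10 10 10];               % parameter bounds (assumed)
obj = @(p) simulateAndScore(p, data);         % noisy, simulation-based objective (assumed)
% Particle swarm
psOpts = optimoptions('particleswarm', 'Display', 'iter', 'SwarmSize', 30);
pBest = particleswarm(obj, nvars, lb, ub, psOpts);
% Genetic algorithm (empty matrices stand for the unused linear constraints)
gaOpts = optimoptions('ga', 'Display', 'iter', 'PopulationSize', 30);
pBest = ga(obj, nvars, [], [], [], [], lb, ub, [], gaOpts);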
3 Comments
Alan Weiss on 16 Jan 2017
Edited: Alan Weiss on 16 Jan 2017
No, all optimization functions (except lsqcurvefit) take exactly the same form of objective function. fminsearch passes a SINGLE argument to fun, but that single argument can be a vector or array.
So if it works for you with one solver but not another, then something else is going on. Please show us exactly how you are calling ga, fminsearch, and particleswarm. If you call fminsearch like this:
x = fminsearch(@(parameters)fun(parameters,data,...),x0)
then exactly that same function handle should work for all other solvers.
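A minimal sketch of that idea, assuming placeholder names fun, data, x0, lb, and ub: the same anonymous function handle is passed unchanged to several solvers.
obj = @(parameters) fun(parameters, data);        % single vector argument
x1 = fminsearch(obj, x0);                         % base MATLAB
x2 = fmincon(obj, x0, [], [], [], [], lb, ub);    % Optimization Toolbox
x3 = particleswarm(obj, numel(x0), lb, ub);       % Global Optimization Toolbox
x4 = ga(obj, numel(x0), [], [], [], [], lb, ub);  % Global Optimization Toolbox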
Alan Weiss
MATLAB mathematical toolbox documentation
John D'Errico on 16 Jan 2017
First of all, fminsearch is NOT a gradient descent algorithm. Calling it that does not make it one.
Second, large residual problems are classically a bane for nonlinear least squares. This is well known. Ok, it should be well known, as I recall reading about the issues 35 years ago or so. For example:
Note the date.
Do you want to use particle swarms or genetic algorithms or any other stochastic optimizer? Not really a good idea, IMHO, since those schemes use LOTS of extra function evaluations while still walking downhill. They are as much (or as little) a gradient descent algorithm as fminsearch is. They are generally slower to converge, though.
I don't have your model at hand, so it is somewhat difficult to make constructive suggestions. My first choice for improving robustness on large-residual problems would be a partitioned nonlinear least squares tool. But that requires the ability to partition the unknowns into a conditionally linear subset and an intrinsically nonlinear subset. Since your model is a simulation, that may well not be an option.
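For concreteness, a hedged sketch of the partitioned (separable) least-squares idea, assuming the model can be written y ≈ A(theta)*c with c conditionally linear; A, y, t, and theta0 here are illustrative placeholders, not the poster's model.
t = (0:99)';                                          % independent variable (assumed)
A = @(theta) [exp(-theta(1)*t), exp(-theta(2)*t)];    % example conditionally linear basis (assumed)
% Outer search over the intrinsically nonlinear parameters only;
% the linear coefficients are recovered by a least-squares solve on each call.
resid = @(theta) norm(y - A(theta) * (A(theta) \ y));
thetaHat = fminsearch(resid, theta0);
cHat = A(thetaHat) \ y;                               % conditionally linear coefficients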
My second suggestion is to use a robust solver. nlinfit from the Statistics Toolbox does offer a robust fitting option.
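A minimal sketch of that robust option, with modelfun, xdata, ydata, and beta0 standing in for the actual model and data:
opts = statset('nlinfit');
opts.RobustWgtFun = 'bisquare';                          % downweight points with large residuals
betaHat = nlinfit(xdata, ydata, modelfun, beta0, opts);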
Third, you will benefit greatly from good starting values for large residual problems.