
Optimization issue, always different results?

12 views (last 30 days)
Daniel Valencia on 26 Jan 2021
Commented: Daniel Valencia on 26 Jan 2021
Hello everyone,
I'm trying to optimize a set of 3 parameters by using the following function:
options = optimset('Algorithm','levenberg-marquardt', 'LargeScale','off', ...
    'DiffMaxChange',0.01, 'DiffMinChange',0.0001, 'TolFun',1e-5, 'TolX',0.001);
[x,resnorm,residual,exitflag,output,lambda,jacobian] = lsqnonlin(@HSfun,x0,lb,ub,options);
At the end of the script, the function reduces to a subtraction of vectors, so I don't consider it important to post the whole script here. The input data is read into the script from Excel, and that's it: three parameters are optimized until one of the optimization stopping criteria is reached.
Even when I use trust-region-reflective (the default optimization algorithm) instead of Levenberg-Marquardt, the optimization results are always different for the same input data, and some runs are more accurate than others.
My question is: is this normal? Do these functions always give different parameter estimates even when the input data to optimize is the same?
Thanks in advance!

Accepted Answer

Walter Roberson on 26 Jan 2021
Yes, it is normal for different algorithms for lsqnonlin to give different results. lsqnonlin is not a global optimizer and the different algorithms have different search strategies. For any given starting point, there are functions for which a given algorithm may get stuck when the others do not.
Especially with non-linear functions, different algorithms may give different emphasis to different variables. That can produce seemingly very different values for a variable that turns out not to contribute much (so the difference in function value is small), and it can produce very different values for a variable that matters a lot in its best range but also has regions where it makes little difference, so one algorithm might not explore that range as thoroughly as another.
When you use lsqnonlin you should rarely assume that the result you get out is the global minimum.
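One way to check this in practice is to run lsqnonlin from several different starting points and compare the local solutions it returns. The sketch below is not the original HSfun problem; it fits a made-up two-parameter exponential model just to illustrate the multi-start check, and fixes the random seed so the demo itself is repeatable:

% Minimal sketch (not the original HSfun): fit y = p(1)*exp(-p(2)*t) to noisy
% data, then run lsqnonlin from several random starting points and compare the
% local solutions it returns.
rng(0);                                   % fix the seed so this demo is repeatable
t = linspace(0, 5, 50)';
y = 2.5*exp(-1.3*t) + 0.05*randn(size(t));
resfun = @(p) p(1)*exp(-p(2)*t) - y;      % lsqnonlin expects the residual vector
lb = [0 0];  ub = [10 10];
opts = optimoptions('lsqnonlin', 'Algorithm', 'trust-region-reflective', 'Display', 'off');
for k = 1:5
    x0 = lb + rand(1,2).*(ub - lb);       % a different starting point each run
    [p, resnorm] = lsqnonlin(resfun, x0, lb, ub, opts);
    fprintf('start = [%5.2f %5.2f]  ->  p = [%6.3f %6.3f], resnorm = %.4g\n', x0, p, resnorm);
end

If different starts land on noticeably different parameter vectors with similar resnorm values, that is exactly the local-minimum / weakly-contributing-parameter behaviour described above.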
3 Comments
Walter Roberson on 26 Jan 2021
What is a "global local minimum"? Those terms contradict each other.
It has been proven that there are functions for which no possible algorithm can decide on the global minimum: functions for which knowing the values at any finite number of other locations gives you no information about the value at any specific location -- functions for which the only way to find the global minimum is to try every valid input one by one.
There are many, many nonlinear functions for which formal analysis of the formulas is effectively intractable. You might be able to get some broad hints by looking at their formulas, but that might only restrict your search to (say) 10^140000 possibilities.
But formal analysis of formulas implies that you have the ability to reason about formulas, which implies symbolic work. Any purely numeric work cannot do that. And all of the optimizers such as lsqnonlin are numeric optimizers, so each of them is a "black box" optimizer: all the algorithm can do is pass values into the "black box", let the black box calculate by some unknown means, and try to figure out what is going on from the numeric results. Unfortunately, black box optimizers can never be sure that they have found the global minimum. Consider, for example, if the user asks to minimize:
@(x) x.^2 - 1000.*(x == 3.13432091)
If the optimizer does not happen to pass in 3.13432091 exactly, it will never see the large negative value the function takes there, so as far as any numeric optimizer can tell, the minimum is at x = 0 rather than at 3.13432091.
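A quick way to see this numerically (using fminsearch here rather than lsqnonlin, simply because it takes a scalar objective directly):

% The spike function above: the true global minimum is at x = 3.13432091,
% where the value drops to about -990, but a numeric optimizer only ever
% sees the smooth x.^2 part unless it evaluates that exact point.
f = @(x) x.^2 - 1000.*(x == 3.13432091);
[xmin, fmin] = fminsearch(f, 1)           % reports xmin near 0 and fmin near 0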
So... when you are using pure numeric optimization techniques, you cannot be sure of finding global minima even for functions that are not very complicated, and numeric optimization routines can struggle to optimize "bumpy" functions even if there is no "trick" to the function.
Daniel Valencia on 26 Jan 2021
Walter, thank you very much. I understand now.


More Answers (0)
