Solving nonlinear equations using lsqnonlin
Hi
I am trying to solve a set of nonlinear equations using lsqnonlin. Initially I used the 'levenberg-marquardt' algorithm, but I was getting arbitrary results, far from what I expected. I then tried lsqnonlin again with upper and lower bounds defined for the variables. However, the variables now cling to either the upper or the lower bound, so the results are once again incorrect. What might be the reason for this behaviour?
Here is the code I am using (there are 36 to 56 variables; I am showing just 2 as an example):
lb = [0.5, -0.5];   % lower bounds
ub = [1.5, 0.5];    % upper bounds
x0 = [1, 0];        % initial guess
options = optimoptions(@lsqnonlin, 'Algorithm', 'trust-region-reflective');
[x, res] = lsqnonlin(fun, x0, lb, ub, options);
2 comments
Ameer Hamza
on 28 Nov 2020
Edited: Ameer Hamza on 28 Nov 2020
This is likely caused by your objective function 'fun'. If its minimum within the feasible region lies on (or beyond) the boundary, the optimizer will push the variables to the extreme limits.
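To illustrate the effect with a hypothetical residual (not the original 'fun'): if the residual keeps shrinking as a variable grows, the solution lands exactly on the bound.

```matlab
% Hypothetical one-variable example: |fun(x)| decreases monotonically
% for x > 0, so the constrained minimum sits at the upper bound.
fun = @(x) 1./(1 + x);
opts = optimoptions(@lsqnonlin, 'Algorithm', 'trust-region-reflective');
x = lsqnonlin(fun, 1, 0.5, 1.5, opts);
% x is returned at (or numerically very near) the upper bound 1.5
```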
Anuj Kumar Sahoo
on 30 Nov 2020
Accepted Answer
More Answers (1)
Ameer Hamza
on 30 Nov 2020
0 votes
As Walter already explained, there is no guaranteed way to get a globally optimum solution for an arbitrary problem using a numerical method. Because of the way they are formulated, gradient-based methods can at best reach a locally optimal solution, depending on the initial guess. Some metaheuristic optimizers, such as the genetic algorithm ga() or the particle swarm optimizer particleswarm(), have a higher chance of reaching a global solution, but even then a globally optimum solution can never be guaranteed. However, you can increase the probability in several ways; the Global Optimization Toolbox provides the necessary tools for that.
For example, see GlobalSearch(): https://www.mathworks.com/help/gads/globalsearch.html or MultiStart(): https://www.mathworks.com/help/gads/multistart.html. They provide a systematic way to run the common optimizers, such as fmincon(), from several different starting points, in the hope that one of them leads to the global solution. Similarly, check ga(): https://www.mathworks.com/help/gads/ga.html and particleswarm(): https://www.mathworks.com/help/gads/particleswarm.html.
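A minimal MultiStart sketch for a bounded lsqnonlin problem, using the 2-variable bounds from the question ('myResiduals' is a placeholder for the actual residual function):

```matlab
% MultiStart runs lsqnonlin from many starting points inside the bounds
% and keeps the best local solution found.
lb = [0.5, -0.5];
ub = [1.5, 0.5];
x0 = [1, 0];
problem = createOptimProblem('lsqnonlin', ...
    'objective', @myResiduals, ...   % placeholder residual function
    'x0', x0, 'lb', lb, 'ub', ub);
ms = MultiStart;
[xbest, fbest] = run(ms, problem, 50);   % try 50 start points
```

If the runs all converge to points on the bounds, that is further evidence the minimum genuinely lies on the boundary of the feasible region rather than an artifact of one bad starting point.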
8 comments
Anuj Kumar Sahoo
on 2 Dec 2020
Ameer Hamza
on 2 Dec 2020
Without being able to run this example, it is difficult to say anything. Can you attach a small example that we can run? That will make it easier to suggest something.
Anuj Kumar Sahoo
on 2 Dec 2020
Edited: Walter Roberson on 2 Dec 2020
Walter Roberson
on 2 Dec 2020
You are using abs(). However, the derivative of abs() is discontinuous, which is a problem for fsolve(). Since you are using abs()^2, you might be able to rewrite the calculations without abs(): in particular, abs(a)^2 == a.*conj(a).
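The identity is easy to check numerically; the a.*conj(a) form avoids the non-differentiable abs():

```matlab
% abs(a).^2 and a.*conj(a) agree elementwise for complex inputs,
% but the second form has smooth derivatives.
a  = [1+2i, -3i, 0.5];
r1 = abs(a).^2;       % squared magnitudes
r2 = a .* conj(a);    % same values, no abs()
max(abs(r1 - r2))     % zero up to round-off
```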
Anuj Kumar Sahoo
on 3 Dec 2020
Anuj Kumar Sahoo
on 3 Dec 2020
Walter Roberson
on 5 Dec 2020
The editor points out that you assign to gblce and gblco but never use those variables. However, on the line after you assign to gblce you use blce, and on the line after you assign to gblco you use blco. Is it possible that you should have used the g* versions of the variables on those lines?
Anuj Kumar Sahoo
on 5 Dec 2020