# Nonlinear optimization using MATLAB built-in functions

Giovanni on 1 Oct 2015
Commented: Giovanni on 2 Oct 2015
Hello everybody,
I have a question regarding the optimization algorithms. I have used some of them so far for system identification, and at the moment I am trying to use an optimizer to find the optimal inputs to a system that I am simulating. The output of the system depends on N input parameters. I have a target value for the output, and I define my error function as the difference between the target and the output value obtained with the current combination of parameters at the current iteration. After every new simulated point I check the gradient of the new values and use it to set the next step, so essentially this is gradient-descent optimization. Normally one could do this in MATLAB with the built-in optimizer functions, but they always seem to require knowledge of the function to optimize and its gradient. Is there any way to use those functions without having the whole regression matrices and vectors in advance, or do I need to develop those iterative solutions myself?

John D'Errico on 1 Oct 2015
Edited: John D'Errico on 1 Oct 2015
No. The tools in MATLAB absolutely do NOT require knowledge of the gradient!
In general, the optimization tools in MATLAB require nothing more sophisticated than a "black box": a function that you can pass parameters into and that returns an objective value.
As it is, you claim that you are computing the gradient anyway. So why would that be a problem? That you are using what is essentially steepest descent suggests that you would be far better off using a more intelligent tool.
Since I do not know what problem you are solving, nor how many parameters are involved in the estimation, I cannot suggest a specific optimizer. fminsearch is a simple choice if you have no constraints and up to roughly 6 or so parameters. Larger problems would normally go to one of the routines in the Optimization Toolbox. Or you can always look to the File Exchange if you lack that toolbox.
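To make the "black box" point concrete, here is a minimal sketch of fminsearch on a standard test function (the Rosenbrock function, chosen here purely as an illustration, not taken from the thread):

```matlab
% fminsearch treats the objective as a black box: it only calls the
% function handle with a parameter vector and reads back a scalar.
% No gradient is ever supplied or requested.
rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;  % classic test function

x0 = [-1.2, 1];                       % starting guess
[xmin, fmin] = fminsearch(rosen, x0); % derivative-free Nelder-Mead search
% xmin should end up near [1, 1], where the function's minimum lies
```

Any function handle with that shape works, including one that internally calls an external simulator.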

Giovanni on 2 Oct 2015
Hi John,
thanks for your answer. Let me try to explain better what I am doing so my concern becomes clearer. I have interfaced MATLAB with an external simulation environment, which is the system that I want to optimize locally. My system takes three parameters as input (each with its own range of variation) and gives me back the output value for that combination. I get one point out of every iteration, and I am trying to use this information to run my optimization: at every loop I only have available the current values of the three variables and of the output. I want to use this information to iteratively reach the minimum of my error function, which is defined as the distance of the current output from the target value I am looking for. The problem is that with steepest descent I apparently don't get good performance (the fixed alpha value, the gradient looks tricky), so I would like to use another algorithm. Many of the functions in the Optimization Toolbox are already iterative by themselves, I guess; is there anything I could use to perform a direct-search optimization? I hope this gives a better overview of my doubt.
John D'Errico on 2 Oct 2015
With only 3 parameters, as I said, fminsearch will be entirely adequate. It will be better than steepest descent, which is generally a poor choice for optimization.
Your objective function for fminsearch would be something like (y_target - f(X))^2, which is minimized at the target value. Of course, you could still use tools from the optimization toolbox.
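A sketch of that objective, assuming the external simulator is exposed to MATLAB as a function `runSimulation(p)` returning one scalar output (the name is a hypothetical stand-in for the poster's actual interface):

```matlab
% Squared-error objective around a black-box simulator, as suggested above.
% yTarget and runSimulation are placeholders for the user's own values.
yTarget = 3.7;                                   % assumed target output
errFun  = @(p) (yTarget - runSimulation(p))^2;   % minimized at the target

p0   = [0.5; 1.0; 2.0];          % initial guess for the three parameters
pOpt = fminsearch(errFun, p0);   % derivative-free search from p0
```

Squaring the residual makes the objective smooth at the minimum, which Nelder-Mead handles better than the kink produced by an absolute value.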
In any case you will encounter one serious problem. There will be infinitely many solutions, so depending on your starting point, you will probably find a different solution.
You are essentially trying to solve one equation in three unknowns. The result will be an infinite set of solutions in general. Think of it as a surface in the 3-dimensional parameter space. Or you might call it a 2-manifold, embedded in the 3 dimensional parameter space.
For example, what is the solution to the problem
f(x,y,z) = x^2 + y^2 + z^2 = 2 ?
One can describe the solution locus as the surface of a sphere. If you used an optimizer to solve the problem as you intend to do, then the answer would depend entirely on where you started the solver. It will find a solution on the surface of the sphere, but which point depends on where you start.
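The sphere example can be tried directly (a small sketch, not from the thread itself):

```matlab
% Non-uniqueness demo: minimize the squared residual of
% x^2 + y^2 + z^2 = 2 from two different starting points.
g = @(v) (v(1)^2 + v(2)^2 + v(3)^2 - 2)^2;

s1 = fminsearch(g, [1 0 0]);    % one start
s2 = fminsearch(g, [0 -1 1]);   % another start

% Both land on the sphere, i.e. sum(s.^2) is close to 2 for each,
% but s1 and s2 are different points on that surface.
```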
Giovanni on 2 Oct 2015
Hello John. Thanks again for your precious comment. I have written a function that calls my simulator with some initial values for my parameters. Then I have defined my error function as the absolute value of my target value minus the value obtained by the simulation. It just wasn't clear to me at the beginning how to define my target function and how to properly use the fminsearch algorithm. About the solution pointing toward a local minimum: I know that this could come into play, but the functions I am working with should in general have just one minimum. A second aspect that I didn't take into account is that the optimizer also allows negative values of the parameters, which is not valid in my case. So I really guess I need to find another method to use.
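The thread ends here, but for completeness: the standard toolbox route for nonnegative (or range-limited) parameters is bound constraints in fmincon. A sketch, again with `runSimulation`, the target, and the bounds as hypothetical placeholders:

```matlab
% Bound-constrained version using fmincon (requires Optimization Toolbox).
% lb/ub encode the three allowed parameter ranges; here lb = 0 rules out
% the negative values the poster wants to exclude.
yTarget = 3.7;                                   % assumed target output
errFun  = @(p) (yTarget - runSimulation(p))^2;   % same objective as before

lb = [0; 0; 0];          % lower bounds: no negative parameters
ub = [10; 5; 1];         % upper bounds (assumed ranges)
p0 = (lb + ub)/2;        % start in the middle of the box

% Empty arrays skip the linear and equality constraints.
pOpt = fmincon(errFun, p0, [], [], [], [], lb, ub);
```

Without the toolbox, John D'Errico's fminsearchbnd on the File Exchange wraps fminsearch with the same kind of box bounds.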