Analytical Hessian for large-scale optimization in fmincon

6 views (last 30 days)
Hi everyone,
I have set up an optimization problem in MATLAB that stems from an inverse design problem. I tested my setup with small test cases (decision space of ~80 variables) and fmincon works like a charm. I did some calculations to get rough estimates of my actual optimization problem, and here is what it looks like:
Decision space: ~20,000 variables
Nonlinear equality constraints: ~15,000
Linear inequality constraints: ~5,000
Clearly, it's a large-scale optimization problem and, per MATLAB's recommendations, I should use the interior-point method. I would also like to note that I have access to the gradient of the cost function and possibly the Hessian (likely going to be ridiculously large, and I haven't coded it up yet). Here are a few questions that I would very much appreciate responses to:
  1. Can fmincon efficiently handle a problem such as mine, or am I better off using a dedicated commercial solver?
  2. I think I would absolutely need to use L-BFGS for my Hessian approximation (when I don't pass an analytical Hessian). Because L-BFGS approximates the Hessian using limited memory, is passing an analytical Hessian worth it? I ask because I am under the impression that if I pass an analytical Hessian, I will not be able to use the limited-memory approximation, and the interior-point algorithm will use the full Hessian at each iteration to converge to a solution.
  3. Does fmincon work if I use the sparse function to pass an analytical hessian? (I am fairly certain that my Hessian is going to be sparse)
Thank you for your time. Please feel free to ask for any clarification.
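For reference, the limited-memory approximation mentioned in question 2 can be requested explicitly through the solver options. A sketch (the memory size 10 is just an illustrative value):

```matlab
% Sketch: interior-point fmincon with a limited-memory BFGS Hessian
% approximation. {'lbfgs',10} keeps the 10 most recent iterations;
% plain 'lbfgs' uses the default memory.
opts = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'SpecifyObjectiveGradient',true, ...
    'HessianApproximation',{'lbfgs',10});
```

Note that HessianApproximation is ignored once an analytical Hessian is supplied via HessianFcn, which is the trade-off the question is asking about.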

Accepted Answer

Matt J
Matt J on 24 Apr 2021
Can fmincon efficiently handle a problem such as mine, or am I better off using a dedicated commercial solver?
The only way to find out is to try...
Does fmincon work if I use the sparse function to pass an analytical hessian? (I am fairly certain that my Hessian is going to be sparse)
Yes. You can also use a Hessian multiply function to conserve memory.
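To make the answer concrete, here is a sketch of supplying a sparse analytical Hessian to interior-point fmincon (the objective Hessian used here is a made-up example; hessfcn must return the Hessian of the Lagrangian, i.e. objective plus weighted constraint Hessians):

```matlab
% Sketch: pass a sparse Hessian of the Lagrangian via HessianFcn.
% An analytical Hessian requires analytical gradients as well.
opts = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'SpecifyObjectiveGradient',true, ...
    'SpecifyConstraintGradient',true, ...
    'HessianFcn',@hessfcn);

% Local function (in its own file, or at the end of a script/function file).
function H = hessfcn(x, lambda)
    n  = numel(x);
    Hf = spdiags(2*ones(n,1), 0, n, n);   % sparse objective Hessian (example)
    Hc = sparse(n, n);                    % accumulate constraint Hessians here,
    % e.g. Hc = Hc + lambda.eqnonlin(k)*Hk for each nonlinear equality k
    H  = Hf + Hc;                         % sparse Hessian of the Lagrangian
end
```

If even a sparse Hessian is too large to form, the HessianMultiplyFcn option (with 'SubproblemAlgorithm','cg') lets you supply only the Hessian-times-vector product instead of the matrix itself.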
  5 comments
Matt J
Matt J on 28 Apr 2021
As I recall, the general theory of BFGS is that, for quadratic functions, it converges in at most N iterations, where N is the number of unknowns. With an exact Hessian, of course, convergence occurs within one iteration.
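The one-iteration claim for an exact Hessian can be checked directly on a small quadratic (the matrix and vectors below are arbitrary illustrative values):

```matlab
% For f(x) = 0.5*x'*A*x - b'*x with exact Hessian A, a single Newton
% step from any start lands on the minimizer (where A*x = b).
A  = [4 1; 1 3];      % symmetric positive definite Hessian
b  = [1; 2];
x0 = [10; -7];        % arbitrary starting point
g  = A*x0 - b;        % gradient at x0
x1 = x0 - A\g;        % one Newton step with the exact Hessian
% x1 satisfies A*x1 = b exactly, so the gradient at x1 is zero
```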
Chaitanya Awasthi
Chaitanya Awasthi on 28 Apr 2021
Yes, I believe that is true.


More Answers (0)

Version

R2019b
