The optimization algorithms

Optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for one- and two-dimensional objective functions.
144 Downloads
Updated 18 Apr 2023


The optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for objective functions in one and two dimensions. The Newton and quasi-Newton methods may run into difficulties, for instance when the Hessian is too complex to compute or does not exist. Moreover, the need to invert a matrix at each iteration can be prohibitive for optimization problems involving many variables, so these methods can become impractical. An alternative is the family of gradient descent algorithms, which require neither explicit computation nor approximation of the Hessian. A gradient descent algorithm is implemented by choosing successive descent directions and the amplitude of the descent step in each chosen direction. This family of algorithms is widely used for optimizing problems of varying complexity. The term descent arises because these algorithms search for an extremum by moving in the direction opposite to the objective function's gradient.
Explanatory algorithmic schemes are available in the user guide.
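As a minimal MATLAB sketch of the contrast described above (illustrative only, not code from this package; the quadratic test function f(x, y) = x^2 + 5y^2, the starting point, the fixed step size, and the tolerance are all assumptions), the following minimizes the same function once with Newton's method, which needs the Hessian and a linear solve at each iteration, and once with gradient descent, which needs only the gradient:

f     = @(x) x(1)^2 + 5*x(2)^2;   % illustrative objective (not from the package)
gradf = @(x) [2*x(1); 10*x(2)];   % its gradient
H     = [2 0; 0 10];              % its (constant) Hessian

% Newton's method: a Hessian solve at every iteration
xN = [4; -3];
for k = 1:50
    g = gradf(xN);
    if norm(g) < 1e-8, break; end
    xN = xN - H\g;                % Newton step; exact in one step on a quadratic
end

% Gradient descent: only a descent direction and a step amplitude
xG    = [4; -3];
alpha = 0.05;                     % fixed step size (assumed)
for k = 1:1000
    g = gradf(xG);
    if norm(g) < 1e-8, break; end
    xG = xG - alpha*g;            % move opposite to the gradient
end

fprintf('Newton:  (%.4f, %.4f)\n', xN(1), xN(2));
fprintf('Descent: (%.4f, %.4f)\n', xG(1), xG(2));

On this quadratic, the Newton step lands on the minimizer immediately, while the fixed-step descent needs a few hundred iterations; the trade-off is that each descent iteration avoids forming and inverting the Hessian, which is what makes this family attractive for problems with many variables.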

Cite As

Kenouche Samir (2026). The optimization algorithms (https://de.mathworks.com/matlabcentral/fileexchange/128008-the-optimization-algorithms), MATLAB Central File Exchange. Retrieved.

MATLAB Release Compatibility
Created with R2023a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Version Published Release Notes
04.2023.01