Finite Difference Method to find gradient?
I need to find the gradient of an n-dimensional function by the finite difference method, in order to eventually optimize it. I can't use the built-in MATLAB functions, and I have no idea how to code finite differences for n dimensions. The function should be entered in terms of x(1), x(2), and so on (so that loops can calculate the gradient), and the dimension of the function will be taken from the size of the starting point vector. Can someone help me out with this, please? I really don't know how to code this.
1 Comment
Star Strider on 22 Nov 2015 (edited 22 Nov 2015)
This Stack Exchange post should get you started: calculate Jacobian matrix without closed form or analytical form.
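For reference, here is a minimal sketch of what such a loop could look like, assuming the objective is passed as a function handle f written in terms of x(1), x(2), ..., and the dimension is read from the length of the starting point x0. The function name fdgradient and the step-size choice are illustrative, not from the thread:

f = @(x) x(1)^2 + x(2)^2;
function g = fdgradient(f, x0)
% FDGRADIENT  Central-difference approximation of the gradient of f at x0.
%   f  - function handle, e.g. @(x) x(1)^2 + x(2)^2
%   x0 - starting point vector; its length sets the dimension n
    x0 = x0(:);                           % work with a column vector
    n  = numel(x0);                       % dimension from the starting point
    g  = zeros(n, 1);                     % preallocate the gradient
    h  = eps^(1/3) * max(1, abs(x0));     % per-coordinate step size (a common choice for central differences)
    for k = 1:n
        e    = zeros(n, 1);
        e(k) = h(k);                      % perturb only coordinate k
        g(k) = (f(x0 + e) - f(x0 - e)) / (2*h(k));   % central difference
    end
end

Example use, with a made-up objective:

f = @(x) (x(1) - 1)^2 + 3*x(2)^2 + x(3)^4;
g = fdgradient(f, [0; 0; 1])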
Answers (0)