Symbolic derivation - toy example: OLS regression
I want to apply MATLAB's Symbolic Math Toolbox to find solutions to some matrix calculus problems.
As a toy example, I would like to derive the optimal weights in an ordinary least squares regression problem. Linear regression, as well as its analytic solution, is widely known and thus hopefully provides a relatable problem case.
Derivation by Hand
Starting from the least squares error as minimization objective, it follows:
E(w) = (y - X'*w)' * (y - X'*w)
We differentiate with respect to the weights:
dE/dw = -2*X*y + 2*X*X'*w
Setting this to zero and solving for w:
w = inv(X*X') * X * y
Derivation using the Symbolic Toolbox
Now how can this be done using the symbolic toolbox?
We can specify y, w and X as
X = sym('X', [3 5], 'real')
y = sym('y', [5 1], 'real')
w = sym('w', [3 1], 'real')
It would be preferable to work in arbitrary dimensions p and q. Is this possible? I could not find anything about it in the documentation.
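As far as I can tell, sym requires concrete numeric dimensions, so the closest workaround I can think of is to parameterize the sizes as ordinary variables chosen up front; a minimal sketch:

```matlab
% Sketch: sym needs concrete integer sizes, so parameterize them
% instead of making the dimensions themselves symbolic.
p = 3;  % number of features (rows of X)
q = 5;  % number of samples (columns of X)
X = sym('X', [p q], 'real');
y = sym('y', [q 1], 'real');
w = sym('w', [p 1], 'real');
```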
State the Least Squares Error:
E1(w) = norm(y - X'*w)
E2(w) = (y - X'*w)' * (y - X'*w)
E3(w) = y'*y - 2*w'*X*y + w'*X*X'*w
However, if I subtract one from another, there are terms remaining, even though we know they should cancel out. Where am I going wrong here?
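A check I would try (this is my own guess at what is going on, not verified): E1 is the norm itself, not the squared norm, so E1 can only agree with E2 after squaring, and the differences may need expansion before simplify can cancel the terms:

```matlab
% E1 is the unsquared norm, so E1 itself will not equal E2 or E3.
d23 = simplify(expand(E2(w) - E3(w)));    % should reduce to 0
% Comparing E1 requires squaring first; note that the symbolic norm
% may keep abs() terms even with the 'real' assumption.
d12 = simplify(expand(E1(w)^2 - E2(w)));
```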
Continuing with the gradient with respect to the vector w gives:
Edw = gradient(E3,w)
Setting it to zero and solving for w:
solve(Edw == 0, w)
This returns a struct with three fields (w1, w2, w3), each an empty sym with no solution values.
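For comparison, here is a sketch of the whole pipeline using plain sym expressions instead of symfuns (my assumption is that the symfun indirection is part of the problem), with the hand-derived closed form included as a cross-check:

```matlab
% Sketch: work with plain sym expressions rather than symfuns.
X = sym('X', [3 5], 'real');
y = sym('y', [5 1], 'real');
w = sym('w', [3 1], 'real');
E = (y - X'*w)' * (y - X'*w);     % 1x1 sym, treated as a scalar
Edw = gradient(E, w);             % 3x1 gradient vector
sol = solve(Edw == 0, w);         % struct with fields w1, w2, w3
wsol = [sol.w1; sol.w2; sol.w3];
wstar = (X*X') \ (X*y);           % hand-derived closed form
res = simplify(wsol - wstar);     % should be a vector of zeros
```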
But I know the solution should be
w = inv(X*X') * X * y
How would you approach this problem?
More generally, is it possible to work at the matrix representation level rather than on its elementwise formulation? In the derivation by hand I did not have to go into the details of expanding out every matrix multiplication either. It is much cleaner that way.
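If a newer release is available, the Symbolic Math Toolbox has symbolic matrix variables (symmatrix, introduced around R2021a, with matrix-variable differentiation in later releases) that appear to be intended for exactly this matrix-level style; a sketch, assuming that feature is present in your release:

```matlab
% Sketch, assuming a release where diff supports symmatrix variables.
X = symmatrix('X', [3 5]);
y = symmatrix('y', [5 1]);
w = symmatrix('w', [3 1]);
E = (y - X'*w)' * (y - X'*w);   % stays in matrix notation, not elementwise
dE = diff(E, w)                 % matrix-level derivative with respect to w
```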
Thank you in advance