gradient descent for custom function
I have four equations:
1) e = m - y
2) y = W_3 * h
3) h = z + W_2 * z + f
4) f = W_1 * x
I want to update W_1, W_2, and W_3 to minimize the cost function J = e^T e using gradient descent.
x is an input, y is the output, and m is the desired value for each sample in the dataset.
I would like to do
W_1 = W_1 - eta * grad(J)_W_1
W_2 = W_2 - eta * grad(J)_W_2
W_3 = W_3 - eta * grad(J)_W_3
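For concreteness, here is a minimal sketch of that update loop (all sizes, data, and the learning rate below are assumptions for illustration; the gradient formulas follow from applying the chain rule to equations 1-4):

% Minimal sketch of the manual update loop; sizes, data, and eta are
% assumptions for illustration (single sample).
nx = 5; nz = 4; ny = 3;
nh = nz;                               % h = z + ... forces size(h) == size(z)
x = randn(nx, 1); z = randn(nz, 1); m = randn(ny, 1);
W_1 = randn(nh, nx); W_2 = randn(nh, nz); W_3 = randn(ny, nh);
eta = 1e-3;

for iter = 1:1000
    % forward pass through the four equations
    f = W_1 * x;
    h = z + W_2 * z + f;
    y = W_3 * h;
    e = m - y;                         % J = e.' * e

    % chain rule: dJ/dy = -2*e, dJ/dh = -2*W_3.'*e
    gW_3 = -2 * e * h.';
    gW_2 = -2 * (W_3.' * e) * z.';
    gW_1 = -2 * (W_3.' * e) * x.';

    % gradient-descent updates
    W_1 = W_1 - eta * gW_1;
    W_2 = W_2 - eta * gW_2;
    W_3 = W_3 - eta * gW_3;
end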
Going through the documentation, I found that you can train standard neural networks, but my model involves custom functions, so I guess I would need a built-in optimization function instead.
Any ideas?
2 comments
Matt J
on 24 Apr 2024
x is an input, y is the output, and m is the desired value for each sample in the dataset.
It looks like z is also an input; it is not given by any of the equations.
Answers (2)
Matt J
on 24 Apr 2024
Edited: Matt J
on 24 Apr 2024
so I guess I would need a built-in optimization function instead.
No, not necessarily. Your equations can be implemented with fullyConnectedLayer and additionLayer objects.
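For instance, a sketch of such a layer graph (the layer names and sizes here are assumptions, and the biases are frozen at zero to match the bias-free equations):

% Sketch: map the four equations onto standard Deep Learning Toolbox layers.
nx = 5; nz = 4; nh = nz; ny = 3;       % assumed dimensions; nh must equal nz

lgraph = layerGraph();
lgraph = addLayers(lgraph, featureInputLayer(nx, Name="x"));
lgraph = addLayers(lgraph, featureInputLayer(nz, Name="z"));

% f = W_1 * x
lgraph = addLayers(lgraph, fullyConnectedLayer(nh, Name="W1", ...
    BiasLearnRateFactor=0, BiasInitializer="zeros"));
% W_2 * z
lgraph = addLayers(lgraph, fullyConnectedLayer(nh, Name="W2", ...
    BiasLearnRateFactor=0, BiasInitializer="zeros"));
% h = z + W_2*z + f
lgraph = addLayers(lgraph, additionLayer(3, Name="add"));
% y = W_3 * h
lgraph = addLayers(lgraph, fullyConnectedLayer(ny, Name="W3", ...
    BiasLearnRateFactor=0, BiasInitializer="zeros"));

lgraph = connectLayers(lgraph, "x",  "W1");
lgraph = connectLayers(lgraph, "z",  "W2");
lgraph = connectLayers(lgraph, "z",  "add/in1");   % the bare z term in h
lgraph = connectLayers(lgraph, "W2", "add/in2");
lgraph = connectLayers(lgraph, "W1", "add/in3");
lgraph = connectLayers(lgraph, "add", "W3");

net = dlnetwork(lgraph);   % train with a custom loop or trainnet with "mse",
                           % which matches J = e.'*e up to a constant factor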
3 comments
Torsten
on 24 Apr 2024
Moved: Torsten
on 24 Apr 2024
e = m - y = m - W_3*h = m - W_3*(z + W_2*z + W_1*x)
Now if you write this as
e = W1*z + W2*x - m
(the sign flip relative to e = m - y does not change the least-squares objective) with
W1 = W_3 + W_3*W_2 and W2 = W_3*W_1
your problem becomes
min: || [z.', x.'] * [W1.'; W2.'] - m ||_2
and you can use "lsqlin" to solve it.
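For example, a sketch of that lsqlin call for the scalar-output case (all sizes and data below are placeholders):

% Sketch of the linear least-squares reformulation; sizes/data are assumptions.
nSamples = 100; nz = 4; nx = 5;
Z = randn(nSamples, nz);      % row i holds z_i.'
X = randn(nSamples, nx);      % row i holds x_i.'
m = randn(nSamples, 1);       % desired outputs m_i

C = [Z, X];                   % row i is [z_i.', x_i.']
w = lsqlin(C, m, [], []);     % unconstrained: equivalent to C \ m
W1 = w(1:nz).';               % combined weight W1 = W_3 + W_3*W_2
W2 = w(nz+1:end).';           % combined weight W2 = W_3*W_1

Note that this recovers only the combined weights W1 and W2; the individual factors W_1, W_2, and W_3 are not uniquely determined by them.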
10 comments