This challenge is to return the weight deltas dWH and dWP, given X, WH, WP, and EPY, using ReLU on the hidden layer and Softmax on the output layer. The test cases accumulate dWP and dWH to train neural nets for a Counter, a Subtractor, and a Mux. Each test case has four output classes; ReLU performs well with multiple output classes.
[dWP,dWH]=Neural_Back_Propagation_ReLU(X,WH,WP,EPY)
The MATLAB LaTeX code for producing the back-propagation chart is included in the template.
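The forward and backward passes behind the requested deltas can be sketched as follows. This is an illustrative NumPy (Python) sketch rather than the MATLAB solution itself, and it assumes EPY is the expected (target) output and that the loss is cross-entropy, so the output-layer error is simply PY - EPY; the variable names mirror the problem's inputs and outputs, but the exact conventions of the test cases may differ.

```python
import numpy as np

def neural_back_propagation_relu(X, WH, WP, EPY):
    """One back-propagation step for a net with a ReLU hidden layer
    and a Softmax output layer (cross-entropy loss assumed)."""
    # Forward pass: hidden layer with ReLU activation
    H = np.maximum(0.0, X @ WH)
    # Forward pass: softmax output (shifted by the row max for stability)
    Z = H @ WP
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    PY = E / E.sum(axis=1, keepdims=True)
    # Backward pass: with cross-entropy loss the output error is PY - EPY
    dOut = PY - EPY
    dWP = H.T @ dOut                      # gradient w.r.t. output weights
    dH = (dOut @ WP.T) * (H > 0)          # ReLU gates the back-propagated error
    dWH = X.T @ dH                        # gradient w.r.t. hidden weights
    return dWP, dWH
```

A training loop would accumulate (or directly apply) dWP and dWH over the test cases, e.g. `WP -= lr * dWP; WH -= lr * dWH`.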