
How to avoid Inf values when writing deep learning code?

3 views (last 30 days)
ferda sonmez on 29 Mar 2019
Hi,
I wrote deep learning code that includes the following softmax function. During training I start to get Inf values (and consequently NaN values) in some matrix multiplication operations or as the result of the softmax operation.
I also tried other softmax implementations that I found on the internet and in books, with no improvement.
Getting these NaN values even in the first training epoch, and on the very first samples (such as the 5th sample), causes the model to train incorrectly.
To simplify my question I have not included information about the number of nodes in the input, output, and hidden layers, because I think this problem occurs independently of those numbers. I can provide more information on request.
Best Regards,
Ferda Özdemir Sönmez
function y = Softmax(x)
ex = exp(x);
y = ex/sum(ex);
end
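A common source of these Infs is overflow in `exp` for large logits (e.g. `exp(1000)` is already Inf in double precision). A standard remedy, not taken from the post itself, is to subtract `max(x)` before exponentiating: softmax is shift-invariant, so the result is unchanged, but every exponent becomes ≤ 0 and cannot overflow. A minimal sketch of the idea in Python (the post's code is MATLAB, but the same one-line change applies there):

```python
import math

def softmax(x):
    """Numerically stable softmax: shifting by max(x) leaves the
    result unchanged but keeps every exponent <= 0, so exp() cannot
    overflow to inf."""
    m = max(x)
    ex = [math.exp(v - m) for v in x]
    s = sum(ex)
    return [v / s for v in ex]

# A logit of 1000 would overflow the naive exp(1000),
# but the shifted version stays finite.
print(softmax([1000.0, 1000.0]))  # -> [0.5, 0.5]
```

In MATLAB the equivalent change inside the posted function would be `ex = exp(x - max(x));` before the division.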

Answers (0)

Products


Version

R2018b
