Kullback-Leibler Divergence for NMF in Matlab
fadams18
on 3 Jan 2019
Answered: Matt Tearle on 16 Jan 2019
I am trying to write the Kullback-Leibler divergence (KLD) equation in MATLAB by looking at how the Euclidean distance was written.
The Euclidean distance for matrix factorization has the following structure:
sum_{i,j} ( X_ij - X_hat_ij )^2 = || X - X_hat ||_F^2
where X is the original matrix and X_hat is the product W*H, which reduces to this MATLAB code:
f = norm(X - W * H,'fro')^2
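The Frobenius-norm form and the explicit double sum are the same quantity; here is a minimal sketch checking that numerically (the matrix sizes are made up for illustration):

```matlab
% Small made-up example: 4x3 data matrix, rank-2 factorization
rng(0);
X = rand(4,3);
W = rand(4,2);
H = rand(2,3);

f_norm = norm(X - W*H, 'fro')^2;    % Frobenius norm squared
f_sum  = sum(sum((X - W*H).^2));    % explicit double sum over entries
% f_norm and f_sum agree up to floating-point rounding
```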
Now I have the Kullback-Leibler divergence, whose structure is:
D(X || X_hat) = sum_{i,j} ( X_ij * log( X_ij / X_hat_ij ) - X_ij + X_hat_ij )
where X is the original matrix and X_hat is the product W*H.
I wish to write this in MATLAB, but I am confused about how to deal with the summation; in the Euclidean case, for instance, the double sum is suddenly replaced by a call to the norm function.
Could someone help me write a decent code for this expression? Thanks.
Accepted Answer
Matt Tearle
on 16 Jan 2019
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the math sense (natural log) rather than engineering (base 10). If it's base 10, then use the log10 function instead.
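One caveat worth adding (an assumption on my part, not part of the answer above): if X contains exact zeros, X .* log(X ./ X_hat) produces NaN from 0*log(0), which by the usual KL convention should count as 0. A hedged sketch that guards against this:

```matlab
% Sketch: KL divergence for NMF with the 0*log(0) = 0 convention.
% Assumes X is nonnegative and X_hat = W*H is strictly positive.
X_hat = W * H;
div = X_hat - X;                    % linear terms, always well-defined
nz = X > 0;                         % zero entries of X contribute no log term
div(nz) = div(nz) + X(nz) .* log(X(nz) ./ X_hat(nz));
KLD = sum(div(:));
```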
More Answers (0)