Average code length and entropy
Ghady Hajj
on 16 Nov 2018
Answered: Anshuman
on 24 Aug 2024
Hello,
I have a uint16 vector and I generated a Huffman code for it using the built-in functions in MATLAB. The problem is that the entropy of this file differs from the average code length I computed from the Huffman code. Isn't the average code length supposed to equal the entropy of the file?
Thanks.
Accepted Answer
Anshuman
on 24 Aug 2024
Hi,
That's not always true. Huffman coding does not generally reach the entropy limit, because it must assign a whole number of bits to each symbol. Entropy is the theoretical ideal and may call for fractional bit lengths, so the Huffman average length can exceed it. Unless every symbol probability is a negative power of two (1/2, 1/4, 1/8, ...), the average code length will typically be strictly greater than the entropy, although it is guaranteed to stay within one bit of it: H <= L < H + 1.
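As a quick illustration, here is a minimal sketch comparing the two quantities for a small made-up source, assuming the Communications Toolbox function huffmandict is available (the symbols and probabilities below are just an example, not your data):

% Compare entropy with the average Huffman code length for a small source.
symbols = 1:4;                        % example alphabet
prob    = [0.5 0.25 0.15 0.1];        % probabilities (not all powers of two)

H = -sum(prob .* log2(prob));         % source entropy in bits/symbol

[dict, avglen] = huffmandict(symbols, prob);   % build the Huffman code

fprintf('Entropy:             %.4f bits/symbol\n', H);
fprintf('Average code length: %.4f bits/symbol\n', avglen);
% Here H is about 1.74 while avglen is 1.75, so avglen >= H but avglen < H + 1.

If you replace prob with probabilities that are all negative powers of two (for example [0.5 0.25 0.125 0.125]), the two numbers coincide exactly.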
More Answers (0)