Shannon entropy (information theory)
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with a 2% error probability. Could someone explain to me how I can calculate the Shannon entropy?
1 Comment
Akira Agata
on 4 Jul 2019
Do you mean 'channel capacity' based on the Shannon-Hartley theorem, assuming a 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy; it is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
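As a minimal sketch of that comment (assuming the transmitted sequence is stored as a numeric vector x of 0s and 1s, and that the 2% figure is a bit error rate):

% Shannon entropy of the transmitted binary sequence X
x = [1 0 0 0 1 1 0 0 1 0];        % example sequence from the question

p1 = mean(x == 1);                 % empirical probability of '1'
p0 = 1 - p1;                       % empirical probability of '0'

probs = [p0 p1];
probs = probs(probs > 0);          % drop zero-probability symbols (0*log2(0) -> 0)
H = -sum(probs .* log2(probs));    % Shannon entropy, bits per symbol

% If the 2% is interpreted as the crossover probability of a binary
% symmetric channel, its capacity would be 1 - Hb(p):
p  = 0.02;
Hb = -p*log2(p) - (1-p)*log2(1-p); % binary entropy function Hb(p)
C  = 1 - Hb;                       % bits per channel use

This only illustrates the idea that H depends on the symbol probabilities of X, not on the received sequence Y.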
Answers (0)