Shannon entropy (information theory)

8 views (last 30 days)
Saravanan Mani on 3 Jul 2019
Commented: Akira Agata on 4 Jul 2019
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with an error probability of 2%. Could someone explain to me how I can calculate the Shannon entropy?
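A minimal sketch of the setup described above, assuming the 2% figure means each transmitted bit is flipped independently with probability 0.02 (a binary symmetric channel); the sequence length of 1000 is arbitrary:

% Model the channel as a binary symmetric channel
% (assumption: 2% = independent bit-flip probability per transmitted bit)
X = randi([0 1], 1, 1000);        % random transmitted binary sequence
ber = 0.02;                       % assumed bit-error probability
flips = rand(1, numel(X)) < ber;  % bits the channel corrupts
Y = double(xor(X, flips));        % received sequence with ~2% bit errors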
  1 Comment
Akira Agata on 4 Jul 2019
Do you mean 'Channel capacity' based on the Shannon-Hartley theorem, assuming a 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy, which is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
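To make this concrete, here is a small illustration (not part of the original comment): it estimates H(X) from the relative frequencies of '0' and '1' in the transmitted bits, and separately evaluates the binary entropy of a 2% error probability, from which the capacity of a binary symmetric channel follows as 1 - Hb(0.02). The 10-bit example pattern and the bit-error interpretation of the 2% figure are assumptions.

% Shannon entropy of the transmitted bits (illustration only)
X = [1 0 0 0 1 1 0 0 1 0];                  % example sequence from the question

p1 = mean(X);                               % estimated P(bit = 1)
p  = [1-p1, p1];
p  = p(p > 0);                              % avoid log2(0); 0*log2(0) counts as 0
H  = -sum(p .* log2(p));                    % Shannon entropy, bits per symbol

ber = 0.02;                                 % assumed bit-error probability
Hb  = -ber*log2(ber) - (1-ber)*log2(1-ber); % binary entropy of the error rate
C   = 1 - Hb;                               % binary symmetric channel capacity

fprintf('H(X) = %.4f bits/symbol\n', H);
fprintf('C(BSC, p = 0.02) = %.4f bits/channel use\n', C);

With this 10-bit pattern (four ones, six zeros) the sketch gives H(X) ≈ 0.971 bits/symbol, and 1 - Hb(0.02) ≈ 0.859 bits per channel use.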


Answers (0)
