# wentropy

Entropy (wavelet packet)

## Syntax

``E = wentropy(X,T)``
``E = wentropy(X,T,P)``

## Description

`E = wentropy(X,T)` returns the entropy specified by `T` of the vector or matrix `X`.


`E = wentropy(X,T,P)` returns the entropy of `X`, where `P` is a parameter whose meaning depends on the entropy type `T`. `E = wentropy(X,T,0)` is equivalent to `E = wentropy(X,T)`.

## Examples


This example shows the different values of entropy of a random signal.

For purposes of reproducibility, reset the random seed and generate a random signal.

```
rng default
x = randn(1,200);
```

Compute the Shannon entropy of `x`.

`e = wentropy(x,'shannon')`

```
e = -224.5551
```

Compute the log energy entropy of `x`.

`e = wentropy(x,'log energy')`

```
e = -229.5183
```

Compute the threshold entropy of `x` with the threshold equal to 0.2.

`e = wentropy(x,'threshold',0.2)`

```
e = 168
```

Compute the SURE entropy of `x` with the threshold equal to 3.

`e = wentropy(x,'sure',3)`

```
e = 35.7962
```

Compute the norm entropy of `x` with the power equal to 1.1.

`e = wentropy(x,'norm',1.1)`

```
e = 173.6578
```
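For reference, the five built-in criteria used above can be computed directly from their formulas (listed under Entropy at the end of this page). The following NumPy sketch is an illustration only: `wentropy_like` is a hypothetical helper, not MathWorks code, and it handles only real vector inputs, using the natural logarithm as the toolbox does.

```python
import numpy as np

def wentropy_like(x, t, p=0.0):
    """Illustrative NumPy versions of wentropy's entropy criteria (a sketch,
    not the toolbox implementation; no input validation is performed)."""
    s = np.asarray(x, dtype=float).ravel()
    s2 = s ** 2
    if t == 'shannon':
        # E(s) = -sum(s_i^2 * log(s_i^2)), with the convention 0*log(0) = 0
        nz = s2[s2 > 0]
        return -np.sum(nz * np.log(nz))
    if t == 'log energy':
        # E(s) = sum(log(s_i^2)), with the convention log(0) = 0
        nz = s2[s2 > 0]
        return np.sum(np.log(nz))
    if t == 'threshold':
        # Number of coefficients with |s_i| > p
        return np.count_nonzero(np.abs(s) > p)
    if t == 'sure':
        # E(s) = n - #{i : |s_i| <= p} + sum(min(s_i^2, p^2))
        n = s.size
        return n - np.count_nonzero(np.abs(s) <= p) + np.sum(np.minimum(s2, p ** 2))
    if t == 'norm':
        # E(s) = sum(|s_i|^p), with 1 <= p
        return np.sum(np.abs(s) ** p)
    raise ValueError('unknown entropy type')
```

Because each criterion is an additive cost, the entropy of a signal equals the sum of the entropies of any partition of its coefficients, which is what makes these costs usable for best-basis search.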

You can use your own entropy function `ABC` with `wentropy`. Your function must be defined in a `.m` file, and the first line must be of the form:

`function e = ABC(x)`

where `x` is a vector and `e` is a real number. The new entropy can be used by typing

`e = wentropy(x,'user','ABC')`

or more directly

`e = wentropy(x,'ABC')`

The function file `myEntropy.m` returns the normalized Shannon entropy of a signal. Compute the normalized Shannon entropy of `x`.

`w = wentropy(x,'myEntropy')`

```
w = -1.1228
```
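The body of `myEntropy.m` is not reproduced on this page. One definition consistent with the outputs shown (the Shannon entropy of `x` is -224.5551, and -224.5551/200 = -1.1228 for this 200-sample signal) is the nonnormalized Shannon entropy divided by the number of samples. The NumPy sketch below encodes that assumption; treat both the function name and the normalization as guesses rather than the shipped file.

```python
import numpy as np

def my_entropy(x):
    # Hypothetical reconstruction of myEntropy.m: the (nonnormalized)
    # Shannon entropy -sum(s_i^2 * log(s_i^2)) divided by the number of
    # samples, using the convention 0*log(0) = 0. This matches the values
    # shown above but is an assumption, not MathWorks code.
    s = np.asarray(x, dtype=float).ravel()
    n = s.size
    s2 = s[s != 0.0] ** 2
    return -np.sum(s2 * np.log(s2)) / n
```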

## Input Arguments


### `X` — Input data

Input data, specified as a real-valued vector or matrix.

### `T` — Entropy type

Entropy type, specified as one of the following:

| Entropy Type (`T`) | Threshold Parameter (`P`) |
| --- | --- |
| `'shannon'` | `P` is not used. |
| `'log energy'` | `P` is not used. |
| `'threshold'` | `0 ≤ P`. `P` is the threshold. |
| `'sure'` | `0 ≤ P`. `P` is the threshold. |
| `'norm'` | `1 ≤ P`. `P` is the power. |
| `'user'` | `P` is a character vector containing the file name of your own entropy function, with a single input `x`. |
| `'FunName'` | No constraints on `P`. `FunName` is any character vector other than the entropy types listed above, and contains the file name of your own entropy function, with `x` as input and `P` as an additional parameter to your entropy function. |

`T` and the threshold parameter `P` together define the entropy criterion.

Note

The `'user'` option is retained for backward compatibility, but it is superseded by the `'FunName'` option described in the last row of the table above. The `'FunName'` option does the same as `'user'` and additionally lets you pass a parameter to your own entropy function.

### `P` — Threshold parameter

Threshold parameter, specified as a real number, character vector, or string scalar. `P` and the entropy type `T` together define the entropy criterion.

## Output Arguments

collapse all

### `E` — Entropy

Entropy of `X`, returned as a real number.

## More About

### Entropy

Functionals satisfying an additive-type property are well suited to efficient searching of binary-tree structures and the fundamental splitting property of the wavelet packet decomposition. Classical entropy-based criteria match these conditions and describe information-related properties for an accurate representation of a given signal [1]. Entropy is a common concept in many fields, mainly in signal processing. The following list gives different entropy criteria; many others are available and can be easily integrated. In the expressions below, $s$ is the signal and $(s_i)_i$ are the coefficients of $s$ in an orthonormal basis.

The entropy $E$ must be an additive cost function such that $E(0) = 0$ and

$$E(s) = \sum_i E(s_i)$$

- The (nonnormalized) Shannon entropy: $E_1(s_i) = -s_i^2 \log(s_i^2)$, so

  $$E_1(s) = -\sum_i s_i^2 \log(s_i^2)$$

  with the convention $0\log(0) = 0$.

- The concentration in $l^p$ norm entropy with $1 \le p$: $E_2(s_i) = |s_i|^p$, so

  $$E_2(s) = \sum_i |s_i|^p = \lVert s \rVert_p^p$$

- The "log energy" entropy: $E_3(s_i) = \log(s_i^2)$, so

  $$E_3(s) = \sum_i \log(s_i^2)$$

  with the convention $\log(0) = 0$.

- The threshold entropy: $E_4(s_i) = 1$ if $|s_i| > p$ and $0$ elsewhere, so $E_4(s) = \#\{i \text{ such that } |s_i| > p\}$ is the number of time instants when the signal is greater than a threshold $p$.

- The "SURE" entropy [2]:

  $$E_5(s) = n - \#\{i \text{ such that } |s_i| \le p\} + \sum_i \min(s_i^2, p^2)$$

  where $n$ is the number of coefficients of $s$ and $p$ is the threshold.

## References

[1] Coifman, R. R., and M. V. Wickerhauser. "Entropy-Based Algorithms for Best Basis Selection." *IEEE Transactions on Information Theory*, Vol. 38, Number 2, March 1992, pp. 713–718.

[2] Donoho, D. L., and I. M. Johnstone. "Ideal Denoising in an Orthonormal Basis Chosen from a Library of Bases." *Comptes Rendus Acad. Sci., Paris, Ser. I*, Vol. 319, 1994, pp. 1317–1322.
