TECHNOLOGY INSPIRATION
Technology-People-Innovation

Tutorial VI: Source Coding Fundamentals I

In this post we will discuss some fundamentals of source coding, and in later posts we will write code for some source coding algorithms. We would like to remind our readers that we are currently covering the basics of simulation; soon we will move on to OFDM and more advanced topics. So let's hit it!

Mutual Information:

The information content provided by the occurrence of the event Y = y_j about the event X = x_i is defined as

I(x_i; y_j) = log[ P(x_i | y_j) / P(x_i) ]

I(x_i; y_j) is called the mutual information between x_i and y_j.

Self Information:

When we talk about the information of a single event X = x_i, we call it the self-information, denoted as

I(x_i) = -log P(x_i) = log[ 1 / P(x_i) ]

As a matter of fact, a high-probability event conveys less information than a low-probability event; in particular, if P(x_i) = 1 then I(x_i) = 0.
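To make this concrete, here is a minimal Python sketch of the self-information formula (the plots in these posts are made in MATLAB, but the arithmetic is identical; the function name is ours, chosen for illustration). Using base-2 logarithms gives the result in bits.

```python
import math

def self_information(p):
    """Self-information I(x) = -log2(P(x)), measured in bits."""
    return -math.log2(p) if p < 1.0 else 0.0  # a certain event carries no information

# Rarer events carry more information:
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits
# while a certain event carries none:
print(self_information(1.0))    # 0.0
```

Note how halving the probability of an event adds exactly one bit of self-information, which is the intuition behind measuring information logarithmically.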

Conditional Self Information:

In addition to self-information and mutual information, we define the conditional self-information as

I(x_i | y_j) = -log P(x_i | y_j)

Average Mutual Information:

The mutual information I(x_i; y_j) is associated with a single pair of events (x_i, y_j), which are possible outcomes of the two random variables X and Y. We can obtain the average mutual information by weighting I(x_i; y_j) by the probability of occurrence of the joint event and summing over all possible joint events.

Mathematically,

I(X; Y) = Σ_i Σ_j P(x_i, y_j) log[ P(x_i, y_j) / (P(x_i) P(y_j)) ]

As a matter of fact, I(X; Y) = 0 when X and Y are statistically independent. Also, I(X; Y) ≥ 0.
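The double sum above translates almost directly into code. The following Python sketch (function name and the example joint distributions are ours, for illustration) computes I(X; Y) from a joint probability table, obtaining the marginals by summing rows and columns, and confirms both properties just stated.

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum_ij P(x_i,y_j) * log2( P(x_i,y_j) / (P(x_i) P(y_j)) ).

    `joint` is a list of rows: joint[i][j] = P(X = x_i, Y = y_j).
    """
    px = [sum(row) for row in joint]        # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
    total = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # terms with zero probability contribute nothing
                total += pxy * math.log2(pxy / (px[i] * py[j]))
    return total

# Independent X and Y: the joint factors into the marginals, so I(X;Y) = 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Fully dependent (X = Y, each value with probability 1/2): I(X;Y) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```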

Entropy:

Entropy, or average self-information, is denoted by H(X) and defined as

H(X) = -Σ_i P(x_i) log P(x_i)

Conditional Entropy:

The average conditional self-information is called the conditional entropy:

H(X | Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i | y_j)

From the equations above we also get the relation

I(X; Y) = H(X) - H(X | Y)
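This relation is easy to check numerically. The Python sketch below (helper names and the sample joint distribution are ours, for illustration) computes H(X) and H(X | Y) from a joint probability table; their difference is the average mutual information I(X; Y).

```python
import math

def entropy(p):
    """H(X) = -sum_i P(x_i) log2 P(x_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def conditional_entropy(joint):
    """H(X|Y) = -sum_ij P(x_i,y_j) log2 P(x_i|y_j), with P(x_i|y_j) = P(x_i,y_j)/P(y_j)."""
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
    h = 0.0
    for row in joint:
        for j, pxy in enumerate(row):
            if pxy > 0:
                h -= pxy * math.log2(pxy / py[j])
    return h

# An example joint distribution where X and Y are correlated but not identical
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]
hx = entropy(px)                  # H(X) = 1 bit, since P(X) is uniform
hxy = conditional_entropy(joint)  # H(X|Y): observing Y reduces uncertainty about X
print(hx - hxy)                   # I(X;Y) ≈ 0.278 bits
```

Knowing Y lowers the remaining uncertainty about X, so H(X | Y) < H(X) here and the difference is exactly the information Y provides about X.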

Here is a plot of the binary entropy function produced in MATLAB. In the next post we will discuss some leftover theorems of source coding, such as Kraft's inequality, and we will start some real coding work!

[Figure: the binary entropy function H(p) plotted against p]
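The curve in the plot above can be reproduced without MATLAB; here is a hedged Python equivalent (the function name is ours) that evaluates the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p) at a few points, showing the symmetry about p = 0.5 and the maximum of 1 bit there.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sample the curve: symmetric about p = 0.5, where it reaches its maximum of 1 bit
for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"H({p}) = {binary_entropy(p):.4f}")
```

Feeding these samples to any plotting tool (MATLAB's `plot`, or matplotlib in Python) reproduces the familiar inverted-U curve shown above.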
