In this post we will discuss some fundamentals of source coding, and in later posts we will write code for some source coding algorithms. We would like to remind our readers that we are right now covering the basics of simulation, and soon we will move on to OFDM and more advanced topics. So let's hit it!
Mutual Information:
The information content provided by the occurrence of the event Y = yj about the event X = xi is defined as
I(xi;yj) = log[ P(xi|yj) / P(xi) ]
I(xi;yj) is called the mutual information between xi and yj.
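To make the definition concrete, here is a minimal MATLAB sketch; the probabilities used (P(xi) = 0.5 before observing yj, P(xi|yj) = 0.8 after) are made-up illustration values, not from any particular example.

% Mutual information between single events, in bits (log taken to base 2).
P_xi    = 0.5;                    % prior probability of the event X = xi
P_xi_yj = 0.8;                    % conditional probability P(xi|yj)
I_xi_yj = log2(P_xi_yj / P_xi)    % about 0.678 bits: observing yj tells us something about xi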
Self Information:
When we talk about the information of a single event X = xi, we call it self information, denoted and defined as
I(xi) = -log P(xi) = log[ 1 / P(xi) ]
(When the logarithm is taken to base 2, the unit of information is the bit.)
As a matter of fact, a high probability event conveys less information than a low probability event. In the extreme case, a certain event with P(x) = 1 conveys no information at all: I(x) = 0.
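A one-line MATLAB check of this claim (the probabilities are arbitrary illustration values):

% Self information in bits for events of decreasing probability.
p = [1 0.9 0.5 0.1];
I = -log2(p)    % gives 0, 0.152, 1, 3.3219: rarer events carry more information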
Conditional Self Information:
In addition to self information and mutual information, we define the conditional self information of the event X = xi, given that Y = yj has occurred, as
I(xi|yj) = -log P(xi|yj)
Average Mutual Information:
I(xi;yj) is the mutual information associated with the pair of events (xi, yj), which are possible outcomes of the two random variables X and Y. We can obtain the average value of the mutual information by simply weighting I(xi;yj) by the probability of occurrence of the joint event and summing over all possible joint events.
Mathematically,
I(X;Y) = Σi Σj P(xi,yj) I(xi;yj) = Σi Σj P(xi,yj) log[ P(xi,yj) / (P(xi)P(yj)) ]
As a matter of fact, I(X;Y) = 0 when X and Y are statistically independent. Also, I(X;Y) ≥ 0.
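Both properties are easy to verify numerically. Below is a minimal MATLAB sketch; the 2x2 joint distribution Pxy is made up for illustration (and has no zero entries, which keeps the log2 terms finite). The second half rebuilds an independent joint distribution from the marginals and shows that its average mutual information is zero.

% Average mutual information I(X;Y) from a joint probability matrix,
% where Pxy(i,j) = P(xi, yj). All logs are base 2, so I is in bits.
Pxy = [0.4 0.1; 0.1 0.4];        % a dependent joint distribution (illustrative)
Px  = sum(Pxy, 2);               % marginal P(xi), 2x1 column
Py  = sum(Pxy, 1);               % marginal P(yj), 1x2 row
I   = sum(sum(Pxy .* log2(Pxy ./ (Px * Py))))    % about 0.278 bits, > 0

Pind = Px * Py;                  % independent joint with the same marginals
Iind = sum(sum(Pind .* log2(Pind ./ (Px * Py)))) % exactly 0, as claimed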
Entropy:
Entropy, or average self information, is denoted by H(X) and defined as
H(X) = -Σi P(xi) log P(xi)
Conditional Entropy:
The average conditional self information is called the conditional entropy:
H(X|Y) = -Σi Σj P(xi,yj) log P(xi|yj)
From the above equations we also get the relation
I(X;Y) = H(X) - H(X|Y)
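A quick MATLAB check of this relation, reusing the same made-up joint distribution as in the sketch above:

% Verify I(X;Y) = H(X) - H(X|Y) numerically.
Pxy  = [0.4 0.1; 0.1 0.4];            % illustrative joint distribution
Px   = sum(Pxy, 2);                   % marginal P(xi)
Py   = sum(Pxy, 1);                   % marginal P(yj)
Hx   = -sum(Px .* log2(Px));          % entropy H(X), 1 bit here
Pxgy = Pxy ./ (ones(2,1) * Py);       % P(xi|yj) = P(xi,yj) / P(yj)
Hxgy = -sum(sum(Pxy .* log2(Pxgy)));  % conditional entropy H(X|Y), about 0.722 bits
I    = Hx - Hxgy                      % about 0.278 bits, matching the double-sum value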
Here is a plot of the binary conditional entropy produced in MATLAB (a sketch of the kind of script that generates it follows below). In the next post we will discuss some leftover theorems of source coding, such as Kraft's inequality, and will start some real coding work!
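Since the plot image itself is not reproduced here, the sketch below shows a minimal MATLAB script for this kind of figure; it plots the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p), which is our assumption about what the original figure showed, not the original code.

% Plot the binary entropy function over 0 < p < 1
% (endpoints excluded to avoid the 0*log2(0) indeterminacy).
p = 0.001:0.001:0.999;
H = -p .* log2(p) - (1 - p) .* log2(1 - p);
plot(p, H), grid on
xlabel('p'), ylabel('H(p) [bits]')
title('Binary entropy function')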