In this post, we will discuss probability functions, look at their important characteristics, and list the equivalent MATLAB functions.
Random Variables:
Given an experiment with sample space S and elements s ∈ S, we define a function X(s) whose domain is S and whose range is a set of numbers on the real line. The function X(s) is called a random variable.
Probability Distribution Function / Cumulative Distribution Function (CDF)
Given a random variable X, let us consider the event {X ≤ x}, where x is any real number in the interval (−∞, ∞). We write the probability of this event as P(X ≤ x) and denote it simply by F(x), i.e.
F(x) = P(X ≤ x),   −∞ < x < ∞
The function F(x) is called the Probability Distribution Function, also known as the Cumulative Distribution Function (CDF).
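As a quick illustration (a minimal sketch, assuming the Statistics and Machine Learning Toolbox is installed), normcdf evaluates the theoretical CDF of a normal random variable, and ecdf estimates an empirical CDF from samples:

x = -4:0.1:4;
F = normcdf(x, 0, 1);        % F(x) = P(X <= x) for X ~ N(0,1)

samples = randn(1, 1000);    % 1000 samples of a standard normal
[Fhat, xhat] = ecdf(samples);

plot(x, F, xhat, Fhat);      % theoretical vs. empirical CDF
legend('normcdf', 'ecdf');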
Probability Density Function (PDF)
The derivative of the CDF is known as the probability density function (PDF) and is denoted p(x). Mathematically:
p(x) = dF(x)/dx
Equivalently,
F(x) = ∫_{−∞}^{x} p(u) du
This is for continuous random variables; for a discrete random variable the PDF is expressed as a sum of weighted impulses:
p(x) = Σi P(X = xi) δ(x − xi)
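Continuing the sketch above, normpdf gives the normal density directly, and the derivative relationship between the PDF and the CDF can be checked numerically with gradient (again assuming the Statistics Toolbox for normcdf/normpdf):

x = -4:0.01:4;
F = normcdf(x, 0, 1);
p = normpdf(x, 0, 1);        % p(x) for X ~ N(0,1)
p_num = gradient(F, 0.01);   % numerical derivative dF/dx
max(abs(p - p_num))          % should be close to zero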
Statistical Averages of Random Variables:
The first and second moments of a single random variable, and joint moments such as the correlation and covariance between any pair of random variables in a multidimensional set, are of particular interest.
For all the cases below, X is a single random variable with PDF p(x).
Mean or Expected value of X:
This is the first moment of the random variable X; E(·) denotes expectation (statistical averaging):
mx = E(X) = ∫_{−∞}^{∞} x p(x) dx
The n-th moment is defined as
E(X^n) = ∫_{−∞}^{∞} x^n p(x) dx
If mx is the mean value of the random variable X, the n-th central moment is defined as
E((X − mx)^n) = ∫_{−∞}^{∞} (x − mx)^n p(x) dx
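These averages are straightforward to estimate from samples in MATLAB: mean gives the first moment, mean(x.^n) the n-th moment, and moment(x, n) (Statistics Toolbox) the n-th central moment. A minimal sketch:

x = randn(1, 1e5) + 2;   % samples of X with mean 2

m1 = mean(x);            % first moment E(X), approx. 2
m3 = mean(x.^3);         % third moment E(X^3)
c3 = moment(x, 3);       % third central moment E((X - mx)^3), approx. 0 here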
Variance:
When n = 2, the central moment is called the variance of the random variable and is denoted σx²:
σx² = E((X − mx)²) = E(X²) − mx²
The variance provides a measure of the dispersion of the random variable X.
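In MATLAB the sample variance and standard deviation are var and std, and the identity σx² = E(X²) − mx² can be checked numerically:

x = 3 * randn(1, 1e5);               % X ~ N(0, 9)

v = var(x);                          % approx. 9
s = std(x);                          % approx. 3
v_check = mean(x.^2) - mean(x)^2;    % E(X^2) - mx^2, close to var(x)
% (var normalizes by N-1, v_check by N; negligible for large N)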
Joint moment:
In the case of two random variables X1 and X2 with joint PDF p(x1, x2), we define the joint moment as
E(X1^k X2^n) = ∫∫ x1^k x2^n p(x1, x2) dx1 dx2
and the joint central moment as
E((X1 − m1)^k (X2 − m2)^n) = ∫∫ (x1 − m1)^k (x2 − m2)^n p(x1, x2) dx1 dx2
where mi = E(Xi).
When k = n = 1, we get the correlation and covariance of the random variables X1 and X2, i.e. the correlation between Xi and Xj is given by the joint moment
E(XiXj) = ∫∫ xi xj p(xi, xj) dxi dxj
and the covariance of Xi and Xj is
µij = E((Xi − mi)(Xj − mj)) = E(XiXj) − mi mj
The n x n matrix with the elements µij is called the covariance matrix of the random variables Xi, i = 1, 2, …, n.
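With samples arranged as columns of a matrix (one column per random variable), MATLAB's cov returns the covariance matrix with elements µij, and corrcoef the corresponding correlation coefficients. A small sketch with two correlated variables:

x1 = randn(1e5, 1);
x2 = 0.5 * x1 + randn(1e5, 1);   % X2 correlated with X1

X = [x1 x2];                     % one column per random variable
C = cov(X);                      % 2 x 2 covariance matrix, elements mu_ij
R = corrcoef(X);                 % normalized correlation coefficients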
Two random variables are said to be uncorrelated if
E(XiXj) = E(Xi)E(Xj) = mi mj
which means µij = 0.
If Xi and Xj are statistically independent, then they are uncorrelated, but the converse is not true.
Two random variables are said to be orthogonal if
E(XiXj) = 0
This condition holds when Xi and Xj are uncorrelated and either one or both of the random variables have zero mean.
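These conditions are easy to verify numerically. For independent zero-mean samples, E(XiXj) ≈ E(Xi)E(Xj) ≈ 0, so the variables are both uncorrelated and orthogonal:

xi = randn(1, 1e5);              % zero-mean, independent of xj
xj = randn(1, 1e5);

e_prod = mean(xi .* xj);         % E(XiXj), close to 0
e_sep  = mean(xi) * mean(xj);    % E(Xi)E(Xj), close to 0
% e_prod ~ e_sep  => uncorrelated;  e_prod ~ 0  => orthogonal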
The equivalent MATLAB functions so far:
- CDF: normcdf (theoretical, normal distribution), ecdf (empirical, from samples)
- PDF: normpdf
- Mean and n-th moments: mean, moment
- Variance and standard deviation: var, std
- Covariance matrix and correlation: cov, corrcoef