Random Variables

Boltzmann on 9th Aug, 2009

Random Variables (r.v.)

Let $X$ be a map from $\Omega$ (‘the sample space’) to the real line.

(1)
\begin{eqnarray} X: \Omega &\rightarrow& \mathbb{R} \\ \zeta &\mapsto& X(\zeta) \end{eqnarray}

We say that $X$ is a random variable (r.v.) when the sample point $\zeta$ is picked at random.

Ex: Let $\Omega$ be the set of all students in the class. Then $X(\zeta) =$ the height of student $\zeta$ is a r.v.

Functions of r.v.'s are also r.v.'s.

Notation

$[X = x]$ refers to the set of all $\zeta$ such that $X(\zeta) = x, \ x \in \mathbb{R}$

Probability mass function :

Consider a discrete sample space $\Omega$ and let $X$ be a r.v. on $\Omega$.

Then the function

(2)
\begin{eqnarray} P[X = x_i] = p_i, \ i = 1,2,3 \ldots \end{eqnarray}

is called the probability mass function of the r.v. Each $p_i \geq 0$ and $\sum_i p_i = 1$

If two r.v.'s $X$ and $Y$ are defined on the same sample space $\Omega$, then their "joint" probability distribution is

(3)
\begin{eqnarray} P[X = x_j ; Y = y_k] = p_{j,k} \end{eqnarray}

with $p_{j,k} \geq 0$ and $\sum_j \sum_k p_{j,k} = 1$. The partial sum $\sum_k p_{j,k}$ is called the marginal distribution of $X$.
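For concreteness, here is a minimal Python sketch (the joint table $p_{j,k}$ below is made up purely for illustration) checking the normalization and computing both marginals:

```python
import numpy as np

# A made-up joint pmf p_{j,k} for X (rows) and Y (columns).
p = np.array([[0.10, 0.05, 0.05],
              [0.20, 0.25, 0.05],
              [0.10, 0.10, 0.10]])

assert np.all(p >= 0) and np.isclose(p.sum(), 1.0)  # a valid joint pmf

p_X = p.sum(axis=1)  # marginal of X: sum over k
p_Y = p.sum(axis=0)  # marginal of Y: sum over j
print(p_X, p_Y)      # each marginal again sums to 1
```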

Cumulative Distribution Function (cdf)

If $X$ is a r.v. defined on the sample space $\Omega$, then

(4)
\begin{eqnarray} F_X(x) = P[X \leq x] \end{eqnarray}

is called the cumulative distribution function of $X$.

Properties of the cdf $F(x)$

  1. $F(x)$ is a non-decreasing function.
  2. It is right continuous i.e., $\lim_{h \rightarrow 0^+} F(x+h) = F(x)$
  3. $F(-\infty) = 0$.
  4. $F(\infty) = 1$.
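These properties are easy to check numerically; here is a small Python sketch using the cdf of a fair die (an illustrative discrete choice):

```python
import numpy as np

# cdf of a fair six-sided die: F(x) = P[X <= x], a right-continuous step function.
faces = np.arange(1, 7)
pmf = np.full(6, 1/6)

def F(x):
    return pmf[faces <= x].sum()

xs = np.linspace(-1, 8, 1000)
Fs = np.array([F(x) for x in xs])
assert np.all(np.diff(Fs) >= -1e-12)           # property 1: non-decreasing
assert F(-10) == 0 and np.isclose(F(10), 1.0)  # properties 3 and 4 in the limits
print(F(3), F(3.5))                            # both 0.5: F is flat between atoms
```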

Probability Density Function (pdf)

For a continuous r.v.,

(5)
\begin{align} F'(x) \doteq \frac{dF(x)}{dx} = f(x) \end{align}

is called the probability density function (pdf).

(6)
\begin{align} f(x) = \lim_{\Delta x \rightarrow 0} \frac{P[x < X \leq x+\Delta x]}{\Delta x} \end{align}

Also,

(7)
\begin{align} P[x_1 \leq X \leq x_2] = F(x_2) - F(x_1) = \int_{x_1}^{x_2} f(x)dx \end{align}

So $P[X=x] = 0$ for any single point $x$. One also has

(8)
\begin{align} F_X(x) = \int_{-\infty}^{x} f(u) du \end{align}
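As an illustration (assuming an exponential density with rate $\lambda = 2$, chosen arbitrarily), a crude Riemann sum reproduces the cdf from the pdf:

```python
import numpy as np

lam = 2.0                     # illustrative rate
u = np.linspace(0, 10, 200001)
du = u[1] - u[0]
f = lam * np.exp(-lam * u)    # pdf f(u) = lam * exp(-lam * u), u >= 0

x = 1.0
F_num = f[u <= x].sum() * du  # F(x) as a Riemann sum of the pdf
F_exact = 1 - np.exp(-lam * x)  # known closed form of the exponential cdf
print(F_num, F_exact)         # agree to ~1e-4
```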

Mean value

For a discrete r.v. $X$ on a sample space $\Omega$, the mean value is defined as follows:

(9)
\begin{eqnarray} \langle X \rangle = \sum_{\zeta \in \Omega} X(\zeta) P(\{ \zeta \}) \end{eqnarray}

provided

(10)
\begin{eqnarray} \sum_{\zeta \in \Omega} |X(\zeta)| P(\{ \zeta\}) < \infty \end{eqnarray}

If we view $P(\{ \zeta\})$ as a weight attached to $\zeta$, then $\langle X\rangle$ is the weighted average of the function $X$.
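In code, the weighted average is a one-liner; the sketch below uses a fair die, where the r.v. is simply the face shown:

```python
# Mean of a fair die as a weighted average over sample points.
omega = [1, 2, 3, 4, 5, 6]              # sample space: the face shown
P = {z: 1/6 for z in omega}             # P({zeta}) for each sample point
X = lambda z: z                         # the r.v. is the identity map here
print(sum(X(z) * P[z] for z in omega))  # 3.5
```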

For a continuous r.v. the definition is

(11)
\begin{eqnarray} \langle X \rangle = \int_{-\infty}^{\infty} x f(x) dx \end{eqnarray}

provided

(12)
\begin{eqnarray} \int_{-\infty}^{\infty} |x| f(x) dx < \infty \end{eqnarray}

$\langle X \rangle$ is also known as the mean value or the ensemble average.

If $\langle X \rangle = 0$, then $X$ is said to be a centered random variable.

In general, the expectation of $g(X)$ is,

(13)
\begin{align} \int_{-\infty}^{\infty} g(x) f(x) dx \end{align}

provided the integral converges absolutely.
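For instance (taking $g(x) = x^2$ and a standard normal density, both chosen purely for illustration), the integral can be approximated numerically:

```python
import numpy as np

x = np.linspace(-10, 10, 400001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density
g = x**2                                    # g(X) = X^2
print((g * f).sum() * dx)                   # ~1.0, the second moment of N(0, 1)
```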

Variance

The variance $\sigma^2$ is defined as follows:

(14)
\begin{align} \sigma^2 = \langle (X - \langle X\rangle)^2 \rangle = \langle X^2 \rangle - \langle X \rangle ^2 \end{align}

For a centered r.v.,

(15)
\begin{eqnarray} \sigma^2 = \langle X^2 \rangle \end{eqnarray}
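A quick numerical check that the two forms of $\sigma^2$ agree (the pmf values below are made up):

```python
import numpy as np

x = np.array([0., 1., 2., 3.])         # values of a discrete r.v.
p = np.array([0.1, 0.4, 0.3, 0.2])     # made-up pmf
mean = (x * p).sum()
var_def = (((x - mean)**2) * p).sum()  # <(X - <X>)^2>
var_alt = (x**2 * p).sum() - mean**2   # <X^2> - <X>^2
print(var_def, var_alt)                # identical up to rounding
```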

Probability Generating Functions (pgf)

Let $X$ be a r.v. taking only non-negative integer values, with the probability distribution given by

(16)
\begin{eqnarray} P[X=j] = a_j , \ j=0,1,2, \ldots \end{eqnarray}

Consider the power series

(17)
\begin{eqnarray} g(z)= a_0+a_1z+a_2z^2+\ldots = \sum_{j=0}^{\infty}a_jz^j \end{eqnarray}

For $|z| \leq 1$ the series is dominated by $\sum_j a_j = 1$ and hence converges absolutely.

Consider,

(18)
\begin{eqnarray} g'(z) = a_1+2a_2z+\ldots=\sum_{n=1}^{\infty}na_nz^{n-1} \end{eqnarray}
(19)
\begin{eqnarray} g^{(j)}(z) = \sum_{n=j}^{\infty}n(n-1)\ldots (n-j+1)a_nz^{n-j} \end{eqnarray}

Putting $z=0$ in (19), we get the probabilities

(20)
\begin{eqnarray} a_j = \frac{1}{j!}g^{(j)}(0) \end{eqnarray}

Putting $z=1$,

(21)
\begin{eqnarray} g'(1)= \sum_{n=0}^{\infty}na_n =\langle X \rangle \end{eqnarray}
(22)
\begin{eqnarray} g''(1) = \sum_{n}n(n-1)a_n = \langle X(X-1) \rangle = \langle X^2 \rangle - \langle X \rangle \end{eqnarray}

etc.,
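As a concrete example, take a Poisson r.v. with parameter $\mu$ (here $\mu = 1.5$, chosen arbitrarily), whose pgf is $g(z) = e^{\mu(z-1)}$; the sketch below recovers the probabilities from derivatives at $z=0$ and the mean from $g'(1)$:

```python
import math

mu = 1.5
g = lambda z: math.exp(mu * (z - 1))  # pgf of Poisson(mu)

# g^(j)(z) = mu**j * g(z), so a_j = g^(j)(0) / j! = exp(-mu) * mu**j / j!.
for j in range(4):
    print(j, mu**j * g(0) / math.factorial(j))  # the Poisson pmf

# g'(z) = mu * g(z), so g'(1) = mu = <X>.
print(mu * g(1))  # 1.5
```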

Theorem 1:
The probability distribution of a non-negative integer valued random variable is uniquely determined by its generating function.

Theorem 2:
If the random variables $X_1, X_2, \ldots, X_n$ are independent and have $g_1, g_2, \ldots, g_n$ as their generating functions, then the generating function of the sum $X_1+X_2+\ldots+X_n$ is the product $g_1 g_2 \ldots g_n$.
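Theorem 2 can be seen concretely with two independent fair dice: multiplying their pgfs as polynomials (a convolution of the coefficient arrays) gives the pgf, i.e. the pmf, of the sum.

```python
import numpy as np

g1 = np.array([0] + [1/6] * 6)  # pgf coefficients a_0..a_6 of one fair die (a_0 = 0)
g_sum = np.convolve(g1, g1)     # product of polynomials = convolution of coefficients
print(g_sum[7], 6/36)           # P[X1 + X2 = 7]: both 0.1666...
```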

Let us define,

(23)
\begin{eqnarray} g(z) = \langle z^X \rangle \end{eqnarray}

where $X$ is a r.v. on the sample space $\Omega$. This agrees with the series above, since $\langle z^X \rangle = \sum_j a_j z^j$.

Moment generating functions (MGF)

The moment generating function of a r.v. $X$ is defined as

(24)
\begin{eqnarray} M_X(t) = \langle e^{tX}\rangle \end{eqnarray}

If such an $M_X(t)$ exists, then, differentiating under the expectation and setting $t=0$,

(25)
\begin{eqnarray} \frac{dM_X(t)}{dt}\Big|_{t=0} = \langle Xe^{tX} \rangle \Big|_{t=0} = \langle X \rangle \end{eqnarray}
(26)
\begin{eqnarray} \frac{d^2M_X(t)}{dt^2}\Big|_{t=0} = \langle X^2e^{tX} \rangle \Big|_{t=0} = \langle X^2 \rangle \end{eqnarray}
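A numerical sanity check, assuming an exponential r.v. with rate $\lambda = 2$ (so $M_X(t) = \lambda/(\lambda - t)$ for $t < \lambda$): central finite differences at $t = 0$ recover the first two moments.

```python
lam = 2.0
M = lambda t: lam / (lam - t)  # MGF of an exponential r.v., valid for t < lam

h = 1e-5                       # central finite differences at t = 0
mean = (M(h) - M(-h)) / (2 * h)          # M'(0)  = <X>   = 1/lam
second = (M(h) - 2*M(0) + M(-h)) / h**2  # M''(0) = <X^2> = 2/lam**2
print(mean, second)            # ~0.5 and ~0.5
```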

Laplace Transform

Let $0 < z \leq 1$ and write

(27)
\begin{eqnarray} z=e^{-\lambda} , \ 0 \leq \lambda < \infty \end{eqnarray}

For a discrete r.v. $X$,

(28)
\begin{eqnarray} \langle e^{-\lambda X} \rangle = \sum_{j}p_je^{-\lambda x_j} \end{eqnarray}

provided the series converges absolutely.

For a continuous non-negative r.v. $X$ with density function $f_X$,

(29)
\begin{eqnarray} \langle e^{-\lambda X}\rangle = \int_{0}^{\infty} e^{-\lambda u} f_X(u) du \end{eqnarray}

Since $X$ is non-negative, $e^{-\lambda u} \leq 1$ on the range of integration, so the integral always converges; this is why the Laplace transform is the natural tool for non-negative r.v.'s.
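For example (an exponential r.v. with rate $\mu = 1$, picked for illustration), the transform has the closed form $\mu/(\mu + \lambda)$, which a Riemann sum reproduces:

```python
import numpy as np

mu, lam = 1.0, 0.7                        # illustrative rate and transform variable
u = np.linspace(0, 50, 500001)
du = u[1] - u[0]
f = mu * np.exp(-mu * u)                  # exponential density
print((np.exp(-lam * u) * f).sum() * du)  # ~0.588
print(mu / (mu + lam))                    # closed form: 0.588...
```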

E.g., the gamma variable:

(30)
\begin{eqnarray} f(x) = \frac{\lambda}{\Gamma(r)}(\lambda x)^{r-1} e^{-\lambda x}, \ x >0 \end{eqnarray}
(31)
\begin{eqnarray} \langle e^{tX} \rangle = \Big(1-\frac{t}{\lambda}\Big)^{-r}, \ t < \lambda \end{eqnarray}
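A numerical check of (31) for illustrative parameter values ($\lambda = 2$, $r = 3$, $t = 0.5 < \lambda$):

```python
import numpy as np
from math import gamma as Gamma

lam, r, t = 2.0, 3.0, 0.5
x = np.linspace(1e-8, 60, 600001)
dx = x[1] - x[0]
f = lam / Gamma(r) * (lam * x)**(r - 1) * np.exp(-lam * x)  # gamma density (30)
print((np.exp(t * x) * f).sum() * dx)  # ~2.370
print((1 - t / lam)**(-r))             # (3/4)^(-3) = 64/27 = 2.370...
```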

Characteristic Function

With $z=e^{i\theta}$, $\theta \in \mathbb{R}$, the characteristic function of the r.v. $X$ with pdf $f_X$ is given by

(32)
\begin{eqnarray} \langle e^{i\theta X}\rangle = \int_{-\infty}^{\infty}e^{i\theta u}f_X(u) du \end{eqnarray}

which is nothing but the Fourier transform of $f_X$.
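For instance, with a standard normal density (chosen for illustration) the characteristic function is known to be $e^{-\theta^2/2}$, and a direct numerical integration agrees:

```python
import numpy as np

theta = 1.3
u = np.linspace(-10, 10, 200001)
du = u[1] - u[0]
f = np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)     # standard normal pdf
phi = (np.exp(1j * theta * u) * f).sum() * du  # <exp(i * theta * X)>
print(phi.real, np.exp(-theta**2 / 2))         # both ~0.4296
```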
