The first few minutes were a recap of last class; nothing new.


# Definition of the Gamma distribution

The r.v. $X$ has a Gamma distribution with parameters $\alpha$ and $\beta$ if its p.d.f. is

$f_X(x) = \frac {1} {\Gamma(\alpha)} \frac {x^{\alpha -1}} {\beta^\alpha} e^{- \frac x \beta} I(x) \quad \forall x \in (0,\infty)$

# Notes

1. A Gamma pdf has its “support” on the positive reals. We say $X$ is a non-negative r.v. in this case.
2. We write $X \sim \text{Gamma}(\alpha, \beta)$ (notation varies across texts; some write $G(\alpha, \beta)$ — check your course’s convention).
3. A Gamma pdf can take on different shapes, depending on the parameters $\alpha$ and $\beta$ ($\alpha$ is called the shape parameter and $\beta$ the scale parameter).

We say that the Gamma distribution gives a “flexible” model because of the different shapes that can arise.

4. The cdf is not expressible in closed form; it is

$F_X(x) = \int\limits_{0}^{x} \frac {1} {\Gamma(\alpha)} \frac {1} {\beta^\alpha} y^{\alpha -1} e^{ - \frac y \beta } \, dy$

This means that Gamma probabilities (apart from special cases) have to be computed by numerical integration.
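As a quick numerical check (a Python sketch, not from the lecture), we can integrate the Gamma pdf directly with the trapezoid rule and compare against a case where the cdf *does* have a closed form ($\alpha = 2$, $\beta = 1$ gives $F_X(x) = 1 - e^{-x}(1+x)$, by integration by parts):

```python
import math

def gamma_pdf(x, alpha, beta):
    # Gamma density from the definition above
    # (assumes alpha >= 1 so the pdf is finite at 0)
    return x ** (alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta ** alpha)

def gamma_cdf(x, alpha, beta, n=10_000):
    # Trapezoid-rule approximation of the integral of the pdf over [0, x]
    h = x / n
    total = 0.5 * (gamma_pdf(0.0, alpha, beta) + gamma_pdf(x, alpha, beta))
    for i in range(1, n):
        total += gamma_pdf(i * h, alpha, beta)
    return total * h
```

For example, `gamma_cdf(3.0, 2.0, 1.0)` agrees with the closed form $1 - 4e^{-3} \approx 0.8009$ to several decimal places.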

There are two special cases of the Gamma distribution that are important. The first is the exponential distribution.

## Definition

If, in the Gamma distribution, we set $\alpha = 1$, then the pdf becomes

$f_X(x) = \frac {1} {\beta} e^{-x/\beta} I(x) \quad \forall x \in (0,\infty)$

We say that $X$ has an exponential distribution with parameter $\beta$, written $X \sim \text{exp}(\beta)$.

Sometimes the exponential pdf is written in the form, $f_X(x) = \lambda e ^{-\lambda x} I(x) \quad \forall x \in (0,\infty)$ where $\lambda = \frac {1} {\beta}$
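A quick sanity check (Python sketch, not from the lecture) that the scale and rate parameterizations describe the same density when $\lambda = 1/\beta$:

```python
import math

def exp_pdf_scale(x, beta):
    # scale form: f(x) = (1/beta) * exp(-x/beta)
    return math.exp(-x / beta) / beta

def exp_pdf_rate(x, lam):
    # rate form: f(x) = lambda * exp(-lambda * x)
    return lam * math.exp(-lam * x)
```

With $\beta = 2$ and $\lambda = 1/2$ the two functions agree at every $x$.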

Interesting property of the exponential distribution:

# The Memoryless Property

If $X \sim \text{exp}(\beta)$ then $P(x \leq X < x+h \mid X \geq x) = P(0 \leq X < h)$

Note that the memoryless property does not assert that $P(x \leq X < x+h) = P(0\leq X <h)$; the conditioning on $X \geq x$ is essential.

# Proof

First, the cdf of an $\text{exp}(\beta)$ r.v. is

$F_X(x) = \left \{ \begin{matrix} 0 & \quad \text{for } x \leq 0 \\ \int\limits_0^x \frac 1 \beta e^{-y/\beta} \, dy = 1-e^{-x/\beta} & \quad \text{for } x >0 \end{matrix} \right.$

So

$\begin{aligned} P(X>x) &= e^{-x/\beta} & &\text{for } x>0 \\ &= 1 & &\text{for } x\leq 0 \end{aligned}$

$\begin{aligned} P(x \leq X < x+h \mid X \geq x) &= \frac {F_X(x+h) - F_X(x) } {1-F_X(x)} \\ &= \frac {1-e^{-(x+h)/\beta} - [1-e^{-x/\beta}]} {e^{-x/\beta}} \\ &= \frac {e^{-x/\beta} - e^{-(x+h)/\beta}} {e^{-x/\beta}} \\ &= 1 - e^{-h/\beta} \\ &= P (0 \leq X < h) \end{aligned}$
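The derivation can be checked numerically (Python sketch, not from the lecture): compute the conditional probability from the cdf and compare it to $P(0 \leq X < h) = F_X(h)$.

```python
import math

def exp_cdf(x, beta):
    # cdf of exp(beta), as derived above
    return 1.0 - math.exp(-x / beta) if x > 0 else 0.0

def cond_prob(x, h, beta):
    # P(x <= X < x+h | X >= x) = [F(x+h) - F(x)] / [1 - F(x)]
    return (exp_cdf(x + h, beta) - exp_cdf(x, beta)) / (1.0 - exp_cdf(x, beta))
```

For example, with $\beta = 2$, $x = 5$, $h = 1$, `cond_prob(5.0, 1.0, 2.0)` equals `exp_cdf(1.0, 2.0)` $= 1 - e^{-1/2}$, regardless of the choice of $x$.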

It turns out that the exponential distribution is the unique continuous distribution with the memoryless property.

1. The exponential distribution is useful for modeling the times between events if the events occur according to what is known as a stationary Poisson process.

Essentially: think of alpha particles arriving at a counter.

Suppose that the number of particles that arrive in $[0,t]$ has a Poisson distribution for all $t$; then the interarrival times of the particles must have an exponential distribution.
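This connection can be illustrated by simulation (a Python sketch, not from the lecture): generate exponential interarrival times with rate $\lambda$, count arrivals in $[0,1]$, and check that the empirical mean count is close to $\lambda$, as a Poisson($\lambda$) count requires.

```python
import random

random.seed(42)

def arrivals_in_unit_interval(rate):
    # Count arrivals in [0, 1] when interarrival times are
    # exponential with rate `rate` (i.e. mean 1/rate)
    count, clock = 0, 0.0
    while True:
        clock += random.expovariate(rate)
        if clock > 1.0:
            return count
        count += 1

counts = [arrivals_in_unit_interval(4.0) for _ in range(20_000)]
mean = sum(counts) / len(counts)
```

With rate 4, the mean count comes out near 4, matching the Poisson mean $\lambda t = 4 \cdot 1$.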

A second important special case of the Gamma distribution is when we set $\alpha = \nu/2$ and $\beta = 2$, i.e. the pdf becomes

$f_X(x) = \frac 1 {\Gamma (\nu/2)} \frac {x^{\nu/2 - 1}} {2^{\nu/2}} e^{-x/2} I(x) \quad \forall x \in (0,\infty)$

We say that $X$ has a chi-square distribution with $\nu$ (the Greek letter “nu”) degrees of freedom, $X \sim \chi^2_\nu$.
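A numerical check (Python sketch, not from the lecture) that the chi-square density really is the Gamma density with $\alpha = \nu/2$ and $\beta = 2$:

```python
import math

def gamma_pdf(x, alpha, beta):
    # Gamma density from the definition at the top of these notes
    return x ** (alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta ** alpha)

def chi2_pdf(x, nu):
    # chi-square density with nu degrees of freedom
    return x ** (nu / 2 - 1) * math.exp(-x / 2) / (math.gamma(nu / 2) * 2 ** (nu / 2))
```

For instance, `chi2_pdf(3.0, 5)` equals `gamma_pdf(3.0, 2.5, 2.0)`.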

# The normal distribution

This is the most important distribution in probability and statistics.

## Definition

A r.v. $X$ has a normal distribution with parameters $\mu$ and $\sigma^2$ if its pdf is

$f_X(x) = \frac 1 {\sqrt {2 \pi}} \frac 1 \sigma e^{-\frac 1 2 (\frac {x-\mu} \sigma)^2} \quad \forall x \in (-\infty, \infty)$

## Note

1. The parameters $\mu$ and $\sigma^2$ turn out to correspond to “mean” and “variance” respectively (to be seen).

The pdf has the familiar bell shape.
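Two quick numerical checks (Python sketch, not from the lecture): the normal pdf integrates to 1, and it is symmetric about $\mu$, which is what produces the bell shape.

```python
import math

def normal_pdf(x, mu, sigma):
    # normal density with mean mu and standard deviation sigma
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def total_mass(mu, sigma, n=100_000):
    # Trapezoid rule over [mu - 10*sigma, mu + 10*sigma]; the mass
    # beyond 10 standard deviations is negligible
    lo, hi = mu - 10 * sigma, mu + 10 * sigma
    h = (hi - lo) / n
    s = 0.5 * (normal_pdf(lo, mu, sigma) + normal_pdf(hi, mu, sigma))
    for i in range(1, n):
        s += normal_pdf(lo + i * h, mu, sigma)
    return s * h
```

`total_mass(1.0, 2.0)` comes out essentially equal to 1, and `normal_pdf(mu + c, mu, sigma)` equals `normal_pdf(mu - c, mu, sigma)` for any $c$.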