STAT 400

Wed. March 11th, 2020


Exponential Distribution and Poisson Process cont.

Ex. Suppose $0=T_0\le T_1\le T_2\le T_3\le\dots$, where $T_i$ is the arrival time of the $i$th bus ($i=1,2,3,\dots$), and $T_{i+1}-T_i$ follows $\text{exp}(\alpha)$ for all $i$.

  1. What is the probability that the first bus arrives after a time tt?
    $P(T_1> t)=1-P(T_1\le t)=1-(1-e^{-\alpha t})=e^{-\alpha t}.$
  2. If you arrive at the bus stop at time 0, what is the expected waiting time to get on a bus?
    $E(T_1)=\frac{1}{\alpha}.$
  3. Suppose you’ve already waited for $t_0$ minutes. What is the probability that you’ll need to wait at least another $t$ minutes?
    $$\begin{aligned} P(T_1\ge t+t_0\mid T_1\ge t_0)&=\frac{P(\{T_1\ge t+t_0\}\cap \{T_1\ge t_0\})}{P(T_1\ge t_0)}\\ &=\frac{P(T_1\ge t+t_0)}{P(T_1\ge t_0)}\\ &=\frac{e^{-\alpha(t+t_0)}}{e^{-\alpha t_0}}\\ &=e^{-\alpha t}=P(T_1\ge t). \end{aligned}$$ We’re able to drop the intersection here because the event $\{T_1\ge t+t_0\}$ is a subset of $\{T_1\ge t_0\}$ – the former always implies the latter (but not the other way around).

Note that the value of $t_0$ is irrelevant to our final result. This is called the memoryless property of $\text{exp}(\alpha)$ – what has happened in the past will not affect the future results for this distribution. In other words, $P(X\ge t+t_0 \mid X\ge t_0)=P(X\ge t)$ for any $t_0>0$ when $X$ follows $\text{exp}(\alpha)$.
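As a quick sanity check (my addition, not from the lecture), here is a short NumPy simulation of the memoryless property; the values of `alpha`, `t`, and `t0` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.5                 # rate of exp(alpha); mean wait is 1/alpha = 2
t, t0 = 1.0, 3.0            # arbitrary test values

# Simulate many first-arrival times T1 ~ exp(alpha).
T1 = rng.exponential(scale=1 / alpha, size=1_000_000)

# Unconditional tail probability P(T1 >= t).
p_uncond = np.mean(T1 >= t)

# Conditional tail P(T1 >= t + t0 | T1 >= t0): restrict to runs where we waited t0.
waited = T1[T1 >= t0]
p_cond = np.mean(waited >= t + t0)

print(p_uncond, p_cond, np.exp(-alpha * t))  # all three should be close
print(T1.mean(), 1 / alpha)                  # sample mean vs E(T1) = 1/alpha
```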


Gamma Distribution

Consider the same scenario as the last section. Suppose you arrive at the bus stop at time $t_0$. What’s the probability that you miss the $n$th bus?
In other words, what is $P(T_n\le t_0)$? To answer this, we need to introduce a new distribution.

Gamma Function

Definition. The Gamma function $\Gamma(\alpha)$ is defined for $\alpha>0$ as $$\Gamma(\alpha)=\int_0^{\infty} x^{\alpha - 1}e^{-x}\,dx.$$
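To make the definition concrete, here is a small sketch (my addition) that numerically integrates the definition and compares it with Python’s built-in `math.gamma`; the test value `alpha = 2.5` is arbitrary.

```python
import math
from scipy.integrate import quad

alpha = 2.5  # arbitrary test value

# Integrate x^(alpha-1) * e^(-x) from 0 to infinity per the definition.
value, _err = quad(lambda x: x**(alpha - 1) * math.exp(-x), 0, math.inf)
print(value, math.gamma(alpha))  # should agree to numerical precision
```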

Properties of the Gamma Function

  1. When $\alpha=n$ and $n$ is a positive integer, then $\Gamma(\alpha)=(n-1)!$.
  2. $\Gamma(\alpha)=(\alpha-1)\cdot \Gamma(\alpha-1)$ for all $\alpha>1$.
  3. $\Gamma(\frac{1}{2})=\sqrt{\pi}.$
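A minimal numerical check of all three properties, using only the standard library (my addition, not from the lecture):

```python
import math

# Property 1: Gamma(n) = (n-1)! for positive integers n.
for n in range(1, 6):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

# Property 2: Gamma(alpha) = (alpha-1) * Gamma(alpha-1) for alpha > 1.
for a in (1.5, 2.7, 4.2):
    assert math.isclose(math.gamma(a), (a - 1) * math.gamma(a - 1))

# Property 3: Gamma(1/2) = sqrt(pi).
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))
print("all three properties hold numerically")
```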

Gamma Distribution Definition

Definition. We say a continuous RV $X$ follows the Gamma distribution with parameters $\alpha>0,\ \beta >0$ if its pdf is $$f(x;\alpha,\beta)=\begin{cases} \frac{1}{\beta^\alpha \Gamma(\alpha)}x^{\alpha-1}e^{-x/\beta} & x\ge 0\\ 0 & \text{otherwise} \end{cases}$$ In this case, we say $X$ follows $\text{Gamma}(\alpha,\beta)$.

In the special case when $\beta=1$, we call $\text{Gamma}(\alpha,1)$ the standard Gamma distribution.
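As an illustration (my addition), this sketch evaluates the pdf above by hand and compares it against SciPy, whose Gamma distribution uses `a` for the shape $\alpha$ and `scale` for $\beta$; `alpha`, `beta`, and `x` are arbitrary test values.

```python
import math
from scipy.stats import gamma

alpha, beta = 3.0, 2.0  # arbitrary shape and scale
x = 4.0

# pdf from the definition above.
pdf_manual = x**(alpha - 1) * math.exp(-x / beta) / (beta**alpha * math.gamma(alpha))

# Same density via scipy's parametrization.
pdf_scipy = gamma.pdf(x, a=alpha, scale=beta)
print(pdf_manual, pdf_scipy)  # should match
```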


Nonstandard Gamma Distributions

Proposition. If $X$ follows $\text{Gamma}(\alpha,\beta)$, then $\frac{X}{\beta}$ follows $\text{Gamma}(\alpha,1)$.

Proof.
$$\begin{aligned} P\left(\frac{X}{\beta}\le x\right)&=P(X\le \beta x)\\ &=\int_0^{\beta x}\frac{1}{\beta^\alpha\Gamma(\alpha)}y^{\alpha-1}e^{-y/\beta}\,dy\\ &=\int_0^{x}\frac{1}{\beta^\alpha\Gamma(\alpha)}(\beta z)^{\alpha-1}e^{-z}\,\beta\, dz\\ &=\int_0^{x}\frac{1}{\Gamma(\alpha)}z^{\alpha-1}e^{-z}\, dz\\ &=P(Z\le x), \end{aligned}$$ where the third line substitutes $y=\beta z$ and $Z$ follows $\text{Gamma}(\alpha,1)$. Therefore $\frac{X}{\beta}$ has the same distribution as $Z$.
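A simulation sketch of this proposition (my addition): draw from $\text{Gamma}(\alpha,\beta)$, rescale by $\beta$, and run a Kolmogorov–Smirnov test against $\text{Gamma}(\alpha,1)$. The parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(0)
alpha, beta = 2.0, 3.0  # arbitrary test values

# Draw X ~ Gamma(alpha, beta) (numpy uses shape, scale) and rescale by beta.
X = rng.gamma(shape=alpha, scale=beta, size=100_000)
Z = X / beta

# Compare Z against the standard Gamma(alpha, 1) distribution.
stat, pvalue = kstest(Z, gamma(a=alpha, scale=1).cdf)
print(pvalue)  # large p-value: no evidence Z differs from Gamma(alpha, 1)
```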


After this he calculated the expected value and variance for the Gamma distribution but I wasn’t fast enough to type it; please look it up in the textbook. (The standard results are $E(X)=\alpha\beta$ and $V(X)=\alpha\beta^2$ when $X$ follows $\text{Gamma}(\alpha,\beta)$.)
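Assuming those textbook formulas, here is a quick Monte Carlo check (my addition; parameter values arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.5, 1.5  # arbitrary test values

X = rng.gamma(shape=alpha, scale=beta, size=1_000_000)
print(X.mean(), alpha * beta)     # sample mean vs alpha * beta
print(X.var(), alpha * beta**2)   # sample variance vs alpha * beta^2
```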


CDF

When $Z$ follows $\text{Gamma}(\alpha, 1)$, $$P(Z\le x)=F(x;\alpha)=\int_0^x \frac{y^{\alpha -1}e^{-y}}{\Gamma(\alpha)}\,dy.$$ This integral has no closed form; $F(x;\alpha)$ is called the incomplete Gamma function, and there is a table for it in the textbook.
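In code, `scipy.special.gammainc(a, x)` computes exactly this regularized lower incomplete Gamma function, so no table lookup is needed; a small sketch comparing it with the standard Gamma CDF (my addition, arbitrary test values):

```python
from scipy.special import gammainc
from scipy.stats import gamma

alpha, x = 3.0, 2.5  # arbitrary test values

# gammainc(a, x) = (1/Gamma(a)) * integral_0^x y^(a-1) e^(-y) dy = F(x; alpha).
print(gammainc(alpha, x), gamma(a=alpha, scale=1).cdf(x))  # should match
```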

When $X$ follows $\text{Gamma}(\alpha, \beta)$, $$P(X\le x)=P\left(\frac{X}{\beta}\le \frac{x}{\beta}\right)=F\left(\frac{x}{\beta};\alpha\right),$$ where $F$ is again the incomplete Gamma function.
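A short sketch of this identity (my addition), computing $P(X\le x)$ both ways; the parameter values are arbitrary:

```python
from scipy.special import gammainc
from scipy.stats import gamma

alpha, beta = 2.0, 4.0  # arbitrary test values
x = 5.0

# P(X <= x) for X ~ Gamma(alpha, beta) via F(x/beta; alpha), and directly.
print(gammainc(alpha, x / beta), gamma(a=alpha, scale=beta).cdf(x))  # should match
```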