Martingale Convergence

Last quarter, I did a DRP (directed reading program) on martingales, and at the start of this quarter, I gave a short talk on what I learned. Here are the notes from that talk.

1. Conditional Expectation

Definition 1. Suppose {(\Omega, \mathcal{F}_o, \mathbb{P})} is a probability space, {\mathcal{F}\subset \mathcal{F}_o} is a sub-{\sigma}-algebra, and {X} is an integrable random variable.

  1. The conditional expectation of {X} given {\mathcal{F}}, {E(X\mid \mathcal{F})}, is any {\mathcal{F}}-measurable random variable {Y} such that

    \displaystyle \int_A X\, d\mathbb{P}=\int_A Y\, d\mathbb{P}\text{ for all } A\in \mathcal{F}.

  2. If {Y} is another random variable, then {E(X\mid Y)} is defined as {E(X\mid \mathcal{F})} where {\mathcal{F}} is the {\sigma}-algebra generated by {Y}.

Fact. The conditional expectation exists, is unique up to almost sure equality, and is integrable.

Intuitively, we can think of {E(X\mid \mathcal{F})} as the best guess of the value of {X} given the information available in {\mathcal{F}}.
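
To make this "best guess" idea concrete, here is a minimal Monte Carlo sketch in Python (the setup {X=Y+Z} with {Z} independent mean-zero noise is my own illustrative choice, not from the talk). Conditioning on {Y} amounts to grouping samples by the value of {Y} and averaging {X} within each group; here {E(X\mid Y)=Y}.

    # Estimate E(X | Y = y) empirically: average X over the samples with Y = y.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    y = rng.integers(1, 7, size=n)        # Y: a fair die roll, values 1..6
    x = y + rng.normal(0.0, 1.0, size=n)  # X = Y + mean-zero noise, so E(X | Y) = Y

    for value in range(1, 7):
        estimate = x[y == value].mean()   # average of X on the event {Y = value}
        print(f"E(X | Y = {value}) ~ {estimate:.3f}  (exact: {value})")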

Example 1.

  1. If {X} is {\mathcal{F}}-measurable, then {E(X\mid \mathcal{F})=X}.
  2. If {X} and {Y} are independent, then {E(X\mid Y)=E(X)}.
  3. Let {X_1,X_2,\cdots} be independent random variables with mean {\mu}, and let {S_n=X_1+\cdots+X_n} be the partial sums. Then, if {m>n},

    \displaystyle E(S_m\mid S_n)=E(X_m+\cdots+X_{n+1}+S_n\mid S_n)=E(X_m\mid S_n)+\cdots+E(X_{n+1}\mid S_n)+E(S_n\mid S_n)=\mu(m-n)+S_n.

    The idea is that you know your position at {S_n}, and you take {m-n} steps whose sizes are, on average, {\mu}, so your best guess for your position is {S_n+\mu(m-n)}. (Formally, {E(X_k\mid S_n)=E(X_k)=\mu} for each {k>n}, since {X_k} is independent of {S_n}, as in part 2.)
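
As a rough sanity check on this identity, here is a simulation with i.i.d. Bernoulli steps (so {\mu=1/2}, and {S_n} takes only finitely many values, which makes conditioning on {S_n} a matter of grouping sample paths). The parameters are arbitrary choices for illustration.

    # Check E(S_m | S_n) = mu*(m - n) + S_n by grouping paths on the value of S_n.
    import numpy as np

    rng = np.random.default_rng(1)
    trials, n, m, mu = 100_000, 5, 12, 0.5

    steps = rng.integers(0, 2, size=(trials, m))  # X_i in {0, 1}, mean 1/2
    s_n = steps[:, :n].sum(axis=1)
    s_m = steps.sum(axis=1)

    for s in np.unique(s_n):
        observed = s_m[s_n == s].mean()           # empirical E(S_m | S_n = s)
        print(f"S_n = {s}: E(S_m | S_n) ~ {observed:.3f}, predicted {mu*(m - n) + s:.3f}")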

2. Martingales

A martingale is a model of a fair game.

Definition 2. Consider a filtration (increasing sequence of {\sigma}-algebras) {\mathcal{F}_n} and a sequence of random variables {X_n}, each measurable with respect to {\mathcal{F}_n} and integrable. Then, if {E(X_{n+1}\mid \mathcal{F}_n)=X_n} for all {n}, we say {X_n} is a martingale (with respect to {\mathcal{F}_n}).

Example 2. Let {X_1,\cdots} be independent random variables that take only the values {-1} and {1}, each with probability {1/2}. Then {S_n=X_1+\cdots+ X_n} is a martingale with respect to {\mathcal{F}_n=\sigma(X_1,\cdots,X_n)}, because

\displaystyle E(S_{n+1}\mid \mathcal{F}_n)=E(X_{n+1}+S_n\mid \mathcal{F}_n)=E(X_{n+1}\mid \mathcal{F}_n)+E(S_n\mid \mathcal{F}_n)=E(X_{n+1})+S_n=S_n.

Theorem 3. If {X_n} is a martingale with {\sup E(X_n^+)<\infty}, then {X_n} converges a.s. to a limit {X} with {E(|X|)<\infty}.

I’ll only sketch this proof, because even though the idea is nice, the details are a little annoying. The idea is to set up a way of betting on a martingale, show that you can’t make money off such a betting system, and then use this to draw conclusions about the martingale’s behavior.

Definition 4. Let {\mathcal{F}_n} be a filtration. A sequence {H_m} of random variables is said to be predictable if {H_m} is {\mathcal{F}_{m-1}}-measurable for all {m}.

That is, the value of {H_m} is already known at time {m-1}, before the {m}-th step is revealed. Then, if we “bet” an amount {H_m} on the {m}-th step of the martingale {X_n}, our total winnings after {n} steps will be

\displaystyle (H\cdot X)_n=\sum_{m=1}^n H_m(X_m-X_{m-1}).
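
Here is a small sketch of this transform for the fair random walk of Example 2. The doubling rule for {H_m} below is an arbitrary bounded predictable strategy, chosen just for illustration; since the walk is a martingale, the average winnings should come out to (essentially) zero for any such {H}.

    # Martingale transform (H . X)_n = sum_m H_m (X_m - X_{m-1}) for a fair walk.
    import numpy as np

    rng = np.random.default_rng(2)
    trials, n = 100_000, 50

    steps = rng.choice([-1, 1], size=(trials, n))  # increments X_m - X_{m-1}
    h = np.ones((trials, n))                       # h[:, m-1] holds the bet H_m

    # H_m may depend only on steps 1..m-1 (predictability): here, double the
    # previous bet after a loss (capped at 8), otherwise reset to 1.
    for m in range(2, n + 1):
        lost = steps[:, m - 2] < 0                 # outcome of step m-1
        h[:, m - 1] = np.where(lost, np.minimum(2 * h[:, m - 2], 8), 1.0)

    winnings = (h * steps).sum(axis=1)             # (H . X)_n for each path
    print(f"mean winnings ~ {winnings.mean():.4f}  (theory: exactly 0)")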

If {X_n} is a supermartingale and {H_n} is predictable, nonnegative, and bounded, then {(H\cdot X)_n} is a supermartingale as well.

Now, fix an interval {(a,b)} and choose {H_n} in the following way: if {X_n<a}, then bet 1, and continue to do so until {X_n>b}. Then bet zero until you fall back below {a}, and repeat. Every time you go from below {a} to above {b}, you make a profit of at least {b-a}. Let {U_n} be the number of these upcrossings that occur by time {n}. Using the previous fact, together with the bound on {\sup E(X_n^+)}, one shows that {E(U_n)} is bounded uniformly in {n}, so the total number of upcrossings is finite with probability 1. Since this holds for every pair of rationals {a<b}, we conclude that {\liminf X_n=\limsup X_n} almost surely, i.e., the {X_n} converge.
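
Here is a tiny sketch of that counting scheme, run on a martingale that does converge: partial products of i.i.d. Uniform(0,2) factors, which have mean 1, so the products form a positive martingale (it converges a.s., in fact to 0). The interval and path length are arbitrary; the point is that the count of completed upcrossings stabilizes at a finite value.

    # Count completed upcrossings of (a, b): bet 1 once the path drops below a,
    # keep betting until it exceeds b, then sit out until it drops below a again.
    import numpy as np

    def count_upcrossings(path, a, b):
        betting, ups = False, 0
        for value in path:
            if not betting and value < a:
                betting = True       # dropped below a: start an upcrossing attempt
            elif betting and value > b:
                betting = False      # exited above b: one upcrossing completed
                ups += 1
        return ups

    rng = np.random.default_rng(3)
    x = np.cumprod(rng.uniform(0, 2, size=10_000))  # positive martingale, mean 1
    print("upcrossings of (0.5, 1.5):", count_upcrossings(x, 0.5, 1.5))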

Finally, we give a brief application of the martingale convergence theorem.

Proposition 5. Let {X_n} be a sequence of independent random variables such that {P(X_n=1)=P(X_n=-1)=\frac{1}{2}}. Then {\sum_{j=1}^{\infty} \frac{X_j}{j}} converges with probability 1.

Proof: Let {S_n=\sum_{j=1}^n\frac{X_j}{j}}. It’s easy to check that {S_n} is a martingale, and {E(S_n^+)\le E|S_n|\le \sqrt{E(S_n^2)}=\sqrt{\sum_{j=1}^n \frac{1}{j^2}}\le \frac{\pi}{\sqrt{6}}}, so {\sup E(S_n^+)<\infty} and the {S_n} converge almost surely. Thus the random harmonic series converges almost surely. \Box
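
As a quick numerical illustration (path count, length, and seed are arbitrary), one can watch the partial sums settle down along a few simulated paths: the values at {n=10^5} and {n=10^6} agree to two or three decimal places path by path, while the limits differ from path to path.

    # Simulate a few paths of the random harmonic series and compare late partial sums.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    signs = rng.choice([-1, 1], size=(5, n))              # five independent sign sequences
    partial = np.cumsum(signs / np.arange(1, n + 1), axis=1)

    for path in partial:
        print(f"S_1e5 = {path[99_999]:+.5f}   S_1e6 = {path[-1]:+.5f}")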
