Definition
A Markov chain {Xn} on state space Ω with kernel K is a Harris chain[1] if there exist sets A, B ⊆ Ω, an ϵ > 0, and a probability measure ρ with ρ(B) = 1 such that
1. If τA := inf {n ≥ 0 : Xn ∈ A}, then P(τA < ∞|X0 = x) > 0 for all x ∈ Ω.
2. If x ∈ A and C ⊆ B, then K(x, C) ≥ ϵρ(C).
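For a finite state space, condition 2 (the minorization condition) can be verified directly. The following sketch uses an illustrative two-state kernel (the matrix entries, and the choice A = B = Ω, are assumptions made for illustration, not part of the definition):

```python
# Minorization check for a finite-state kernel (illustrative values).
# Take Omega = {0, 1}, A = B = Omega, and a 2x2 transition matrix K,
# where K[x][y] = P(X_{n+1} = y | X_n = x).
K = [[0.7, 0.3],
     [0.4, 0.6]]

# A natural candidate for eps and rho: take the column-wise minima of K,
# so that K[x][y] >= eps * rho[y] holds for every x in A and state y.
col_min = [min(K[x][y] for x in range(2)) for y in range(2)]
eps = sum(col_min)                   # total mass of the minorizing measure
rho = [m / eps for m in col_min]     # normalized to a probability measure on B

# Verify condition 2 pointwise; this implies it for every C subset of B.
ok = all(K[x][y] >= eps * rho[y] - 1e-12
         for x in range(2) for y in range(2))
print(eps, rho, ok)
```

Checking the condition pointwise suffices here because any C ⊆ B is a union of singletons, and both sides of the inequality are additive over disjoint sets.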
In essence, this technical definition can be rephrased as follows: given any two points x1 and x2 in A, there is a chance of at least ϵ that they can be moved together to the same point at the next time step.
Another way to say it is the following: suppose that x and y are both in A. At the next time step, first flip a Bernoulli random variable with parameter ϵ. If it comes up one, move both points to a single common point chosen according to ρ. If it comes up zero, the points move independently, with x moving according to P(Xn+1 ∈ C | Xn = x) = (K(x, C) − ϵρ(C))/(1 − ϵ) and y moving according to P(Yn+1 ∈ C | Yn = y) = (K(y, C) − ϵρ(C))/(1 − ϵ). (The division by 1 − ϵ normalizes the residual measure, which has total mass 1 − ϵ, into a probability measure.)
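The coupling just described can be sketched as a simulation. The kernel, ϵ, and ρ below are illustrative assumptions chosen so that the minorization condition holds; the point of the sketch is that on each step the two copies meet with probability at least ϵ:

```python
import random

# Split-chain coupling on Omega = {0, 1}. K, eps, rho are illustrative
# assumptions satisfying K(x, .) >= eps * rho(.) for both states x.
K = [[0.7, 0.3], [0.4, 0.6]]
eps = 0.7
rho = [4/7, 3/7]

def sample(weights):
    """Draw a state index from the discrete distribution `weights`."""
    r, acc = random.random(), 0.0
    for state, w in enumerate(weights):
        acc += w
        if r < acc:
            return state
    return len(weights) - 1

def coupled_step(x, y):
    """Advance two copies started at x and y by one coupled step."""
    if random.random() < eps:
        z = sample(rho)              # regeneration: both copies move to z
        return z, z
    # Otherwise move independently under the normalized residual kernel
    # (K(x, .) - eps * rho(.)) / (1 - eps).
    qx = [(K[x][j] - eps * rho[j]) / (1 - eps) for j in range(2)]
    qy = [(K[y][j] - eps * rho[j]) / (1 - eps) for j in range(2)]
    return sample(qx), sample(qy)

# Each step couples the copies with probability at least eps, so
# P(not yet coupled after n steps) <= (1 - eps)**n.
random.seed(0)
x, y, steps = 0, 1, 0
while x != y:
    x, y = coupled_step(x, y)
    steps += 1
print("coupled after", steps, "step(s)")
```

Once the two copies meet, they can be run together from then on, which is the mechanism behind the convergence theorem below.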
Reducibility and Periodicity
In the following, R := inf {n ≥ 1 : Xn ∈ A}; i.e. R is the first time after time 0 that the process enters region A.
Definition: If P(R < ∞ | X0 ∈ A) = 1 for every initial distribution L(X0), then the Harris chain is called recurrent.
Definition: A recurrent Harris chain Xn is aperiodic if ∃N, such that ∀n ≥ N, ∀L(X0 ), P(Xn ∈ A|X0 ∈ A) > 0.
Theorem: Let Xn be an aperiodic recurrent Harris chain with stationary distribution π. If P(R < ∞ | X0 = x) = 1, then as n → ∞, distTV(L(Xn | X0 = x), π) → 0.
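For a finite chain, the theorem's total-variation convergence can be observed numerically by pushing the initial distribution forward through the kernel. A minimal sketch, with an illustrative kernel and starting point (both assumptions) whose stationary distribution can be solved by hand:

```python
# Total-variation convergence to pi for a finite-state kernel
# (illustrative values; this chain is aperiodic and recurrent).
K = [[0.7, 0.3], [0.4, 0.6]]

# Stationary distribution, solved from pi K = pi:
# pi0 * 0.3 = pi1 * 0.4  =>  pi = (4/7, 3/7).
pi = [4/7, 3/7]

def step(dist):
    """Push a distribution one step forward: (dist K)(y) = sum_x dist(x) K(x, y)."""
    return [sum(dist[x] * K[x][y] for x in range(2)) for y in range(2)]

def tv(p, q):
    """Total-variation distance between two distributions on a finite space."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

dist = [1.0, 0.0]            # L(X0): the chain started at state 0
for n in range(10):
    dist = step(dist)
print(tv(dist, pi))          # close to 0 after a few steps
```

Here the distance in fact shrinks geometrically, consistent with the coupling argument: the chance that two copies have not yet coupled decays at least like (1 − ϵ)ⁿ.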
References
- ^ R. Durrett. Probability: Theory and Examples. Thomson, 2005. ISBN 0-534-42441-4.