Waiting times and stopping probabilities for patterns in Markov chains
Abstract
Suppose that $\mathcal C$ is a finite collection of patterns. A Markov chain is observed until one of the patterns in $\mathcal C$ occurs as a run; this time is denoted by $\tau$. In this paper, we give a simple way to calculate the mean waiting time $E(\tau)$ and the stopping probabilities $P(\tau=\tau_A)$ for $A\in\mathcal C$, where $\tau_A$ is the waiting time until the pattern $A$ appears as a run.
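To make the quantities $E(\tau)$ and $P(\tau=\tau_A)$ concrete, here is a small Monte Carlo sketch, not the paper's analytic method. It uses an illustrative toy case chosen here (a fair coin, i.e. a two-state chain whose transitions do not depend on the current state) with the competing patterns $\mathcal C=\{HH, TT\}$; for this case $E(\tau)=3$ and, by symmetry, $P(\tau=\tau_{HH})=1/2$, which the simulation should recover approximately. All function and variable names are this sketch's own.

```python
import random

# Patterns observed in competition: the chain stops as soon as one of
# them appears as a run of consecutive states.
PATTERNS = [('H', 'H'), ('T', 'T')]

def next_state(state, rng):
    # Fair-coin "transition kernel" for this toy example; a general
    # Markov chain would condition on `state`.
    return rng.choice(('H', 'T'))

def simulate_tau(rng, max_steps=10_000):
    """Run the chain until some pattern occurs as a run.

    Returns (tau, winning_pattern), where tau is the number of
    observations made when the first pattern completes.
    """
    seq = [rng.choice(('H', 'T'))]  # X_1: uniform initial state
    while len(seq) <= max_steps:
        for pat in PATTERNS:
            if len(seq) >= len(pat) and tuple(seq[-len(pat):]) == pat:
                return len(seq), pat
        seq.append(next_state(seq[-1], rng))
    raise RuntimeError("no pattern occurred within max_steps")

def estimate(n_trials=100_000, seed=0):
    """Monte Carlo estimates of E(tau) and P(tau = tau_HH)."""
    rng = random.Random(seed)
    total, hh_wins = 0, 0
    for _ in range(n_trials):
        tau, pat = simulate_tau(rng)
        total += tau
        hh_wins += pat == ('H', 'H')
    return total / n_trials, hh_wins / n_trials
```

Running `estimate()` should give a mean waiting time near 3 and a stopping probability for $HH$ near 0.5; the point of the paper is to obtain such values exactly, without simulation.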
- Publication:
- arXiv e-prints
- Pub Date:
- February 2016
- DOI:
- 10.48550/arXiv.1602.06512
- arXiv:
- arXiv:1602.06512
- Bibcode:
- 2016arXiv160206512Z
- Keywords:
- Mathematics - Probability;
- 60J10;
- 60J22
- E-Print:
- 13 pages