Matrix-valued generating functions


Post by pjt33 »

On another site, I recently tackled a question about an "idle" computer game which boiled down to finding the expected time to reach the absorbing state of a Markov process. As I didn't know the standard approach (I have now learnt it, so don't feel obliged to explain it :wink:), I reasoned as follows: let $P$ be the standard Markov transition probability matrix, set up so that $P^t [1\, 0\, 0\, \cdots\, 0]^T$ gives the probabilities of being in each state after $t$ steps, where the initial state is indexed as $1$ and the absorbing state as $N$. We can modify the matrix to $P'$ with $P'_{i,j} = [j \neq N] P_{i,j}$, so that we spend only one step in the absorbing state and then vanish from the system. Then $\left(P'^t [1\, 0\, 0\, \cdots\, 0]^T\right)_N$ gives the probability of reaching the absorbing state after exactly $t$ steps, so the expected hitting time is the $N$-th entry of $$(P' + 2P'^2 + 3P'^3 + 4P'^4 + \cdots) [1\, 0\, 0\, \cdots\, 0]^T.$$

Being more familiar with generating functions than Markov matrices, I recognised a familiar friend: $$x + 2x^2 + 3x^3 + \cdots = x(1 - x)^{-2}.$$ Unfortunately, this doesn't always work with $x = P'$. Curiously, though, the slight tweak $$1 + 2x + 3x^2 + \cdots = (1 - x)^{-2}$$ did seem to work. Obviously it's then necessary to subtract 1, since every coefficient is one too large and the first-passage probabilities sum to 1 (assuming absorption is certain).
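To make that concrete, here is a minimal NumPy sketch on a made-up three-state chain (the matrix is an assumption for illustration, not the chain from the game). It applies the $(I - P')^{-2}$ formula and cross-checks it against a truncated version of the series.

[code]
import numpy as np

# Made-up three-state chain (an assumption for illustration).  Columns index
# the "from" state, so P @ v propagates a column vector of state probabilities,
# matching P^t [1 0 ... 0]^T above.  State 0 is the start, the last state is absorbing.
P = np.array([
    [0.5, 0.2, 0.0],
    [0.5, 0.5, 0.0],
    [0.0, 0.3, 1.0],
])
N = P.shape[0]

# P': zero the absorbing state's column, so mass is counted once on arrival
# and then vanishes from the system.
Pp = P.copy()
Pp[:, N - 1] = 0.0

e1 = np.zeros(N)
e1[0] = 1.0
I = np.eye(N)

# (I - P')^{-2} = sum over t of (t+1) P'^t, so the N-th entry of
# (I - P')^{-2} e1 is E[T] + 1; subtract 1 to get the expected hitting time.
w = np.linalg.solve(I - Pp, np.linalg.solve(I - Pp, e1))
hit_gf = w[N - 1] - 1.0

# Cross-check by truncating the series sum_t t * (P'^t e1)_N directly.
hit_series = 0.0
v = e1.copy()
for t in range(1, 2000):
    v = Pp @ v
    hit_series += t * v[N - 1]

print(hit_gf, hit_series)  # both should come out close to 20/3 for this chain
[/code]

Solving the linear system twice avoids forming $(I - P')^{-2}$ explicitly, but it is the same quantity applied to the start vector.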

I suspect that the failure of $P'(I - P')^{-2}$ (for some $P'$; I've tried other Markov transition matrices for which it did work) is down to the non-commutativity of matrix multiplication. Does anyone have anything to say about the conditions under which standard g.f. tricks work when applied to matrices?
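For anyone who wants to poke at the question, a small self-contained harness along the same lines (same made-up chain as above, purely for illustration) evaluates both candidate expressions, so other transition matrices can be swapped in and compared.

[code]
import numpy as np

# Given a column-stochastic P whose last state is absorbing, evaluate both
# candidate expressions for the expected hitting time from `start`.
def compare(P, start=0):
    N = P.shape[0]
    Pp = P.copy()
    Pp[:, N - 1] = 0.0                 # form P' as above
    I = np.eye(N)
    e = np.zeros(N)
    e[start] = 1.0
    inv = np.linalg.inv(I - Pp)
    a = (Pp @ inv @ inv @ e)[N - 1]    # N-th entry of P'(I - P')^{-2} e
    b = (inv @ inv @ e)[N - 1] - 1.0   # N-th entry of (I - P')^{-2} e, minus 1
    return a, b

# The same made-up chain as in the sketch above.
P = np.array([[0.5, 0.2, 0.0],
              [0.5, 0.5, 0.0],
              [0.0, 0.3, 1.0]])
print(compare(P))  # on this chain the two agree; swap in other matrices to experiment
[/code]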