A Fast Transformation of Markov Chains and Their Respective Steady-State Probability Distributions

Vallejos, RA; Martinez, JM

Abstract

In this paper, a new method for evaluating the steady-state distribution of an ergodic, discrete- or continuous-time Markov chain (MC) is proposed. The method comprises three steps: (i) transforming the original MC by modifying its transitions; (ii) finding the steady-state solution of the transformed MC using any of the existing methods; and (iii), because the transformed MC does not necessarily have the same steady-state solution as the original MC, converting the transformed MC's solution into the solution of the original chain. The transformation of step (i) has a time cost proportional to the number of transitions in the original chain, and the conversion of the solution in step (iii) has a time cost proportional to the number of states of the chain. Consequently, steps (i) and (iii) are executable at a negligible computational cost compared with step (ii). The contribution of this work is fundamentally conceptual; however, possible implications of the proposed transformation are also presented through examples. Among these potential applications are the use of the proposed transformation in conjunction with the iterative Power method; the derivation of known MC transformations as special cases of the new transformation; and the definition of MC families such that the steady-state distribution of a given MC can be efficiently transformed into the distribution of another MC of the same family.
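The abstract describes a three-step pipeline (transform, solve, convert back) but does not reproduce the transformation itself. The Python sketch below only illustrates how such a pipeline could be organized around a standard power-method solver for step (ii); the functions transform_chain and recover_solution are hypothetical placeholders (identity maps here), not the paper's actual method.

```python
import numpy as np

def power_method(P, tol=1e-12, max_iter=100_000):
    """Step (ii): steady-state distribution of a row-stochastic matrix P
    via power iteration, pi_{k+1} = pi_k @ P."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)  # uniform starting vector
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < tol:
            return nxt
        pi = nxt
    return pi

# Hypothetical placeholders for steps (i) and (iii); the paper's actual
# transformation is not given in the abstract, so identity maps stand in.
def transform_chain(P):
    # Step (i): would rewrite the chain in time proportional to the
    # number of transitions (placeholder: returns the chain unchanged).
    return P.copy()

def recover_solution(pi_transformed, P):
    # Step (iii): would convert the transformed chain's solution back to
    # the original chain in time proportional to the number of states
    # (placeholder: returns the solution unchanged).
    return pi_transformed

if __name__ == "__main__":
    # Small ergodic discrete-time MC for illustration.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.3, 0.3, 0.4]])
    Q = transform_chain(P)          # step (i)
    pi_Q = power_method(Q)          # step (ii)
    pi = recover_solution(pi_Q, P)  # step (iii)
    print(pi, pi @ P)               # pi should satisfy pi P = pi
```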

More information

Title according to WOS: A Fast Transformation of Markov Chains and Their Respective Steady-State Probability Distributions
Journal: COMPUTER JOURNAL
Volume: 57
Issue: 1
Publisher: OXFORD UNIV PRESS
Publication date: 2014
Start page: 1
End page: 11
Language: English
URL: http://comjnl.oxfordjournals.org/cgi/doi/10.1093/comjnl/bxs086
DOI: 10.1093/comjnl/bxs086

Notes: ISI