Files in this item:
3069968.pdf (application/pdf, 8 MB), restricted to University of Illinois users


Title: Stochastically Stable States for Perturbed Repeated Play of Coordination Games
Author(s): Anderson, Mark Daniel
Doctoral Committee Chair(s): Muncaster, Robert G.
Department / Program: Mathematics
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Economics, Theory
Abstract: Perturbed repeated play of a two-player two-move coordination game is modeled as an irreducible Markov process on a set of history states describing the most recent m stages of play. At every stage, each player draws a sample from the history state to forecast his opponent's behavior and either moves to maximize his single-stage expected payoff or commits an error. As the error rate approaches zero, the stationary distributions converge to a stationary distribution for the unperturbed process. Stochastically stable states, which comprise the support of the limiting distribution, are identified by finding minimum-weight spanning trees of a weighted directed graph on the set of recurrent classes for the unperturbed process. When the sample size equals the memory length m, cycles may be present in the unperturbed process. Necessary and sufficient conditions for the existence of stochastically stable cycle states are provided. The stochastically stable states are identified, for sufficiently large sample sizes and memory lengths, as those states representing repeated play of a risk-dominant Nash equilibrium. Finally, five coordination conditions are introduced to characterize N-player coordination games.
Issue Date: 2002
Description: 201 p.
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2002.
Other Identifier(s): (MiAaPQ)AAI3069968
Date Available in IDEALS: 2015-09-28
Date Deposited: 2002
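The abstract's spanning-tree criterion (a Freidlin-Wentzell / Young-style "stochastic potential" computation) can be sketched for a toy instance. Everything below is illustrative and not taken from the dissertation: three hypothetical recurrent classes of the unperturbed process (two conventions and one cycle state) and made-up resistance values counting the errors needed to move between them. A class is stochastically stable when it minimizes the weight of the cheapest spanning tree directed into it.

```python
from itertools import product

# Hypothetical recurrent classes and resistance (minimum error count)
# for transitions between them; (child, parent) means "from child to parent".
nodes = ["all-A", "all-B", "cycle"]
resistance = {
    ("all-A", "all-B"): 3, ("all-A", "cycle"): 2,
    ("all-B", "all-A"): 1, ("all-B", "cycle"): 2,
    ("cycle", "all-A"): 1, ("cycle", "all-B"): 2,
}

def stochastic_potential(root):
    """Minimum total resistance over spanning trees directed into `root`,
    by brute force over parent assignments (fine for a handful of nodes)."""
    others = [n for n in nodes if n != root]
    best = float("inf")
    for parents in product(nodes, repeat=len(others)):
        tree = dict(zip(others, parents))       # child -> parent map
        if any(c == p for c, p in tree.items()):  # reject self-loops
            continue
        ok = True
        for n in others:                        # every node must reach root
            seen, cur = set(), n
            while cur != root:
                if cur in seen:                 # cycle: not a tree
                    ok = False
                    break
                seen.add(cur)
                cur = tree[cur]
            if not ok:
                break
        if ok:
            best = min(best, sum(resistance[(c, p)] for c, p in tree.items()))
    return best

potentials = {n: stochastic_potential(n) for n in nodes}
stable = min(potentials, key=potentials.get)   # stochastically stable class
```

With these illustrative numbers the potentials come out to 2, 4, and 3, so "all-A" is selected, mirroring the abstract's claim that the limiting distribution concentrates on the classes of minimum stochastic potential.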
