Mertens-stable equilibrium
Mertens stability is a solution concept used to predict the outcome of a non-cooperative game. A tentative definition of stability was proposed by Elon Kohlberg and Jean-François Mertens[1] for games with finite numbers of players and strategies. Later, Mertens[2] proposed a stronger definition that was elaborated further by Srihari Govindan and Mertens.[3] This solution concept is now called Mertens stability, or just stability.
Like other refinements of Nash equilibrium[4] used in game theory, stability selects subsets of the set of Nash equilibria that have desirable properties. Stability invokes stronger criteria than other refinements, and thereby ensures that more desirable properties are satisfied.
Desirable Properties of a Refinement
Refinements have often been motivated by arguments for admissibility, backward induction, and forward induction. In a two-player game, an admissible decision rule for a player is one that does not use any strategy that is weakly dominated by another (see Strategic dominance). Backward induction posits that a player's optimal action in any event anticipates that his and others' subsequent actions are optimal. The refinement called subgame perfect equilibrium implements a weak version of backward induction, and increasingly stronger versions are sequential equilibrium, perfect equilibrium, quasi-perfect equilibrium, and proper equilibrium. Forward induction posits that a player's optimal action in any event presumes the optimality of others' past actions whenever that is consistent with his observations. Forward induction[5] is satisfied by a sequential equilibrium for which a player's belief at an information set assigns probability only to others' optimal strategies that enable that information to be reached.
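For illustration only (the payoff matrix and helper function below are hypothetical, not taken from the sources cited in this article), the following sketch checks the simplest case of the admissibility criterion: whether one pure strategy of the row player is weakly dominated by another pure strategy. Detecting dominance by mixed strategies would additionally require solving a small linear program, which is omitted here.

```python
import numpy as np

def weakly_dominated_rows(A):
    """Return pairs (i, j) such that row strategy i is weakly dominated by row
    strategy j: A[j] >= A[i] against every column, strictly in at least one."""
    pairs = []
    m = A.shape[0]
    for i in range(m):
        for j in range(m):
            if i != j and np.all(A[j] >= A[i]) and np.any(A[j] > A[i]):
                pairs.append((i, j))
    return pairs

# Hypothetical row-player payoffs: strategy 1 weakly dominates strategy 0
# (equal payoff in the first column, strictly better in the second).
A = np.array([[1.0, 0.0],
              [1.0, 2.0]])
print(weakly_dominated_rows(A))  # [(0, 1)]
```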
Kohlberg and Mertens emphasized further that a solution concept should satisfy the invariance principle that it not depend on which among the many equivalent representations of the strategic situation as an extensive-form game is used. Thus it should depend only on the reduced normal-form game obtained after elimination of pure strategies that are redundant because their payoffs for all players can be replicated by a mixture of other pure strategies. Mertens[6][7] also emphasized the importance of ordinality and of the small worlds principle: a solution concept should depend only on the ordinal properties of players' preferences, and it should not depend on whether the game includes extraneous players whose actions have no effect on the original players' feasible strategies and payoffs.
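As an illustrative sketch of passing to the reduced normal form (the example game and helper function are hypothetical, and only two-player games are handled), the following code checks whether a pure strategy is redundant, i.e., whether its payoffs for both players against every opposing pure strategy can be replicated by a mixture of the player's other pure strategies, by solving a small linear feasibility problem.

```python
import numpy as np
from scipy.optimize import linprog

def redundant_row(A, B, i):
    """Check whether row strategy i is redundant: its payoffs to BOTH players,
    against every column, are replicated by some mixture of the other rows."""
    others = [k for k in range(A.shape[0]) if k != i]
    # Equality constraints: for each column s, sum_k x_k*A[k,s] = A[i,s] and
    # sum_k x_k*B[k,s] = B[i,s]; plus the mixture condition sum_k x_k = 1.
    A_eq = np.vstack([A[others].T, B[others].T, np.ones((1, len(others)))])
    b_eq = np.concatenate([A[i], B[i], [1.0]])
    res = linprog(c=np.zeros(len(others)), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, 1)] * len(others))
    return res.success

# Hypothetical 3x2 game in which row 2 equals the 50/50 mixture of rows 0 and 1
# in both players' payoffs, and hence is redundant.
A = np.array([[4.0, 0.0], [0.0, 4.0], [2.0, 2.0]])
B = np.array([[1.0, 3.0], [3.0, 1.0], [2.0, 2.0]])
print(redundant_row(A, B, 2))  # True
```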
Kohlberg and Mertens demonstrated via examples that not all of these properties can be obtained from a solution concept that selects single Nash equilibria. Therefore, they proposed that a solution concept should select closed connected subsets of the set of Nash equilibria.[8]
Properties of Stable Sets
- Admissibility and Perfection: Each equilibrium in a stable set is perfect, and therefore admissible.
- Backward Induction and Forward Induction: A stable set includes a proper equilibrium of the normal form of the game that induces a quasi-perfect and therefore a sequential equilibrium in every extensive-form game with perfect recall that has the same normal form. A subset of a stable set survives iterative elimination of weakly dominated strategies and strategies that are inferior replies at every equilibrium in the set (a simplified sketch of such iterative elimination is given below).
- Invariance and Small Worlds: The stable sets of a game are the projections of the stable sets of any larger game in which it is embedded while preserving the original players' feasible strategies and payoffs.[9]
- Decomposition and Player Splitting: The stable sets of the product of two independent games are the products of their stable sets. Stable sets are not affected by splitting a player into agents such that no path through the game tree includes actions of two agents.
For two-player games with perfect recall and generic payoffs, stability is equivalent to just three of these properties: a stable set uses only undominated strategies, includes a quasi-perfect equilibrium, and is immune to embedding in a larger game.[10]
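The iterative elimination mentioned above can be sketched as follows. This is a simplified illustration, not the procedure used in the cited proofs: only dominance of pure strategies by pure strategies is checked, elimination of inferior replies is omitted, and the example game is hypothetical.

```python
import numpy as np

def iterated_weak_dominance(A, B):
    """Iteratively delete pure strategies that are weakly dominated by another
    pure strategy (dominance by mixtures is ignored in this simplified sketch).
    A and B are the row and column players' payoff matrices; returns the
    surviving row and column indices."""
    rows, cols = list(range(A.shape[0])), list(range(A.shape[1]))
    changed = True
    while changed:
        changed = False
        # Weakly dominated rows (compare payoffs to the row player).
        for i in list(rows):
            sub = A[np.ix_(rows, cols)]
            ri = rows.index(i)
            for rj in range(len(rows)):
                if rj != ri and np.all(sub[rj] >= sub[ri]) and np.any(sub[rj] > sub[ri]):
                    rows.remove(i); changed = True; break
            if changed: break
        if changed: continue
        # Weakly dominated columns (compare payoffs to the column player).
        for j in list(cols):
            sub = B[np.ix_(rows, cols)]
            cj = cols.index(j)
            for ck in range(len(cols)):
                if ck != cj and np.all(sub[:, ck] >= sub[:, cj]) and np.any(sub[:, ck] > sub[:, cj]):
                    cols.remove(j); changed = True; break
            if changed: break
    return rows, cols

# Hypothetical 2x2 example: the row player's second strategy is weakly dominated,
# after which the column player's second strategy becomes strictly dominated.
A = np.array([[3.0, 1.0], [3.0, 0.0]])
B = np.array([[2.0, 1.0], [2.0, 0.0]])
print(iterated_weak_dominance(A, B))  # ([0], [0])
```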
Definition of a Stable Set
A stable set is defined mathematically by essentiality of the projection map from a closed connected neighborhood in the graph of the Nash equilibria over the space of perturbed games obtained by perturbing players' strategies toward completely mixed strategies. This definition requires more than the existence of a nearby equilibrium for every nearby game. Essentiality requires further that the projection cannot be deformed to a map into the boundary, which ensures that perturbations of the fixed-point problem defining Nash equilibria have nearby solutions. This is apparently necessary to obtain all the desirable properties listed above.
Mertens provided several formal definitions depending on the coefficient module used for homology or cohomology.
A formal definition requires some notation. For a given game $G$, let $\Sigma$ be the product of the simplices of the players' mixed strategies. For each $0 < \delta \le 1$, let $\mathcal{P}_\delta = \{\, \eta \ge 0 \mid \textstyle\sum_{s_n} \eta_n(s_n) \le \delta \text{ for every player } n \,\}$ be the corresponding space of strategy perturbations, and let $\partial\mathcal{P}_\delta$ be its topological boundary. For $\eta \in \mathcal{P}_1$, let $\underline{\eta} = \min_{n,\,s_n} \eta_n(s_n)$ be the minimum probability it assigns to any pure strategy. For any $\eta \in \mathcal{P}_1$ define the perturbed game $G(\eta)$ as the game where the strategy set of each player $n$ is the same as in $G$, but where the payoff from a strategy profile $\tau$ is the payoff in $G$ from the profile $\sigma$ given by $\sigma_n = \bigl(1 - \sum_{s_n} \eta_n(s_n)\bigr)\,\tau_n + \eta_n$. Say that $\sigma$ is a perturbed equilibrium of $G(\eta)$ if $\tau$ is an equilibrium of $G(\eta)$. Let $\mathcal{E}$ be the graph of the perturbed equilibrium correspondence over $\mathcal{P}_1$, viz., the graph $\mathcal{E}$ is the set of those pairs $(\eta, \sigma)$ such that $\sigma$ is a perturbed equilibrium of $G(\eta)$. For $(\eta, \sigma) \in \mathcal{E}$, $\tau(\eta, \sigma)$ is the corresponding equilibrium of $G(\eta)$. Denote by $p$ the natural projection map from $\mathcal{E}$ to $\mathcal{P}_1$. For a closed subset $T$ of $\mathcal{E}$, let $\partial T = T \cap p^{-1}(\partial\mathcal{P}_1)$, and for such a $T$ let $p_T \colon (T, \partial T) \to (\mathcal{P}_1, \partial\mathcal{P}_1)$ be the map of pairs induced by $p$. Finally, $\check{H}^{*}$ refers to Čech cohomology with integer coefficients.
The following is a version of the most inclusive of Mertens' definitions, called *-stability.
Definition of a *-stable set: $S$ is a *-stable set if for some closed subset $S^*$ of $\mathcal{E}$ with $S = \{\, \sigma \mid (0, \sigma) \in S^* \,\}$ it has the following two properties:
- Connectedness: For every neighborhood $V$ of $S^*$ in $\mathcal{E}$, the set $\{\, (\eta, \sigma) \in V \mid \underline{\eta} > 0 \,\}$ has a connected component whose closure is a neighborhood of $S^*$ in $\mathcal{E}$.
- Cohomological Essentiality: the induced map $p_{S^*}^{\,*} \colon \check{H}^{k}(\mathcal{P}_1, \partial\mathcal{P}_1) \to \check{H}^{k}(S^*, \partial S^*)$ is nonzero for some $k$.
If essentiality in cohomology or homology is relaxed to homotopy then a weaker definition is obtained, which differs chiefly in a weaker form of the decomposition property.[11]
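As a concrete illustration of the perturbation construction in the notation above (a hypothetical sketch for two-player games only, not part of Mertens' formal development), the following code maps a profile τ of the perturbed game G(η) to the profile σ of G that is actually played, and evaluates the perturbed payoffs.

```python
import numpy as np

def perturbed_profile(tau, eta):
    """Map a profile tau of the perturbed game G(eta) to the profile sigma of G
    actually played: each player n plays sigma_n = (1 - sum(eta_n))*tau_n + eta_n."""
    return [(1.0 - e.sum()) * t + e for t, e in zip(tau, eta)]

def perturbed_payoffs(A, B, tau, eta):
    """Payoffs of the perturbed game G(eta) at profile tau, i.e. the payoffs of G
    at the induced profile sigma, for a two-player game with matrices A and B."""
    s1, s2 = perturbed_profile(tau, eta)
    return s1 @ A @ s2, s1 @ B @ s2

# Hypothetical 2x2 coordination game and a small perturbation eta toward
# completely mixed strategies.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = A.copy()
eta = [np.array([0.01, 0.01]), np.array([0.01, 0.01])]
tau = [np.array([1.0, 0.0]), np.array([1.0, 0.0])]  # both intend the first action
print(perturbed_payoffs(A, B, tau, eta))
```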
References
- ^ Kohlberg, Elon, and Jean-François Mertens (1986). "On the Strategic Stability of Equilibria" (PDF). Econometrica. 54 (5): 1003–1037. CiteSeerX 10.1.1.295.4592. doi:10.2307/1912320. JSTOR 1912320.
- ^ Mertens, Jean-François, 1989 and 1991. "Stable Equilibria - A Reformulation," Mathematics of Operations Research, 14: 575-625 and 16: 694-753.
- ^ Govindan, Srihari, and Jean-François Mertens, 2004. "An Equivalent Definition of Stable Equilibria," International Journal of Game Theory, 32(3): 339-357.
- ^ Govindan, Srihari, and Robert Wilson, 2008. "Refinements of Nash Equilibrium," The New Palgrave Dictionary of Economics, 2nd edition.
- ^ Govindan, Srihari, and Robert Wilson, 2009. "On Forward Induction," Econometrica, 77(1): 1-28.
- ^ Mertens, Jean-François, 2003. "Ordinality in Non Cooperative Games," International Journal of Game Theory, 32: 387–430.
- ^ Mertens, Jean-François, 1992. "The Small Worlds Axiom for Stable Equilibria," Games and Economic Behavior, 4: 553-564.
- ^ The requirement that the set is connected excludes the trivial refinement that selects all equilibria. If only a single (possibly unconnected) subset is selected then only the trivial refinement satisfies the conditions invoked by H. Norde, J. Potters, H. Reijnierse, and D. Vermeulen (1996): "Equilibrium Selection and Consistency," Games and Economic Behavior, 12: 219-225.
- ^ See Appendix D of Govindan, Srihari, and Robert Wilson, 2012. "Axiomatic Theory of Equilibrium Selection for Generic Two-Player Games," Econometrica, 70.
- ^ Govindan, Srihari, and Robert Wilson, 2012. "Axiomatic Theory of Equilibrium Selection for Generic Two-Player Games," Econometrica, 70.
- ^ Govindan, Srihari, and Robert Wilson, 2008. "Metastable Equilibria," Mathematics of Operations Research, 33: 787-820.