Mean Field Game of Controls with State Reflections: Existence and Limit Theory (2503.03253v1)
Abstract: This paper studies a mean field game (MFG) of controls featuring the joint distribution of the state and the control, where the state process is reflected along an exogenous stochastic boundary. We contribute to the literature with a customized relaxed formulation and new compactification arguments that establish the existence of a Markovian mean field equilibrium (MFE) in the weak sense. We work on an enlarged canonical space, utilizing the dynamic Skorokhod mapping, to accommodate the stochastic reflection boundary process. A fixed-point argument on the extended space, based on an extension transformation technique, is developed to tackle the challenge that the joint measure flow of the state and the relaxed control may fail to be continuous. Furthermore, bidirectional connections between the MFG and the $N$-player game are established in the context of joint law dependence and state reflections. We first show that any weak limit of empirical measures induced by $\boldsymbol{\epsilon}$-Nash equilibria in $N$-player games must be supported exclusively on the set of relaxed mean field equilibria, analogous to the propagation of chaos in mean field control problems. We then prove that a Markovian MFE in the weak sense can be approximated by a sequence of constructed $\boldsymbol{\epsilon}$-Nash equilibria in the weak sense in $N$-player games as $N$ tends to infinity.
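The Skorokhod mapping invoked in the abstract can be illustrated numerically. For one-sided reflection above a time-varying lower boundary $\ell$, the map admits the classical explicit formula $y(t) = x(t) + \sup_{s \le t}\,(\ell(s) - x(s))^{+}$, which keeps the reflected path at or above the boundary while pushing only when the constraint binds. The sketch below is illustrative only, assuming paths sampled on a uniform grid; the function name and the synthetic boundary are not from the paper.

```python
import numpy as np

def skorokhod_reflect(x, lower):
    """One-sided Skorokhod map on a discrete grid: reflect the path x
    above the time-varying lower boundary `lower`, using the explicit
    formula y(t) = x(t) + max(0, sup_{s<=t} (lower(s) - x(s))),
    valid when x(0) >= lower(0)."""
    pushed = np.maximum.accumulate(np.maximum(lower - x, 0.0))
    return x + pushed

# Illustration: a Brownian-like path reflected above a moving boundary
# (a deterministic stand-in for the paper's stochastic boundary process).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1001)
x = np.cumsum(rng.normal(0.0, 0.03, t.size))
x -= x[0]                                    # start the path at 0
lower = -0.2 + 0.1 * np.sin(4 * np.pi * t)   # moving lower boundary, lower(0) = -0.2
y = skorokhod_reflect(x, lower)
assert np.all(y >= lower - 1e-12)            # reflected path never crosses the boundary
```

The cumulative maximum realizes the running supremum in the formula, so the "pushing" term is nondecreasing and increases only at times when the unreflected path would dip below the boundary, which is exactly the minimality property characterizing the Skorokhod map.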