
Deep Neural Networks Are Congestion Games: From Loss Landscape to Wardrop Equilibrium and Beyond (2010.11024v1)

Published 21 Oct 2020 in cs.LG, cs.GT, and stat.ML

Abstract: The theoretical analysis of deep neural networks (DNN) is arguably among the most challenging research directions in ML right now, as it requires scientists to lay novel statistical learning foundations to explain their behaviour in practice. While some success has been achieved recently in this endeavour, the question of whether DNNs can be analyzed using tools from scientific fields outside the ML community has not received the attention it deserves. In this paper, we explore the interplay between DNNs and game theory (GT), and show how one can benefit from classic, readily available results from the latter when analyzing the former. In particular, we consider the widely studied class of congestion games, and illustrate their intrinsic relatedness to both linear and non-linear DNNs and to the properties of their loss surface. Beyond retrieving state-of-the-art results from the literature, we argue that our work provides a very promising novel tool for analyzing DNNs, and support this claim by proposing concrete open problems whose solutions would significantly advance our understanding of DNNs.
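To make the game-theoretic objects concrete, the following sketch builds a small atomic congestion game and its Rosenthal potential. This is a generic illustration of congestion games (the class the paper connects to DNN loss landscapes), not code from the paper: the two resources, their cost functions, and the three players are assumed for the example. The key property checked is that any unilateral deviation changes a player's cost by exactly the change in the potential, which is why minima of the potential are pure Nash equilibria.

```python
# Illustrative congestion game (not from the paper): two resources with
# load-dependent costs, three players each choosing one resource.
from itertools import product

cost = {
    "a": lambda k: k,        # linear congestion cost on resource "a"
    "b": lambda k: 2 * k,    # steeper congestion cost on resource "b"
}
players = 3
strategies = ["a", "b"]

def loads(profile):
    """Number of players on each resource under a strategy profile."""
    return {e: profile.count(e) for e in strategies}

def player_cost(profile, i):
    """Cost paid by player i: cost of their resource at its current load."""
    return cost[profile[i]](loads(profile)[profile[i]])

def rosenthal_potential(profile):
    """Phi(s) = sum over resources e of sum_{k=1}^{load(e)} cost_e(k)."""
    return sum(sum(cost[e](k) for k in range(1, n + 1))
               for e, n in loads(profile).items())

# Verify the potential property: a unilateral deviation by any player
# changes that player's cost by exactly the change in Phi.
for profile in product(strategies, repeat=players):
    for i in range(players):
        for s in strategies:
            dev = list(profile)
            dev[i] = s
            dev = tuple(dev)
            d_cost = player_cost(dev, i) - player_cost(profile, i)
            d_pot = rosenthal_potential(dev) - rosenthal_potential(profile)
            assert d_cost == d_pot
```

Because this exact potential property holds, better-response dynamics in such games always converge to a pure Nash equilibrium; the paper's contribution is to relate structures of this kind to DNN loss surfaces and Wardrop equilibria.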

Authors (3)
  1. Nina Vesseron (4 papers)
  2. Ievgen Redko (28 papers)
  3. Charlotte Laclau (18 papers)
Citations (5)
