
On the instability and degeneracy of deep learning models

Published 4 Dec 2016 in math.ST and stat.TH (arXiv:1612.01159v3)

Abstract: A probability model exhibits instability if small changes in a data outcome result in large, and often unanticipated, changes in probability. This instability is a property of the probability model, specified by a distributional form and a given configuration of parameters. For correlated data structures found in several application areas, there is increasing interest in identifying such sensitivity in model probability structure. We consider the problem of quantifying instability for general probability models defined on sequences of observations, where each sequence of length N has a finite number of possible values at each point. This yields a sequence of probability models, indexed by N, together with an associated parameter sequence, accommodating data of expanding dimension. Model instability is formally shown to occur when a certain log-probability ratio under such models grows faster than N. In this case, a one-component change in the data sequence can shift probability by orders of magnitude. Moreover, as instability becomes more extreme, the resulting probability models are shown to tend toward degeneracy, placing all of their probability on potentially small portions of the sample space. These results on instability apply to large classes of models commonly used in random graphs, network analysis, and machine learning contexts.
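
The growth criterion in the abstract can be made concrete with a toy example. The following is a minimal sketch, not the paper's construction: it assumes a discrete exponential-family model on binary sequences x in {0,1}^N with a single sufficient statistic s(x) = sum_i x_i and a hypothetical parameter sequence theta_N = sqrt(N) that grows with N. Under these assumptions, flipping one component shifts the log-probability by exactly theta_N, and the maximal log-probability ratio over the sample space is theta_N * N, which grows faster than N, matching the instability behavior the abstract describes.

import numpy as np

def log_partition(theta, N):
    # Z_N = sum over x in {0,1}^N of exp(theta * s(x)) = (1 + e^theta)^N,
    # computed on the log scale for numerical stability.
    return N * np.logaddexp(0.0, theta)

def log_prob(x, theta):
    # log p(x) = theta * s(x) - log Z_N for this toy exponential-family model.
    N = len(x)
    return theta * np.sum(x) - log_partition(theta, N)

for N in [10, 100, 1000]:
    theta_N = np.sqrt(N)          # hypothetical parameter sequence growing with N
    x = np.zeros(N, dtype=int)    # baseline sequence of all zeros
    y = x.copy()
    y[0] = 1                      # one-component change
    shift = log_prob(y, theta_N) - log_prob(x, theta_N)   # equals theta_N here
    max_log_ratio = theta_N * N   # log of (max probability / min probability) over {0,1}^N
    print(f"N={N:5d}  one-flip log-prob shift={shift:8.2f}  "
          f"max log-ratio / N = {max_log_ratio / N:6.2f}")

As N grows, the one-flip shift e^{theta_N} inflates probability by ever larger factors and (max log-ratio)/N diverges, so the toy model illustrates the kind of instability, and the attendant concentration of mass on a small part of the sample space, that the paper formalizes.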
