Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms (1805.09717v4)
Published 24 May 2018 in stat.ML and cs.LG
Abstract: This paper studies Fenchel-Young losses, a generic way to construct convex loss functions from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and make it easy to create useful new ones. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing both in theory and practice.
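To make the construction concrete: the paper defines the Fenchel-Young loss generated by a regularizer Ω as L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, where Ω* is the convex conjugate. The sketch below, a minimal NumPy illustration (function names are ours, not the paper's), instantiates Ω as the negative Shannon entropy on the simplex, whose conjugate is the log-sum-exp function; the resulting loss is the familiar multinomial logistic (cross-entropy) loss, one of the classical losses the framework unifies.

```python
import numpy as np

def neg_shannon_entropy(p):
    # Omega(p) = sum_i p_i log p_i, the negative Shannon entropy of a
    # probability vector p (0 log 0 is taken to be 0).
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz])))

def fenchel_young_loss(theta, y):
    # L_Omega(theta; y) = Omega*(theta) + Omega(y) - <theta, y>.
    # With Omega = negative Shannon entropy, Omega*(theta) = logsumexp(theta),
    # so for a one-hot y this reduces to logsumexp(theta) - theta_y,
    # i.e. the multinomial logistic loss.
    theta = np.asarray(theta, dtype=float)
    m = theta.max()  # shift for numerical stability
    logsumexp = m + np.log(np.exp(theta - m).sum())
    return logsumexp + neg_shannon_entropy(y) - float(np.dot(theta, y))
```

For one-hot targets this agrees with the usual negative log-softmax, and the loss is nonnegative for any y in the simplex, reflecting the general property that Fenchel-Young losses vanish exactly when the prediction induced by θ matches y.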