Survival Regression with Proper Scoring Rules and Monotonic Neural Networks (2103.14755v2)

Published 26 Mar 2021 in stat.ML and cs.LG

Abstract: We consider frequently used scoring rules for right-censored survival regression models such as time-dependent concordance, survival-CRPS, integrated Brier score and integrated binomial log-likelihood, and prove that neither of them is a proper scoring rule. This means that the true survival distribution may be scored worse than incorrect distributions, leading to inaccurate estimation. We prove that, in contrast to these scores, the right-censored log-likelihood is a proper scoring rule, i.e., the highest expected score is achieved by the true distribution. Despite this, modern feed-forward neural-network-based survival regression models are unable to train and validate directly on the right-censored log-likelihood, due to its intractability, and resort to the aforementioned alternatives, i.e., non-proper scoring rules. We therefore propose a simple novel survival regression method capable of directly optimizing log-likelihood using a monotonic restriction on the time-dependent weights, coined SurvivalMonotonic-net (SuMo-net). SuMo-net achieves state-of-the-art log-likelihood scores across several datasets with 20--100$\times$ computational speedup on inference over existing state-of-the-art neural methods, and is readily applicable to datasets with several million observations.
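The abstract only outlines the approach, so below is a minimal illustrative sketch of the core idea, not the authors' implementation: a network whose output is non-decreasing in the time input because the weights along the time path are kept non-negative (here via a softplus reparametrisation), trained on the right-censored log-likelihood, with the density term obtained by differentiating the CDF with respect to time using autograd. All layer sizes, names, and architectural details are assumptions.

```python
# Hypothetical sketch of a SuMo-net-style model; architecture details are assumed.
import torch
import torch.nn as nn

class MonotoneSurvivalNet(nn.Module):
    def __init__(self, n_covariates, hidden=32):
        super().__init__()
        self.cov_net = nn.Sequential(nn.Linear(n_covariates, hidden), nn.Tanh())
        # Unconstrained parameters; softplus makes the effective weights on the
        # time path non-negative, so the output is non-decreasing in t.
        self.w_time = nn.Parameter(torch.randn(hidden, 1))
        self.w_out = nn.Parameter(torch.randn(1, hidden))
        self.bias = nn.Parameter(torch.zeros(1))

    def cdf(self, x, t):
        # h is non-decreasing in t because its time coefficients are non-negative.
        h = self.cov_net(x) + t @ nn.functional.softplus(self.w_time).T
        z = torch.tanh(h) @ nn.functional.softplus(self.w_out).T + self.bias
        return torch.sigmoid(z)  # F(t | x), monotone in t

def right_censored_nll(model, x, t, event):
    """Negative right-censored log-likelihood.
    event = 1: observed event   -> contributes log f(t|x) = log dF/dt
    event = 0: right-censored   -> contributes log S(t|x) = log(1 - F(t|x))
    """
    t = t.clone().requires_grad_(True)
    F_t = model.cdf(x, t)
    # Density via automatic differentiation of the CDF w.r.t. time.
    f_t = torch.autograd.grad(F_t.sum(), t, create_graph=True)[0]
    eps = 1e-8
    ll = event * torch.log(f_t + eps) + (1 - event) * torch.log(1.0 - F_t + eps)
    return -ll.mean()

# Hypothetical usage with random data.
x = torch.randn(128, 10)
t = torch.rand(128, 1)
e = torch.randint(0, 2, (128, 1)).float()
model = MonotoneSurvivalNet(n_covariates=10)
loss = right_censored_nll(model, x, t, e)
loss.backward()
```

Because the censored and uncensored terms both reduce to evaluations of the CDF (and its time derivative), this construction makes the right-censored log-likelihood tractable to optimize directly, which is the property the paper argues the non-proper alternative scores lack.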

Citations (28)
