
DIM: Enforcing Domain-Informed Monotonicity in Deep Neural Networks

Published 25 Sep 2025 in cs.LG and cs.AI (arXiv:2509.21666v1)

Abstract: While deep learning models excel at predictive tasks, their complex structure and large number of parameters make them prone to overfitting: they memorize the training data, including its noise, rather than learning patterns that generalize to new data. To tackle this challenge, this paper proposes a new regularization method, Enforcing Domain-Informed Monotonicity in Deep Neural Networks (DIM), which maintains domain-informed monotonic relationships in complex deep learning models to improve predictions. Specifically, our method enforces monotonicity by penalizing violations relative to a linear baseline, encouraging the model to follow expected trends while preserving its predictive power. We formalize this approach through a mathematical framework that establishes a linear reference, measures deviations from monotonic behavior, and integrates these measurements into the training objective. We test and validate the proposed methodology on a real-world ridesourcing dataset from Chicago and on a synthetic dataset. Experiments across various neural network architectures show that even modest monotonicity constraints consistently enhance model performance. In short, DIM improves the predictive performance of deep neural networks by applying domain-informed monotonicity constraints that regularize model behavior and mitigate overfitting.
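
The abstract describes the mechanism only at a high level: establish a linear reference, measure deviations from the expected monotonic trend, and fold those deviations into the training objective. Below is a minimal PyTorch sketch of that idea under our own assumptions; the paper's exact loss may differ, and every name here (`monotonicity_penalty`, `ref_slopes`, `mono_idx`, `lam`, the finite-difference step `eps`) is hypothetical rather than taken from the paper.

```python
import torch
import torch.nn as nn

def monotonicity_penalty(model, x, mono_idx, ref_slopes, eps=0.1):
    """Hinge penalty on predictions that move against a linear baseline.

    Assumed formulation (not the paper's exact one): for each feature j in
    mono_idx, nudge x[:, j] upward by eps and penalize any change in the
    prediction whose sign disagrees with the linear reference slope
    ref_slopes[j].
    """
    y0 = model(x)
    penalty = y0.new_zeros(())
    for j in mono_idx:
        x_pert = x.clone()
        x_pert[:, j] += eps                      # finite-difference probe
        y1 = model(x_pert)
        direction = torch.sign(ref_slopes[j])    # expected trend from baseline
        # zero when the prediction moves with the baseline, positive otherwise
        penalty = penalty + torch.relu(-direction * (y1 - y0)).mean()
    return penalty

# --- toy usage ---
torch.manual_seed(0)
x = torch.randn(256, 4)
y = 2.0 * x[:, :1] + 0.1 * torch.randn(256, 1)   # increasing in feature 0

# Linear baseline: least-squares slopes serve as the domain-informed reference.
X1 = torch.cat([x, torch.ones(256, 1)], dim=1)
ref_slopes = torch.linalg.lstsq(X1, y).solution[:4, 0]

model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1   # strength of the monotonicity constraint

for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) \
         + lam * monotonicity_penalty(model, x, [0], ref_slopes)
    loss.backward()
    opt.step()
```

The hinge term vanishes whenever a perturbed prediction moves in the direction the linear baseline indicates, so the regularizer activates only on violations, which matches the abstract's description of "penalizing violations relative to a linear baseline" while leaving compliant predictions untouched.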
