
Deep Neural Networks for Rank-Consistent Ordinal Regression Based On Conditional Probabilities (2111.08851v5)

Published 17 Nov 2021 in cs.LG, cs.CV, and stat.ML

Abstract: In recent times, deep neural networks have achieved outstanding predictive performance on various classification and pattern recognition tasks. However, many real-world prediction problems have ordinal response variables, and this ordering information is ignored by conventional classification losses such as the multi-category cross-entropy. Ordinal regression methods for deep neural networks address this. One such method is the CORAL method, which is based on an earlier binary label extension framework and achieves rank consistency among its output layer tasks by imposing a weight-sharing constraint. However, while earlier experiments showed that CORAL's rank consistency is beneficial for performance, the weight-sharing constraint in the network's fully connected output layer may restrict the expressiveness and capacity of a network trained with CORAL. We propose a new method for rank-consistent ordinal regression without this limitation. Our rank-consistent ordinal regression framework (CORN) achieves rank consistency through a novel training scheme that uses conditional training sets to obtain the unconditional rank probabilities by applying the chain rule for conditional probability distributions. Experiments on various datasets demonstrate the efficacy of the proposed method in utilizing the ordinal target information, and the absence of the weight-sharing restriction improves performance substantially compared to the CORAL reference approach. Additionally, the proposed CORN method is not tied to any specific architecture and can be used with any deep neural network classifier to train it for ordinal regression tasks.
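The chain-rule step described in the abstract can be sketched as follows. Assuming a network whose output layer has K-1 units, where a sigmoid applied to the k-th unit is read as the conditional probability P(y > r_k | y > r_{k-1}), the cumulative product of these conditionals yields the unconditional rank probabilities P(y > r_k). This is a minimal NumPy sketch of the CORN inference rule only, not the conditional-training-set training procedure:

```python
import numpy as np

def corn_rank_probas(logits):
    """Turn (n, K-1) conditional logits into unconditional
    rank probabilities P(y > r_k) via the chain rule."""
    cond = 1.0 / (1.0 + np.exp(-logits))  # sigmoid: P(y > r_k | y > r_{k-1})
    return np.cumprod(cond, axis=1)       # chain rule: cumulative product

def corn_predicted_rank(logits):
    """Predicted rank index = number of P(y > r_k) exceeding 0.5."""
    return (corn_rank_probas(logits) > 0.5).sum(axis=1)

# Example: one sample, K = 4 ranks, so K - 1 = 3 conditional outputs.
logits = np.array([[2.0, 1.0, -1.0]])
probas = corn_rank_probas(logits)
```

Because each conditional probability lies in (0, 1), the cumulative product is monotonically non-increasing in k, which is precisely the rank-consistency property the paper obtains without a weight-sharing constraint.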

