Admission Prediction in Undergraduate Applications: an Interpretable Deep Learning Approach (2401.11698v1)

Published 22 Jan 2024 in cs.LG and cs.AI

Abstract: This article addresses the challenge of validating the admission committee's decisions for undergraduate admissions. In recent years, the traditional review process has struggled to handle the overwhelmingly large volume of applicant data. Moreover, this traditional assessment often introduces human bias, which can result in discrimination among applicants. Classical machine learning-based approaches exist that aim to verify the quantitative assessments made by application reviewers, but these methods lack scalability and suffer from performance issues when large volumes of data are involved. In this context, we propose deep learning-based classifiers, namely Feed-Forward and Input Convex neural networks, which overcome the challenges faced by existing methods. Furthermore, we provide additional insight into our models by incorporating an interpretability module, namely LIME. Our training and test datasets comprise applicants' data with a wide range of variables and information. Our models achieve higher accuracy than the best-performing traditional machine learning-based approach by a considerable margin of 3.03%. Additionally, we show the sensitivity of different features and their relative impact on the overall admission decision using the LIME technique.
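The abstract names the two model families (a feed-forward network and an input convex neural network in the sense of Amos et al.) and the LIME interpretability module, but not their exact configuration. The sketch below is a minimal, hypothetical illustration in PyTorch of how such a pipeline could look: a small feed-forward admit/reject classifier, an input-convex variant whose hidden-to-hidden weights are clamped non-negative, and a LIME tabular explanation of a single prediction. The feature count, layer sizes, feature names, and synthetic data are assumptions made for illustration, not the paper's actual setup.

# Minimal sketch (not the authors' code): feed-forward and input-convex
# classifiers for a binary admit/reject decision on tabular applicant
# features, with a LIME explanation of one prediction.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from lime.lime_tabular import LimeTabularExplainer

N_FEATURES = 20  # assumed number of numeric applicant features

class FeedForwardClassifier(nn.Module):
    def __init__(self, n_features: int = N_FEATURES, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # logits for reject / admit
        )

    def forward(self, x):
        return self.net(x)

class InputConvexClassifier(nn.Module):
    """Input-convex network in the spirit of Amos et al. (2017): the weights
    acting on the hidden state are kept non-negative and the activations are
    convex and non-decreasing, so the output score is convex in the input."""
    def __init__(self, n_features: int = N_FEATURES, hidden: int = 64):
        super().__init__()
        self.in0 = nn.Linear(n_features, hidden)          # W^(y)_0 x + b_0
        self.in1 = nn.Linear(n_features, hidden)          # skip connection from x
        self.z1 = nn.Linear(hidden, hidden, bias=False)   # W^(z)_1 >= 0
        self.in2 = nn.Linear(n_features, 1)
        self.z2 = nn.Linear(hidden, 1, bias=False)        # W^(z)_2 >= 0

    def forward(self, x):
        z = F.relu(self.in0(x))
        z = F.relu(self.z1(z) + self.in1(x))
        return self.z2(z) + self.in2(x)                   # scalar convex score

    def clamp_weights(self):
        # Enforce the non-negativity constraint after each optimizer step.
        for layer in (self.z1, self.z2):
            layer.weight.data.clamp_(min=0.0)

# Tiny synthetic example just to make the sketch runnable (placeholder data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, N_FEATURES)).astype(np.float32)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(np.int64)        # stand-in labels

model = FeedForwardClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = F.cross_entropy(model(torch.from_numpy(X)), torch.from_numpy(y))
    loss.backward()
    opt.step()

# LIME tabular explanation of a single applicant's prediction.
def predict_proba(a: np.ndarray) -> np.ndarray:
    with torch.no_grad():
        logits = model(torch.from_numpy(a.astype(np.float32)))
        return torch.softmax(logits, dim=1).numpy()

feature_names = [f"feature_{i}" for i in range(N_FEATURES)]  # placeholder names
explainer = LimeTabularExplainer(X, feature_names=feature_names,
                                 class_names=["reject", "admit"],
                                 mode="classification")
explanation = explainer.explain_instance(X[0], predict_proba, num_features=5)
print(explanation.as_list())  # top features pushing the decision either way

For the input-convex variant, calling clamp_weights() after each optimizer step is one simple way to maintain the non-negativity constraint that keeps the network convex in its inputs; reparameterizing those weights through a softplus would be an alternative.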
