
Early Churn Prediction from Large Scale User-Product Interaction Time Series (2309.14390v1)

Published 25 Sep 2023 in cs.LG and cs.AI

Abstract: User churn, characterized by customers ending their relationship with a business, has profound economic consequences across various Business-to-Customer scenarios. For numerous system-to-user actions, such as promotional discounts and retention campaigns, predicting potential churners stands as a primary objective. In volatile sectors like fantasy sports, unpredictable factors such as international sports events can influence even regular spending habits. Consequently, while transaction history and user-product interaction are valuable in predicting churn, they demand deep domain knowledge and intricate feature engineering. Additionally, feature development for churn prediction systems can be resource-intensive, particularly in production settings serving 200M+ users, where inference pipelines largely focus on feature engineering. This paper conducts an exhaustive study on predicting user churn using historical data. We aim to create a model forecasting customer churn likelihood, facilitating businesses in comprehending attrition trends and formulating effective retention plans. Our approach treats churn prediction as multivariate time series classification, demonstrating that combining user activity and deep neural networks yields remarkable results for churn prediction in complex business-to-customer contexts.
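The abstract's framing of churn prediction as multivariate time series classification can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the feature names, the 28-day window, and the "no activity in the final week" churn rule are all assumptions chosen for clarity. Each user becomes a days-by-features activity matrix, split into an observation window (the classifier input) and a label derived from the following horizon.

```python
import numpy as np

def build_example(activity, horizon=7):
    """Split a user's daily activity matrix (days x features) into an
    observation window and a churn label for the following horizon.
    Label rule (an assumption for this sketch): churned = zero activity
    across all features for every day in the horizon."""
    obs, future = activity[:-horizon], activity[-horizon:]
    label = int(future.sum() == 0)
    return obs, label

rng = np.random.default_rng(0)
# 28 days of 3 hypothetical activity features (e.g. sessions, entries, deposits)
active_user = rng.poisson(2.0, size=(28, 3)).astype(float)
churned_user = active_user.copy()
churned_user[-7:] = 0.0  # silent final week -> churn label

x1, y1 = build_example(active_user)   # y1 = 0 (still active)
x2, y2 = build_example(churned_user)  # y2 = 1 (churned)
print(x1.shape)  # (21, 3): the multivariate window fed to the classifier
```

The resulting `(21, 3)` windows are what a time-series classifier (in the paper, a deep neural network over raw activity) would consume directly, sidestepping the hand-crafted feature engineering the abstract describes as costly at production scale.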

