
LSTSVR-PI: Least square twin support vector regression with privileged information (2312.02596v2)

Published 5 Dec 2023 in cs.LG

Abstract: In an educational setting, a teacher plays a crucial role across various classroom teaching patterns. Mirroring this aspect of human learning, the learning using privileged information (LUPI) paradigm supplies additional information to instruct learning models during the training stage. The proposed least square twin support vector regression with privileged information (LSTSVR-PI) integrates the LUPI paradigm into least square twin support vector regression (LSTSVR), providing a new way to train the twin variant of the regression model with additional sources of information. LSTSVR-PI requires only the solution of systems of linear equations, which contributes to the efficiency of the model. We further establish a generalization error bound based on the Rademacher complexity of the proposed model and incorporate the structural risk minimization principle. The proposed LSTSVR-PI thus bridges the contemporary LUPI paradigm and classical LSTSVR. To assess its performance, we conduct numerical experiments against baseline models on various artificially generated and real-world datasets. The experiments and statistical analyses indicate the superiority of the proposed model. Moreover, as an application, experiments on time series datasets further demonstrate the superiority of the proposed LSTSVR-PI.
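The abstract notes that training LSTSVR-PI reduces to solving systems of linear equations. As a rough, non-authoritative illustration of that idea, the Python sketch below implements a heavily simplified linear variant: each of the two bound functions of a least square twin SVR is obtained from a single regularized normal-equations solve, and the slack is modeled by a LUPI-style correcting function over the privileged features, in the spirit of Vapnik and Vashist's SVM+ construction. The joint objective, the block regularizer `D`, and the parameter names (`eps1`, `eps2`, `C`, `gamma`) are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def lstsvr_pi_fit(X, X_priv, y, eps1=0.1, eps2=0.1, C=10.0, gamma=1.0):
    """Fit a simplified linear LSTSVR-PI-style model (illustrative sketch only).

    Each bound function f_k(x) = w_k^T x + b_k minimizes
        (1/2)||u_k||^2 + (gamma/2)||v_k||^2 + (C/2)||t - G u_k - H v_k||^2,
    where u_k = (w_k, b_k) acts on the regular features, v_k is a LUPI-style
    correcting function on the privileged features, and t is the shifted
    target (y - eps1 for the down-bound, y + eps2 for the up-bound).
    Setting the gradient to zero yields one linear system per bound.
    """
    n = X.shape[0]
    e = np.ones((n, 1))
    G = np.hstack([X, e])            # augmented regular features [X, 1]
    H = np.hstack([X_priv, e])       # augmented privileged features [X*, 1]
    M = np.hstack([G, H])            # joint design matrix
    # Block-diagonal regularizer: unit weight on (w, b), gamma on the
    # privileged correcting term (an assumed form, not the paper's exact one).
    D = np.diag(np.r_[np.ones(G.shape[1]), gamma * np.ones(H.shape[1])])

    def solve_bound(target):
        # Normal equations: (D / C + M^T M) z = M^T target
        z = np.linalg.solve(D / C + M.T @ M, M.T @ target)
        return z[:G.shape[1]]        # keep (w, b); the privileged part is train-only

    u1 = solve_bound(y - eps1)       # eps1-insensitive down-bound function
    u2 = solve_bound(y + eps2)       # eps2-insensitive up-bound function
    return u1, u2

def lstsvr_pi_predict(X, u1, u2):
    """Average the two bound functions, as in twin SVR."""
    G = np.hstack([X, np.ones((X.shape[0], 1))])
    return 0.5 * (G @ u1 + G @ u2)

# Toy usage on synthetic data (privileged view available only at training time).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X_priv = X @ rng.normal(size=(3, 2))   # hypothetical privileged features
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
u1, u2 = lstsvr_pi_fit(X, X_priv, y)
print(lstsvr_pi_predict(X[:3], u1, u2), y[:3])
```

Note that prediction uses only the regular features and the two (w_k, b_k) pairs; the privileged block of each solution is discarded after training, matching the LUPI premise that privileged information is available only during the training stage.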

