
An Exploratory Study on Simulated Annealing for Feature Selection in Learning-to-Rank (2310.13269v1)

Published 20 Oct 2023 in cs.LG, cs.AI, and cs.IR

Abstract: Learning-to-rank is an applied domain of supervised machine learning. As feature selection has been found to be effective for improving the accuracy of learning models in general, it is intriguing to investigate this process for the learning-to-rank domain. In this study, we investigate the use of a popular meta-heuristic approach called simulated annealing for this task. Under the general framework of simulated annealing, we explore various neighborhood selection strategies and temperature cooling schemes. We further introduce a new hyper-parameter called the progress parameter that can effectively be used to traverse the search space. Our algorithms are evaluated on five public benchmark datasets for learning-to-rank. For better validation, we also compare the simulated annealing-based feature selection algorithm with another effective meta-heuristic algorithm, namely local beam search. Extensive experimental results show the efficacy of our proposed models.
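The general simulated annealing framework the abstract describes can be sketched as follows. This is an illustrative, minimal implementation, not the paper's actual algorithm: the single-bit-flip neighborhood, geometric cooling schedule, and toy objective function are all assumptions standing in for the neighborhood strategies, cooling schemes, progress parameter, and learning-to-rank evaluation the paper explores.

```python
import math
import random

def sa_feature_selection(n_features, evaluate, n_iters=200,
                         t_init=1.0, cooling=0.95, seed=0):
    """Simulated annealing over binary feature masks.

    evaluate: callable taking a tuple of 0/1 flags and returning a
    score to maximize (in learning-to-rank this would be a ranking
    metric of a model trained on the selected features; here it is
    a stand-in).
    """
    rng = random.Random(seed)
    current = tuple(rng.randint(0, 1) for _ in range(n_features))
    current_score = evaluate(current)
    best, best_score = current, current_score
    temp = t_init
    for _ in range(n_iters):
        # Neighbor: flip one randomly chosen feature bit (one of
        # many possible neighborhood selection strategies).
        i = rng.randrange(n_features)
        neighbor = current[:i] + (1 - current[i],) + current[i + 1:]
        score = evaluate(neighbor)
        delta = score - current_score
        # Always accept improvements; accept worse moves with
        # probability exp(delta / temp), so exploration shrinks
        # as the temperature cools.
        if delta >= 0 or rng.random() < math.exp(delta / temp):
            current, current_score = neighbor, score
            if score > best_score:
                best, best_score = neighbor, score
        temp *= cooling  # geometric cooling schedule (assumed)
    return best, best_score

# Toy objective: reward masks matching a known target subset.
target = (1, 1, 1, 0, 0, 0, 0, 0)
score_fn = lambda mask: sum(a == b for a, b in zip(mask, target))
mask, best = sa_feature_selection(8, score_fn)
```

On this toy objective the search reliably recovers the target mask; in the real setting each `evaluate` call is expensive (it trains and scores a ranker), which is why the choice of neighborhood and cooling scheme matters.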

