Convex relaxation for the generalized maximum-entropy sampling problem (2404.01390v2)

Published 1 Apr 2024 in math.ST, math.OC, and stat.TH

Abstract: The generalized maximum-entropy sampling problem (GMESP) is to select an order-$s$ principal submatrix from an order-$n$ covariance matrix so as to maximize the product of its $t$ greatest eigenvalues, $0<t\leq s<n$. Introduced more than 25 years ago, GMESP is a natural generalization of two fundamental problems in statistical design theory: (i) the maximum-entropy sampling problem (MESP) and (ii) binary D-optimality (D-Opt). In the general case, it can be motivated by a selection problem in the context of principal component analysis (PCA). We introduce the first convex-optimization-based relaxation for GMESP, study its behavior, compare it to an earlier spectral bound, and demonstrate its use in a branch-and-bound scheme. We find that such an approach is practical when $s-t$ is very small.
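
For intuition about the objective, the sketch below evaluates GMESP by brute force on a toy covariance matrix: for each order-$s$ index subset it takes the $t$ greatest eigenvalues of the corresponding principal submatrix and maximizes the sum of their logarithms (equivalently, the product). The random matrix, the chosen sizes $s$ and $t$, and the helper names are illustrative assumptions; this is not the paper's method, which uses a convex relaxation inside branch-and-bound rather than enumeration.

    # Minimal brute-force sketch of the GMESP objective (illustrative only).
    from itertools import combinations
    import numpy as np

    def gmesp_objective(C, S, t):
        """Log of the product of the t greatest eigenvalues of C[S, S]."""
        sub = C[np.ix_(S, S)]
        eigvals = np.linalg.eigvalsh(sub)           # ascending order
        return float(np.sum(np.log(eigvals[-t:])))  # sum of logs of t largest

    def gmesp_brute_force(C, s, t):
        """Enumerate all order-s principal submatrices (viable only for tiny n)."""
        n = C.shape[0]
        best_val, best_S = -np.inf, None
        for S in combinations(range(n), s):
            val = gmesp_objective(C, S, t)
            if val > best_val:
                best_val, best_S = val, S
        return best_S, best_val

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((6, 8))
        C = A @ A.T + 0.1 * np.eye(6)   # made-up positive-definite covariance
        S, val = gmesp_brute_force(C, s=4, t=2)
        print(f"best subset {S}, log-objective {val:.4f}")
        # Note: with t == s the objective reduces to the log-determinant of
        # C[S, S], i.e., the classical MESP case mentioned in the abstract.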
