Selective inference for multiple pairs of clusters after K-means clustering (2405.16379v1)

Published 25 May 2024 in stat.ME

Abstract: If the same data is used for both clustering and for testing a null hypothesis that is formulated in terms of the estimated clusters, then the traditional hypothesis testing framework often fails to control the Type I error. Gao et al. [2022] and Chen and Witten [2023] provide selective inference frameworks for testing if a pair of estimated clusters indeed stem from underlying differences, for the case where hierarchical clustering and K-means clustering, respectively, are used to define the clusters. In applications, however, it is often of interest to test for multiple pairs of clusters. In our work, we extend the pairwise test of Chen and Witten [2023] to a test for multiple pairs of clusters, where the cluster assignments are produced by K-means clustering. We further develop an analogous test for the setting where the variance is unknown, building on the work of Yun and Barber [2023] that extends Gao et al. [2022]'s pairwise test to the case of unknown variance. For both known and unknown variance settings, we present methods that address certain forms of data-dependence in the choice of pairs of clusters to test for. We show that our proposed tests control the Type I error, both theoretically and empirically, and provide a numerical study of their empirical powers under various settings.
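The opening sentence of the abstract describes the double-dipping problem that motivates the paper: clusters estimated from the data define the null hypothesis that is then tested on the same data. The sketch below is a minimal, hypothetical Python simulation of that failure mode, not the paper's selective test; the sample size, dimension, number of clusters, and the choice to compare the first coordinate of clusters 0 and 1 are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): a naive test after K-means
# on pure-noise data, showing the Type I error inflation the abstract describes.
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, p, k = 100, 2, 3          # sample size, dimension, number of clusters (assumed)
n_reps, alpha = 200, 0.05    # simulation replicates and nominal level

rejections = 0
for _ in range(n_reps):
    X = rng.normal(size=(n, p))                 # one Gaussian: all true means are equal
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(X)
    a, b = X[labels == 0], X[labels == 1]       # a data-dependent pair of estimated clusters
    # Naive Welch t-test of equal means along the first coordinate,
    # ignoring that the clusters were estimated from the same data.
    res = stats.ttest_ind(a[:, 0], b[:, 0], equal_var=False)
    rejections += (res.pvalue < alpha)

print(f"Empirical Type I error of the naive test: {rejections / n_reps:.2f}")
# Typically far above the nominal 0.05 level, often close to 1.0.
```

The paper's contribution, as stated in the abstract, is a selective test that accounts for the fact that the K-means clusters (and, in some settings, the tested pairs themselves) were chosen from the data, extending the pairwise test of Chen and Witten [2023] to multiple pairs and to the unknown-variance case, so that Type I error is controlled in settings like the one simulated above.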

References (16)
  1. Selective inference after convex clustering with ℓ1 penalization. arXiv preprint arXiv:2309.01492, 2023.
  2. Yoav Benjamini. Selective inference: The silent killer of replicability. Harvard Data Science Review, 2020.
  3. Center for High Throughput Computing. Center for High Throughput Computing, 2006. URL https://chtc.cs.wisc.edu/.
  4. Testing for a difference in means of a single feature after clustering. arXiv preprint arXiv:2311.16375, 2023.
  5. Selective inference for k-means clustering. Journal of Machine Learning Research, 24(152), 2023.
  6. Optimal inference after model selection. arXiv preprint arXiv:1410.2597, 2014.
  7. Selective inference for hierarchical clustering. Journal of the American Statistical Association, pages 1–11, 2022.
  8. Post-clustering inference under dependency. arXiv preprint arXiv:2310.11822, 2023.
  9. Post-clustering difference testing: valid inference and practical considerations. arXiv preprint arXiv:2210.13172, 2022.
  10. palmerpenguins: Palmer Archipelago (Antarctica) penguin data. R package version 0.1.0, 2020.
  11. An approximation to the F distribution using the chi-square distribution. Computational Statistics & Data Analysis, 40(1):21–26, 2002.
  12. Stuart Lloyd. Least squares quantization in PCM. IEEE Transactions on Information Theory, 28(2):129–137, 1982.
  13. Inference after latent variable estimation for single-cell RNA sequencing data. Biostatistics, 25(1):270–287, 2024.
  14. Inferring independent sets of Gaussian variables after thresholding correlations. Journal of the American Statistical Association, (just-accepted):1–20, 2024.
  15. Selective inference for latent block models. Electronic Journal of Statistics, 15(1):3137–3183, 2021.
  16. Selective inference for clustering with unknown variance. Electronic Journal of Statistics, 17(2):1923–1946, 2023.
