
Rectified Gaussian kernel multi-view k-means clustering (2405.05619v3)

Published 9 May 2024 in cs.LG and cs.CV

Abstract: In this paper, we present two new variants of multi-view k-means (MVKM) algorithms for multi-view data. The general idea is to measure the distance between the $h$-th view data points $x_i^h$ and the $h$-th view cluster centers $a_k^h$ in a manner different from the usual centroid-based approach. Unlike other methods, our proposed methods learn the multi-view data by computing similarity with the Euclidean norm in the space of a Gaussian kernel, termed multi-view k-means with exponent distance (MVKM-ED). By simultaneously tuning the stabilizer parameter $p$ and the kernel coefficients $\beta^h$, the compression of the Gaussian-kernel-based weighted distance in the Euclidean norm reduces the sensitivity of MVKM-ED. The resulting method is designated the Gaussian-kernel multi-view k-means (GKMVKM) clustering algorithm. Numerical evaluation on five real-world multi-view datasets demonstrates the robustness and efficiency of our proposed MVKM-ED and GKMVKM approaches.
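The core idea in the abstract — replacing the plain Euclidean distance in multi-view k-means with a Gaussian-kernel-mapped distance $1 - \exp(-\beta^h \lVert x_i^h - a_k^h \rVert^2)$, aggregated across views with weights raised to a stabilizer exponent $p$ — can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the update rules for the view weights, the heuristic choice of $\beta^h$, and the function names (`gaussian_kernel_distance`, `mvkm_ed`) are assumptions for the sketch.

```python
import numpy as np

def gaussian_kernel_distance(X, centers, beta):
    # d(x, a) = 1 - exp(-beta * ||x - a||^2): Euclidean distance passed
    # through a Gaussian kernel, which bounds each distance in [0, 1).
    sq = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return 1.0 - np.exp(-beta * sq)

def mvkm_ed(views, k, p=2.0, n_iter=50, seed=0):
    """Sketch of multi-view k-means with exponent distance (MVKM-ED).

    `views` is a list of (n, d_h) arrays, one per view. The fixed equal
    view weights and the heuristic kernel coefficients beta_h below are
    illustrative assumptions; the paper derives these quantities jointly.
    """
    rng = np.random.default_rng(seed)
    n = views[0].shape[0]
    labels = rng.integers(0, k, size=n)
    # Heuristic per-view kernel coefficient: inverse mean squared norm.
    betas = [1.0 / max((V ** 2).sum(axis=1).mean(), 1e-12) for V in views]
    weights = np.full(len(views), 1.0 / len(views))
    for _ in range(n_iter):
        # Per-view cluster centers from the current assignments
        # (re-seeding any empty cluster with a random point).
        centers = [np.stack([V[labels == c].mean(axis=0) if (labels == c).any()
                             else V[rng.integers(0, n)] for c in range(k)])
                   for V in views]
        # Aggregate kernel distances across views, weighting each view
        # by its weight raised to the stabilizer exponent p.
        D = sum(w ** p * gaussian_kernel_distance(V, C, b)
                for w, V, C, b in zip(weights, views, centers, betas))
        new_labels = D.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

Because the kernel mapping is monotone in the Euclidean distance, each iteration behaves like a Lloyd-style assignment step, but the bounded $[0, 1)$ distances damp the influence of outlying points — the sensitivity reduction the abstract attributes to the Gaussian-kernel compression.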

Authors (1)
