
Wasserstein Nonnegative Tensor Factorization with Manifold Regularization (2401.01842v1)

Published 3 Jan 2024 in cs.LG

Abstract: Nonnegative tensor factorization (NTF) has become an important tool for feature extraction and part-based representation that preserves the intrinsic structure of nonnegative high-order data. However, the original NTF methods use the Euclidean distance or Kullback-Leibler divergence as the loss function, which treats every feature equally and thus neglects the side information carried by features. To exploit the correlation information of features and the manifold information of samples, we introduce Wasserstein manifold nonnegative tensor factorization (WMNTF), which minimizes the Wasserstein distance between the distribution of the input tensorial data and the distribution of its reconstruction. Although Wasserstein distances have been studied for nonnegative matrix factorization (NMF), those methods ignore the spatial structure information of higher-order data. We use the Wasserstein distance (a.k.a. the Earth Mover's distance or optimal transport distance) as the metric and add a graph regularizer to a latent factor. Experimental results demonstrate the effectiveness of the proposed method compared with other NMF and NTF methods.
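The abstract names two ingredients: an optimal-transport loss between the input tensor and its reconstruction, and a graph (manifold) regularizer on a latent factor. The sketch below is a minimal illustration under standard formulations, not the paper's algorithm: it uses Cuturi-style Sinkhorn iterations for an entropically smoothed Wasserstein cost, a rank-R CP reconstruction, and the GNMF-style penalty tr(V^T L V). The function names, the choice of the second factor V as the regularized one, and the toy affinity graph are all assumptions made for illustration.

import numpy as np

def sinkhorn(a, b, C, reg=0.05, n_iter=200):
    """Entropically smoothed OT cost between histograms a and b
    with ground cost matrix C (Cuturi-style Sinkhorn iterations)."""
    K = np.exp(-C / reg)                # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)               # match column marginal b
        u = a / (K @ v)                 # match row marginal a
    P = u[:, None] * K * v[None, :]     # transport plan
    return np.sum(P * C)

def cp_reconstruct(U, V, W):
    """Rank-R CP reconstruction of a 3-way tensor from its factors."""
    return np.einsum('ir,jr,kr->ijk', U, V, W)

def graph_penalty(V, L):
    """Manifold term tr(V^T L V) for a graph Laplacian L (GNMF-style)."""
    return np.trace(V.T @ L @ V)

# Toy usage: a random nonnegative 3-way tensor and random CP factors.
rng = np.random.default_rng(0)
I, J, K_, R = 6, 5, 4, 3
X = rng.random((I, J, K_))
U, V, W = (rng.random((n, R)) for n in (I, J, K_))
Xhat = cp_reconstruct(U, V, W)

# Ground cost: Euclidean distances on the I x J pixel grid, normalized.
coords = np.indices((I, J)).reshape(2, -1).T
C = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
C /= C.max()

# Wasserstein data term: each mode-3 slice compared as a distribution.
loss = sum(sinkhorn(X[..., k].ravel() / X[..., k].sum(),
                    Xhat[..., k].ravel() / Xhat[..., k].sum(), C)
           for k in range(K_))

# Fully connected toy affinity graph over the J samples encoded by V;
# regularizing V rather than another factor is an assumption here.
A = np.ones((J, J)) - np.eye(J)
L = np.diag(A.sum(axis=1)) - A          # unnormalized graph Laplacian
objective = loss + 0.1 * graph_penalty(V, L)
print(f"OT data term {loss:.3f}, regularized objective {objective:.3f}")

A full WMNTF solver would additionally derive update rules for U, V, and W under this combined objective (e.g., Sinkhorn-based gradients with a multiplicative step for the Laplacian term); the paper's exact updates are not reproduced here.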

