
Functional-Edged Network Modeling (2404.00218v2)

Published 30 Mar 2024 in stat.ML and cs.LG

Abstract: This work contrasts with existing studies, which all consider nodes as functions and use edges to represent the relationships between different functions. We target network modeling whose edges are functional data and transform the adjacency matrix into a functional adjacency tensor, introducing an additional dimension dedicated to function representation. We apply a Tucker functional decomposition to the functional adjacency tensor and, to further capture the community structure among nodes, regularize the basis matrices to be symmetric. Furthermore, to deal with irregular observations of the functional edges, we cast model inference as a tensor completion problem, which is optimized by a Riemannian conjugate gradient descent method. We also derive several theorems establishing the desirable properties of the functional-edged network model. Finally, we evaluate the efficacy of the proposed model using simulation data and real metro system data from Hong Kong and Singapore.
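
The sketch below is a minimal illustration of the ingredients named in the abstract, not the authors' implementation. It assumes functional edges sampled on a common grid of T points, stacked into a functional adjacency tensor A of shape (n, n, T), and fits a Tucker-style model A ≈ G ×₁ U ×₂ U ×₃ B with one shared node factor U for both node modes (a stand-in for the symmetric-basis regularization), a functional factor B, and a binary mask for irregularly observed entries (the tensor-completion objective). The paper optimizes with a Riemannian conjugate gradient method; plain gradient descent is used here only to keep the example short, and all ranks, step sizes, and variable names are illustrative assumptions.

```python
# Minimal sketch: masked Tucker-style factorization of a functional adjacency
# tensor with a shared (symmetric-use) node factor. Not the paper's method;
# gradient descent stands in for Riemannian conjugate gradient.
import numpy as np


def fit_functional_adjacency(A, mask, rank_node=4, rank_fun=6,
                             lr=1e-3, n_iter=500, seed=0):
    """Fit A[i, j, :] ~ sum_{p,q,r} G[p,q,r] * U[i,p] * U[j,q] * B[:,r]
    on observed entries only (mask == 1)."""
    n, _, T = A.shape
    rng = np.random.default_rng(seed)
    G = 0.1 * rng.standard_normal((rank_node, rank_node, rank_fun))  # core tensor
    U = 0.1 * rng.standard_normal((n, rank_node))                    # shared node basis
    B = 0.1 * rng.standard_normal((T, rank_fun))                     # functional basis

    for _ in range(n_iter):
        A_hat = np.einsum('pqr,ip,jq,tr->ijt', G, U, U, B)
        R = mask * (A_hat - A)          # residual restricted to observed entries
        # Gradients of the squared reconstruction error w.r.t. each factor.
        dG = 2 * np.einsum('ijt,ip,jq,tr->pqr', R, U, U, B)
        dU = 2 * (np.einsum('ijt,pqr,jq,tr->ip', R, G, U, B)
                  + np.einsum('ijt,pqr,ip,tr->jq', R, G, U, B))
        dB = 2 * np.einsum('ijt,pqr,ip,jq->tr', R, G, U, U)
        G -= lr * dG
        U -= lr * dU
        B -= lr * dB
    return G, U, B


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, T = 10, 24                                         # 10 nodes, 24 samples per edge
    A = rng.standard_normal((n, n, T))                    # synthetic functional edges
    mask = (rng.random((n, n, T)) < 0.7).astype(float)    # ~30% of edge values missing
    G, U, B = fit_functional_adjacency(A, mask)
    A_hat = np.einsum('pqr,ip,jq,tr->ijt', G, U, U, B)
    err = np.linalg.norm(mask * (A - A_hat)) / np.linalg.norm(mask * A)
    print(f"relative error on observed entries: {err:.3f}")
```

In practice B would be built from a smooth functional basis (e.g., B-splines) rather than learned freely, and the shared U is only a crude proxy for the symmetry regularization described in the abstract.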
