Solving All Regression Models For Learning Gaussian Networks Using Givens Rotations (1901.07643v2)

Published 22 Jan 2019 in cs.NA, cs.LG, and math.NA

Abstract: Score-based learning (SBL) is a promising approach for learning Bayesian networks. The initial step in the majority of SBL algorithms consists of computing the scores of all possible child and parent-set combinations for the variables. For Bayesian networks with continuous variables, a particular score is usually calculated as a function of the regression of the child on the variables in the parent-set. The sheer number of regression models to be solved necessitates the design of efficient numerical algorithms. In this paper, we propose an algorithm for the efficient and exact calculation of the regressions for all child and parent-set combinations. The proposed algorithm uses QR decompositions (QRDs) to capture the dependencies between the regressions for different families, and Givens rotations to traverse the space of QRDs efficiently so that all the regression models are covered along the shortest possible path. We compare the complexity of the suggested method with that of other algorithms, mainly those arising in all-subsets regression problems, and show that our algorithm has the smallest algorithmic complexity. We also explain how to parallelize the proposed method so as to reduce the runtime by a factor proportional to the number of processors used.
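The sketch below illustrates the two numerical ideas the abstract relies on, in a minimal and simplified form: a single QRD of the augmented matrix [X | y] yields the residual sums of squares for every "prefix" parent set under the current column ordering, and a cheap Givens-rotation update after swapping two adjacent columns exposes a different chain of prefix subsets without recomputing the QRD from scratch. This is not the paper's algorithm or traversal order; the function names, data sizes, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b]^T = [r, 0]^T."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

def rss_for_nested_subsets(R, p):
    """RSS of regressing y on the first k columns of X, for k = 0..p,
    given the R factor of the QRD of the augmented matrix [X | y]."""
    # The residual sums of squares are cumulative sums (from the bottom up)
    # of the squared entries of the last column of R.
    tail = R[: p + 1, p] ** 2
    return np.cumsum(tail[::-1])[::-1]  # index k -> RSS for the first k columns

def swap_adjacent_columns(R, j):
    """Swap columns j and j+1 of X inside the R factor of [X | y] and
    restore upper-triangular form with one Givens rotation on rows j, j+1."""
    R = R.copy()
    R[:, [j, j + 1]] = R[:, [j + 1, j]]
    c, s = givens(R[j, j], R[j + 1, j])
    G = np.array([[c, s], [-s, c]])
    R[[j, j + 1], j:] = G @ R[[j, j + 1], j:]
    return R

# Tiny demonstration with synthetic data (sizes and coefficients are made up).
rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.0, 0.5]) + 0.1 * rng.normal(size=n)

# One QRD of [X | y] scores every prefix subset {x1}, {x1,x2}, ... at once.
R = np.linalg.qr(np.column_stack([X, y]), mode="r")
print(rss_for_nested_subsets(R, p))

# A single Givens update reorders the columns, making a different chain of
# prefix subsets available at O(p) cost instead of a full refactorization.
R2 = swap_adjacent_columns(R, 1)
print(rss_for_nested_subsets(R2, p))
```

The paper's contribution, as described in the abstract, is to organize such updates so that the sequence of column permutations visits the QRDs needed for all child and parent-set combinations along the shortest possible path; the snippet only shows why a single reordering step is cheap.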

Citations (1)