Validity of convex combinations of update rules from different majorizers
Establish whether, in the majorization–minimization (MM) framework for Sparse Bayesian Learning, a convex combination of update rules derived from different convex majorizers is itself a valid update rule, i.e., one that guarantees monotone descent of the objective, beyond the specific sufficient conditions provided in the paper.
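The setup can be illustrated on a toy one-dimensional problem (our own construction, not the paper's SBL setting): two different quadratic majorizers of a smooth function each yield an MM update rule, and a convex combination of the two updates can be checked for descent numerically. In this special quadratic-majorizer case the combined step is provably still a descent step, because it reduces to a gradient step with an admissible step size; the open question concerns general majorizers, where no such reduction is available.

```python
import math

# Toy example: minimize f(x) = cos(x), which is L-smooth with L = 1
# (|f''(x)| <= 1 everywhere).  Any L_i >= 1 gives a valid quadratic
# majorizer at a point x0:
#   g_i(x; x0) = f(x0) + f'(x0)(x - x0) + (L_i / 2)(x - x0)^2,
# and minimizing g_i yields the MM update u_i(x0) = x0 - f'(x0) / L_i.

def f(x):
    return math.cos(x)

def fprime(x):
    return -math.sin(x)

def mm_update(x0, L):
    """MM update obtained from the quadratic majorizer with curvature L."""
    return x0 - fprime(x0) / L

x0 = 2.0
L1, L2 = 1.0, 2.0          # both valid curvatures, since |f''| <= 1
alpha = 0.5                # convex-combination weight

u1 = mm_update(x0, L1)     # update from the first majorizer
u2 = mm_update(x0, L2)     # update from the second majorizer
u_comb = alpha * u1 + (1 - alpha) * u2   # convex combination of the updates

# Each u_i descends by the standard MM argument.  Here u_comb descends
# too, since it is again a gradient step with effective step size
# alpha/L1 + (1 - alpha)/L2 <= 1/L.  For general (non-quadratic)
# majorizers the combined update need not minimize any single majorizer,
# which is precisely what makes the general question open.
print(f(x0), f(u1), f(u2), f(u_comb))
```

Note that the descent of `u_comb` here follows only because both updates are gradient steps on the same smooth objective; the code is a sketch of the question's setup, not evidence either way for the general case.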
References
However, the validity of the convex combination of update rules corresponding to different majorizers is still open.
— Sparse Bayesian Learning Algorithms Revisited: From Learning Majorizers to Structured Algorithmic Learning using Neural Networks
(2604.02513 - Balaji et al., 2 Apr 2026) in Section 4.1, Convex Combination of Update Rules (immediately before Theorem 2)