Learning threshold functions of sparse polynomials using only queries

Determine whether threshold functions of sparse polynomials over {0,1}^n, i.e., functions of the form f(x) = sign(∑_{i=1}^d a_i χ_{S_i}(x) + a) with at most d nonzero monomial terms, can be learned efficiently using only query access to f, with both time and query complexity polynomial.
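To make the function class concrete, here is a minimal sketch of evaluating such a threshold function, under the common assumption that χ_S denotes the parity character χ_S(x) = (-1)^{∑_{i∈S} x_i}; the helper names are hypothetical and not from the source.

```python
import itertools


def chi(S, x):
    """Parity character chi_S(x) = (-1)^{sum of x_i over i in S} (assumed convention)."""
    return -1 if sum(x[i] for i in S) % 2 else 1


def threshold_of_sparse_poly(terms, a0):
    """Build f(x) = sign(sum_i a_i * chi_{S_i}(x) + a0).

    terms: list of (coefficient a_i, index set S_i); sparsity d = len(terms).
    Ties (sum == 0) are broken toward +1 here, an arbitrary illustrative choice.
    """
    def f(x):
        total = a0 + sum(a * chi(S, x) for a, S in terms)
        return 1 if total >= 0 else -1
    return f


# Example: f = sign(2*chi_{0,1}(x) - chi_{2}(x) + 0.5) on {0,1}^3,
# a threshold of a polynomial with d = 2 nonzero terms.
f = threshold_of_sparse_poly([(2.0, {0, 1}), (-1.0, {2})], 0.5)
for x in itertools.product([0, 1], repeat=3):
    print(x, f(x))
```

Query access to f in the problem statement corresponds to being allowed to evaluate `f(x)` at inputs x of the learner's choosing, without seeing the coefficients or index sets.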

Background

The authors connect learning low logit rank models from conditional samples alone to classical query learning of Boolean functions. They argue that a positive resolution of the conditional-sampling learning problem would imply efficient query learning of threshold functions of sparse polynomials.

They note that this implication targets a longstanding challenge in learning theory, and explicitly state that this appears to be an open question in the literature.

References

Thus, a positive answer to \cref{prob:conditional-sampling} (with exponentially small approximation error) would allow us to efficiently learn threshold functions of sparse polynomials using only queries, which appears to be an open question in the literature.

Provably Learning from Modern Language Models via Low Logit Rank (2512.09892 - Golowich et al., 10 Dec 2025) in Conclusions and Future Directions, Learning from conditional samples paragraph