Secant–intersection conjecture for PGCNN Hadamard parametrization

Establish that for any finite group G={g1,…,gn}, activation degree r≥2, and number of layers L≥1, the following holds for a general choice of PGCNN filters θ=(θ1,…,θL) in the Hadamard parametrization Φ: the n-dimensional linear span of the rth powers {σr(Φθ)(g1),…,σr(Φθ)(gn)}, viewed inside the space of homogeneous polynomials of degree r·r^{L−1} in n variables, intersects the variety VP_{n,r,r^{L−1}} of rth powers of degree-r^{L−1} polynomials only in the points {σr(Φθ)(g1),…,σr(Φθ)(gn)} themselves.

Background

The paper studies polynomial group convolutional neural networks (PGCNNs) for finite groups using graded group algebras. Two parametrizations are introduced for the activation of degree r: a Kronecker-based parametrization ϕ and a Hadamard-based parametrization Φ, related by a linear map. The geometry of the associated neuromanifolds and fibers is analyzed.

For the Kronecker parametrization ϕ, the authors prove a complete description of the general fiber up to the regular group action and rescaling. For the Hadamard parametrization Φ, they aim to obtain the same fiber description but reduce the problem to a conjectural statement about the intersection of special secant spans with the variety VP_{n,k,d} of k-th powers of degree d polynomials.

Specifically, for each group element g∈G, Φθ(g) is a homogeneous polynomial of degree r^{L−1} in n variables, and σr(Φθ)(g) denotes its rth power. The set {σr(Φθ)(g): g∈G} spans an n-dimensional linear space in Sym_K(x, r·r^{L−1}). The conjecture asserts that the intersection of this specialized secant span with VP_{n,r,r^{L−1}} consists only of these n points. Proving this conjecture would imply the desired classification of the general fiber of Φ (as shown by the subsequent theorem stated conditionally on the conjecture).
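To make the statement concrete, here is an illustrative sketch (not from the paper) of the simplest case L=1: the filters produce n general linear forms playing the role of Φθ(g_1),…,Φθ(g_n), σr raises each to the rth power, and the conjecture predicts that a general element of their span is not itself an rth power. The group size n=3, the specific linear forms, and the helper name `looks_like_rth_power` are our own choices; the membership test over Q (every irreducible factor of an rth power has multiplicity divisible by r) is a heuristic stand-in for the exact variety membership.

```python
# Illustrative sketch, assuming L = 1 so that Phi_theta(g_i) are linear forms
# (degree r^{L-1} = 1) and sigma_r(Phi_theta)(g_i) are their r-th powers.
import sympy as sp

n, r = 3, 2  # hypothetical small case: |G| = 3, activation degree r = 2
x0, x1, x2 = sp.symbols("x0 x1 x2")

# stand-ins for Phi_theta(g_1), ..., Phi_theta(g_n): general linear forms
lin = [x0 + 2*x1 + 3*x2, 2*x0 + x1 + x2, x0 + x1 + 5*x2]
powers = [sp.expand(l**r) for l in lin]  # sigma_r applied pointwise

def looks_like_rth_power(p, r):
    """Heuristic test for membership in VP_{n,r,1} over Q: every irreducible
    factor of an r-th power occurs with multiplicity divisible by r."""
    _, factors = sp.factor_list(p)
    return all(exp % r == 0 for _, exp in factors)

# the n distinguished points of the span are r-th powers by construction
assert all(looks_like_rth_power(p, r) for p in powers)

# a general element of the span <powers> should NOT be an r-th power;
# this particular combination has an irreducible rank-3 quadratic form
generic = sp.expand(powers[0] + powers[1] - powers[2])
print(looks_like_rth_power(generic, r))  # prints False
```

This only probes the L=1, n=3 instance over the rationals; the conjecture concerns general filters, arbitrary L, and the full secant span inside Sym_K(x, r·r^{L−1}).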

References

Conjecture. Let G=\{g_1, \dots, g_n\} as a set, and let the activation degree be r\geq 2. For general filters \theta=(\theta_1, \dots, \theta_L), where L\geq 1, we have

\langle \sigma_r(\Phi_\theta)(g_1), \dots, \sigma_r(\Phi_\theta)(g_n) \rangle \cap VP_{n, r, r^{L-1}} = \{\sigma_r(\Phi_\theta)(g_1), \dots, \sigma_r(\Phi_\theta)(g_n)\}.

The Geometry of Polynomial Group Convolutional Neural Networks  (2603.29566 - Hendi et al., 31 Mar 2026) in Conjecture (labelled as Conjecture \ref{conj: intersection_of_general_secant_is_trivial}), Section 4.3: General fiber of Φ