Optimal test loss at the interpolation threshold for grid-extended KANs

Prove that for Kolmogorov–Arnold Networks trained via grid extension on supervised function-fitting tasks, the test loss is minimized near the interpolation threshold where the number of trainable parameters equals the number of training samples.

Background

The authors observe U-shaped test-loss behavior when progressively fine-graining spline grids and hypothesize that the minimum occurs at the interpolation threshold determined by matching parameter count to sample count.

They provide a concrete example estimating the threshold from the relationship between the number of grid intervals and the total parameter count, motivating a general claim that remains to be proved.
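The threshold condition can be sketched numerically. This is a minimal illustration, not the authors' code: it assumes the KAN paper's approximate parameter count, in which a network with layer widths `[n_0, ..., n_L]`, grid size `G`, and spline order `k` has on the order of `sum_l n_l * n_{l+1} * (G + k)` trainable spline coefficients; the helper names `kan_param_count` and `threshold_grid_size` are hypothetical.

```python
def kan_param_count(widths, G, k=3):
    """Approximate trainable-parameter count of a KAN.

    Assumes each edge activation carries (G + k) spline coefficients,
    as in the KAN paper's O(N^2 L (G + k)) scaling estimate.
    """
    return sum(n_in * n_out * (G + k)
               for n_in, n_out in zip(widths, widths[1:]))


def threshold_grid_size(widths, n_samples, k=3):
    """Smallest grid size G whose parameter count reaches n_samples,
    i.e. the conjectured interpolation threshold."""
    G = 1
    while kan_param_count(widths, G, k) < n_samples:
        G += 1
    return G


# Example: a [2, 5, 1] KAN fit to 1000 training samples.
# The conjecture predicts the test-loss minimum near this grid size.
G_star = threshold_grid_size([2, 5, 1], 1000)
print(G_star)
```

Under these assumptions, grid extension past `G_star` pushes the model into the over-parameterized regime, which is where the observed U-shaped test loss would begin to rise.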

References

We conjecture that the optimal test loss is achieved at the interpolation threshold, when the number of parameters matches the number of data points.

KAN: Kolmogorov-Arnold Networks (2404.19756 - Liu et al., 30 Apr 2024) in Subsection 2.4, For accuracy: Grid Extension (Toy example: staircase-like loss curves)