KAN advantages on Feynman datasets

Determine whether the Feynman_no_units datasets predominantly exhibit smooth or monotonic variable dependencies that limit the performance gains of Kolmogorov–Arnold Networks (KANs) over Multi-Layer Perceptrons (MLPs), by characterizing dataset complexity (e.g., oscillation and compositional structure) and empirically testing its impact on model comparisons.

Background

The authors observe comparable performance between KANs and MLPs on the Feynman datasets and hypothesize that the underlying relationships are too simple (smooth or monotonic) for KANs to demonstrate a strong advantage.

Validating or refuting this conjecture requires a principled complexity analysis of the datasets and controlled experiments linking function properties to model performance.
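One ingredient of such a complexity analysis could be a per-variable oscillation measure. The sketch below is a minimal illustration, not part of the cited paper: `oscillation_score` is a hypothetical heuristic that counts sign changes in the first finite difference of a one-variable slice of a target function, so a monotonic dependence scores 0 while an oscillatory one scores higher.

```python
import numpy as np

def dependence_profile(f, arg_index, grid, fixed_args):
    """Evaluate f along one variable, holding the other arguments fixed.
    (Hypothetical helper for probing one variable dependence at a time.)"""
    ys = []
    for x in grid:
        args = list(fixed_args)
        args[arg_index] = x
        ys.append(f(*args))
    return np.array(ys)

def oscillation_score(values):
    """Count sign changes in the first finite difference:
    0 for a monotonic profile, larger for oscillatory behavior."""
    d = np.diff(values)
    signs = np.sign(d[d != 0])  # drop flat segments before comparing signs
    return int(np.sum(signs[1:] != signs[:-1]))

# A smooth/monotonic Feynman-style dependence vs. an oscillatory one
grid = np.linspace(0.1, 10.0, 200)
smooth = oscillation_score(dependence_profile(lambda x: x**2, 0, grid, [0.0]))
wavy = oscillation_score(dependence_profile(np.sin, 0, grid, [0.0]))
print(smooth, wavy)  # 0 sign changes vs. several
```

Aggregating such scores over the Feynman_no_units formulas would give one concrete way to test whether low oscillation correlates with the absence of a KAN advantage.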

References

We conjecture that the Feynman datasets are too simple to let KANs make further improvements, in the sense that variable dependence is usually smooth or monotonic, which is in contrast to the complexity of special functions which often demonstrate oscillatory behavior.

KAN: Kolmogorov–Arnold Networks (arXiv:2404.19756, Liu et al., 30 Apr 2024), Subsection 3.3, Feynman datasets