Gradient-Weighted, Data-Driven Normalization for Approximate Border Bases -- Concept and Computation (2506.09529v1)

Published 11 Jun 2025 in cs.SC and math.AC

Abstract: This paper studies the concept and the computation of approximately vanishing ideals of a finite set of data points. By data points, we mean that the points contain some uncertainty, which is a key motivation for the approximate treatment. A careful review of the existing border basis concept for an exact treatment motivates a new adaptation of the border basis concept for an approximate treatment. In the study of approximately vanishing polynomials, the normalization of polynomials plays a vital role. So far, the most common normalization in computational commutative algebra uses the coefficient norm of a polynomial. Inspired by recent developments in machine learning, the present paper proposes and studies the use of gradient-weighted normalization. The gradient-weighted semi-norm evaluates the gradient of a polynomial at the data points. This data-driven nature of gradient-weighted normalization produces, on the one hand, better stability against perturbation and, on the other hand, very significantly, invariance of border bases with respect to scaling the data points. Neither property is achieved with coefficient normalization. In particular, we present an example of the lack of scaling invariance with respect to coefficient normalization, which can cause an approximate border basis computation to fail. This is extremely relevant because scaling of the point set is often recommended for preprocessing the data. Further, we use an existing algorithm with coefficient normalization to show that it is easily adapted to gradient-weighted normalization. The analysis of the adapted algorithm only requires tiny changes, and the time complexity remains the same. Finally, we present numerical experiments on three affine varieties to demonstrate the superior stability of our data-driven normalization over coefficient normalization. We obtain robustness to perturbations and invariance to scaling.
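
The contrast between coefficient-norm and gradient-weighted normalization described above can be made concrete with a small numerical sketch. The Python snippet below is illustrative only and makes two assumptions not spelled out in the abstract: that the gradient-weighted semi-norm is the root-mean-square of the polynomial's gradient over the data points, and that a circle with slightly perturbed sample points is a fair stand-in for the paper's affine-variety experiments. It shows how the two normalizations respond when the point set is rescaled, the preprocessing step the abstract flags as problematic under coefficient normalization.

```python
# Minimal, self-contained sketch (not the paper's algorithm). It assumes the
# gradient-weighted semi-norm of a polynomial p on points X is the
# root-mean-square of ||grad p(x_i)|| over x_i in X; the paper gives the
# precise definition. It contrasts this with the coefficient norm when the
# point set is rescaled.
import numpy as np
import sympy as sp

x, y = sp.symbols("x y")


def coefficient_norm(poly):
    """Euclidean norm of the coefficient vector of poly in the variables x, y."""
    coeffs = sp.Poly(poly, x, y).coeffs()
    return float(sp.sqrt(sum(c**2 for c in coeffs)))


def gradient_weighted_seminorm(poly, points):
    """Assumed form: root-mean-square of the gradient's 2-norm over the points."""
    dpx = sp.lambdify((x, y), sp.diff(poly, x), "numpy")
    dpy = sp.lambdify((x, y), sp.diff(poly, y), "numpy")
    gx = dpx(points[:, 0], points[:, 1])
    gy = dpy(points[:, 0], points[:, 1])
    return float(np.sqrt(np.mean(gx**2 + gy**2)))


# Data: points near the unit circle with a small perturbation (the "uncertainty").
rng = np.random.default_rng(0)
theta = np.linspace(0.0, 2.0 * np.pi, 50, endpoint=False)
pts = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.01 * rng.standard_normal((50, 2))

p = x**2 + y**2 - 1  # approximately vanishes on pts

for scale in (1.0, 100.0):
    scaled_pts = scale * pts
    q = p.subs({x: x / scale, y: y / scale})  # same relation, expressed for the scaled points
    q_num = sp.lambdify((x, y), q, "numpy")
    residual = float(np.mean(np.abs(q_num(scaled_pts[:, 0], scaled_pts[:, 1]))))
    print(
        f"scale={scale:6.1f}  "
        f"residual/coefficient_norm={residual / coefficient_norm(q):.3e}  "
        f"residual/gradient_seminorm={residual / gradient_weighted_seminorm(q, scaled_pts):.3e}"
    )
# The gradient-weighted ratio stays on the order of the points' geometric distance
# to the (scaled) circle at both scales, while the coefficient-norm ratio does not
# track the data, which is the kind of scale dependence the abstract describes.
```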