Vector Subdivision Schemes for Arbitrary Matrix Masks (2209.08136v1)

Published 16 Sep 2022 in math.NA and cs.NA

Abstract: Employing a matrix mask, a vector subdivision scheme is a fast iterative averaging algorithm for computing refinable vector functions in wavelet methods for numerical PDEs and for producing smooth curves in CAGD. In sharp contrast to the well-studied scalar subdivision schemes, vector subdivision schemes are far less well understood; for example, Lagrange and (generalized) Hermite subdivision schemes are the only vector subdivision schemes studied in the literature. Because many wavelets used in numerical PDEs are derived from refinable vector functions whose matrix masks do not come from Hermite subdivision schemes, it is necessary to introduce and study vector subdivision schemes for general matrix masks in order to compute wavelets and refinable vector functions efficiently. For a general matrix mask, we show that there is only one meaningful way of defining a vector subdivision scheme. Motivated by vector cascade algorithms and recent studies of Hermite subdivision schemes, we define a vector subdivision scheme for an arbitrary matrix mask and prove that convergence of the newly defined vector subdivision scheme is equivalent to convergence of its associated vector cascade algorithm. We also study convergence rates of vector subdivision schemes. The results of this paper not only bridge gaps and establish intrinsic links between vector subdivision schemes and vector cascade algorithms, but also strengthen and generalize currently known results on Lagrange and (generalized) Hermite subdivision schemes. Several examples illustrate the results on various types of vector subdivision schemes, together with their convergence rates.
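Since the abstract describes vector subdivision as an iterative averaging algorithm driven by a matrix mask, a minimal sketch of one subdivision step may help fix the idea. This is not the paper's definition for general matrix masks (the paper shows that defining the scheme for arbitrary masks requires care); it assumes the common dyadic convention (S_a v)(n) = 2 Σ_k a(n − 2k) v(k), where each a(k) is an r×r matrix and each data value v(k) is a vector in R^r. The function name subdivision_step and the variable names mask and data are illustrative, not from the paper.

```python
import numpy as np

def subdivision_step(mask, data):
    """One step of a dyadic vector subdivision operator.

    mask : dict mapping integer k -> (r x r) numpy array a(k)
    data : dict mapping integer n -> length-r numpy vector v(n)

    Returns the refined sequence (S_a v)(n) = 2 * sum_k a(n - 2k) v(k),
    supported on roughly twice as many integers as the input.
    """
    r = next(iter(mask.values())).shape[0]
    ks = sorted(mask)   # support of the mask
    ns = sorted(data)   # support of the data
    out = {}
    for n in range(2 * ns[0] + ks[0], 2 * ns[-1] + ks[-1] + 1):
        acc = np.zeros(r)
        for k in ns:
            j = n - 2 * k
            if j in mask:
                acc += mask[j] @ data[k]
        out[n] = 2.0 * acc
    return out

# Scalar special case (r = 1): the hat-function B-spline mask
# a(-1) = a(1) = 1/4, a(0) = 1/2, applied repeatedly to a delta sequence;
# the iterates approach samples of the refinable (hat) function.
hat_mask = {-1: np.array([[0.25]]), 0: np.array([[0.5]]), 1: np.array([[0.25]])}
v = {0: np.array([1.0])}
for _ in range(3):
    v = subdivision_step(hat_mask, v)
```

In this convention, the limit of repeatedly applying the operator to suitable initial data yields the refinable (vector) function, which is what ties the subdivision scheme to the vector cascade algorithm discussed in the abstract.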

Citations (2)
