
Limited-Angle Reconstruction Kernel (LARK)

Updated 12 October 2025
  • LARK is a kernel-based framework that stabilizes limited-angle tomography by constructing specialized inversion kernels for incomplete angular data.
  • It employs both model-driven and RKHS-based approaches to integrate spectral filtering, sparsity priors, and edge-preserving regularization for enhanced image recovery.
  • The method improves reconstruction fidelity by reducing artifacts through dimension reduction, rigorous error analysis, and adaptive denoising techniques.

Limited-Angle Reconstruction Kernel (LARK) denotes a class of mathematical constructs and algorithmic methods that adapt the inversion of tomographic data to incomplete angular sampling, with the goal of overcoming the severe ill-posedness characteristic of limited-angle tomography. LARK formalizes reconstruction as the application of kernel operators, either in representer-theoretic (e.g., function interpolation, RKHS-based) or model-driven (e.g., singular value decomposition, curvelet/sparse domain) frameworks. The concept encompasses both analytical and computationally adaptive kernels, and is associated with robust, edge-preserving, or information-theoretically justified recovery in a variety of settings, whether in the spatial domain or a transform domain.

1. Foundations and Theoretical Motivation

The impetus for LARK arises from the limited-angle tomography problem, in which projection data are available only over a subset of directions (e.g., $[-\Phi, \Phi]$ with $\Phi < \pi/2$), commonly due to physical constraints or dose limitations. Mathematically, the inverse problem is modeled as
$$y^\delta = R_\Phi f + \eta,$$
where $R_\Phi$ is the limited-angle Radon transform, $f$ is the object to be estimated, and $\eta$ denotes noise. In this context, classical analytic inversion (e.g., filtered backprojection, FBP) and standard iterative methods (e.g., ART, SART) become highly unstable, as whole classes of features, particularly certain directional singularities, lie in the null space of $R_\Phi$ (Frikel, 2011, Sironi, 2011).
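The forward model above can be made concrete with a toy discretization. The sketch below is an illustrative stand-in, not the discretization used in the cited papers: it assembles a dense matrix for a nearest-bin parallel-beam Radon transform restricted to the angular range $[-\Phi, \Phi]$, then simulates noisy data $y^\delta = R_\Phi f + \eta$ for a simple square phantom.

```python
import numpy as np

def limited_angle_radon_matrix(n, angles):
    """Dense system matrix A for a toy parallel-beam Radon transform on an
    n x n image, restricted to the given angles.  Each pixel center is
    projected onto a detector of n bins (nearest-bin binning); one matrix
    row per (angle, detector-bin) pair."""
    xs = np.linspace(-1, 1, n)            # pixel centers on [-1, 1]^2
    X, Y = np.meshgrid(xs, xs)
    blocks = []
    for theta in angles:
        # signed distance of each pixel center along the detector axis
        t = X * np.cos(theta) + Y * np.sin(theta)
        bins = np.clip(((t + 1) / 2 * n).astype(int), 0, n - 1)
        A_theta = np.zeros((n, n * n))
        A_theta[bins.ravel(), np.arange(n * n)] = 2.0 / n   # pixel width
        blocks.append(A_theta)
    return np.vstack(blocks)

# limited angular range [-Phi, Phi] with Phi < pi/2
n, Phi = 16, np.pi / 4
angles = np.linspace(-Phi, Phi, 9)
A = limited_angle_radon_matrix(n, angles)

# noisy data y^delta = R_Phi f + eta for a square phantom f
f = np.zeros((n, n)); f[5:11, 5:11] = 1.0
rng = np.random.default_rng(0)
y_delta = A @ f.ravel() + 0.01 * rng.standard_normal(A.shape[0])
```

The resulting matrix makes the ill-posedness tangible: as `Phi` shrinks, the smallest singular values of `A` decay rapidly, which is exactly the instability the kernel constructions below are designed to control.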

LARK addresses this deficit by constructing specialized reconstruction kernels, often engineered by exploiting the analytical structure of $R_\Phi^* R_\Phi$ (or directly from SVD/approximate-inverse methods). These kernels either adapt to the available data by design or employ sparsity and directionality priors (e.g., curvelets, wavelets, adaptive bases) so as to preferentially reconstruct the visible information and stabilize or regularize the missing components (Frikel, 2011, Sironi, 2011, Hahn et al., 5 Oct 2025).

2. Kernel Constructions: Variants and Mathematical Frameworks

LARK constructions broadly fall into two categories:

(a) Model-Driven Approximate Inverse Kernels:

This approach reframes the inversion as computing, for each reconstruction site $x$, the application of a linear functional on the measurement data using a precomputed kernel $\psi_x^\gamma$:
$$S^\gamma g(x) = \langle g, \psi_x^\gamma \rangle, \qquad R_\Phi^* \psi_x^\gamma = e_x^\gamma,$$
where $e_x^\gamma$ is a mollifier centered at $x$. The kernel $\psi_x^\gamma$ is generally represented through the SVD of $R_\Phi$:
$$\psi_x^{\gamma, \tau, n} = \sum_{m=0}^{n} \sum_{\ell=0}^{m} \frac{F_\tau(\sigma_{m\ell})}{\sigma_{m\ell}} \langle e_x^\gamma, v_{m\ell} \rangle\, u_{m\ell},$$
with $F_\tau$ a spectral filter, $\sigma_{m\ell}$ the singular values, and $u_{m\ell}, v_{m\ell}$ the singular functions (Hahn et al., 5 Oct 2025). This structure allows explicit incorporation of the kernel characterization of the limited-angle operator, identification of the invisible spectrum, and direct regularization of the ill-posed inverse.
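A minimal numerical sketch of such a filtered SVD inversion follows. A small random matrix stands in for the discrete limited-angle operator; the filter has the Tikhonov-type form $F_\tau(\sigma) = \sigma^2/(\sigma^2 + \tau)$ discussed in Section 4, and everything else (sizes, noise level, $\tau$) is illustrative.

```python
import numpy as np

def filtered_svd_reconstruct(A, y, tau):
    """Approximate-inverse reconstruction via a filtered SVD of a discrete
    forward operator A.  The filter F_tau(sigma) = sigma^2/(sigma^2 + tau)
    attenuates (rather than truncates) contributions from small singular
    values, so f = sum_k F(s_k)/s_k <y, u_k> v_k."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    F = s**2 / (s**2 + tau)
    coeffs = (F / s) * (U.T @ y)       # filtered pseudoinverse coefficients
    return Vt.T @ coeffs

# toy stand-in for the discrete limited-angle forward operator
rng = np.random.default_rng(1)
A = rng.standard_normal((80, 60))
f_true = rng.standard_normal(60)
y = A @ f_true + 0.05 * rng.standard_normal(80)

f_rec = filtered_svd_reconstruct(A, y, tau=1e-2)
```

In practice the kernel bank $\psi_x^\gamma$ would be precomputed once per acquisition geometry; the per-site reconstruction then reduces to inner products with the measured data.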

(b) Representer and RKHS-Driven Adaptive Kernels:

Alternatively, the reconstruction is formulated through kernel expansions
$$f(x) \approx \sum_{j=1}^{N} c_j K(x, y_j),$$
where $K$ is a positive-definite kernel, $y_j$ a set of centers, and $c_j$ coefficients estimated by matching the kernel's Radon transform to the data. In LARK, $K$ is engineered (e.g., via angular weighting or a tailored spectrum) to be robust under missing projection data (Sironi, 2011). Regularization is typically imposed via Tikhonov-penalized least squares:
$$\min_{c} \|\mathbf{A}c - d\|^2 + \lambda \|\mathbf{L}c\|^2,$$
with $\mathbf{A}$ the matrix of projected kernels.
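Under these definitions the penalized least-squares problem has a closed-form solution via the normal equations. The sketch below uses a Gaussian kernel and a random stand-in for the matrix $\mathbf{A}$ of projected kernels (computing the actual Radon transforms of the basis functions is omitted; all sizes are hypothetical).

```python
import numpy as np

def gaussian_kernel(X, Y, eps=5.0):
    """Positive-definite Gaussian kernel K(x, y), evaluated on rows of X, Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

rng = np.random.default_rng(2)
N, M = 40, 100                       # kernel centers y_j, measurements
centers = rng.uniform(-1, 1, (N, 2))

# A[i, j] plays the role of the Radon transform of the j-th kernel
# basis function along the i-th ray (here a random stand-in)
A = rng.standard_normal((M, N))
c_true = rng.standard_normal(N)
d = A @ c_true + 0.05 * rng.standard_normal(M)

# Tikhonov-penalized least squares: min_c ||A c - d||^2 + lam ||L c||^2
lam = 1e-3
L = np.eye(N)                        # identity penalty for simplicity
c = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ d)

# evaluate f(x) = sum_j c_j K(x, y_j) at a few query points
X_query = rng.uniform(-1, 1, (5, 2))
f_vals = gaussian_kernel(X_query, centers) @ c
```

Choosing $\mathbf{L}$ as a derivative operator instead of the identity penalizes roughness of the coefficient field rather than its magnitude; both fit the same normal-equation solve.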

A key aspect is that the Radon transform of the kernel basis function must remain stable as the available angular range shrinks—a technical challenge requiring careful control of kernel singularities and possibly auxiliary optimization for angular adaptation.

3. Sparse and Directional Methods: Curvelets and Visibility

A core advancement in LARK methodology involves embedding the reconstruction in a frame that inherently encodes geometric priors, most notably via curvelets. Representing the object as

$$f = \sum_n c_n \psi_n,$$

where $\psi_n$ are curvelets and $c_n$ are the sought coefficients, the inversion becomes an $\ell_1$-penalized Tikhonov minimization in the curvelet domain (Frikel, 2011):
$$\min_{c \in \mathbb{R}^N} \; \frac{1}{2}\|Kc - y^\delta\|^2 + \sum_n w_n |c_n|,$$
with $K = R_\Phi T^*$ (the composition of the limited-angle Radon transform and curvelet synthesis).

A main theoretical result is the explicit identification of "visible" and "invisible" curvelet coefficients:

  • The set of indices for which the curvelet's frequency support does not intersect the observable wedge ($|\omega| > \Phi$) corresponds to the null space of $R_\Phi$: $\mathcal{I}_\Phi^{\mathrm{invisible}} = \{(j, \ell, k) : \operatorname{supp} \psi_{j,\ell,k} \cap W_\Phi = \emptyset\}$
  • Reconstructions via LARK using such sparse regularization necessarily set $c_{j,\ell,k} = 0$ for all $(j,\ell,k) \in \mathcal{I}_\Phi^{\mathrm{invisible}}$.

Thus, the reconstruction kernel is effectively "dimension-reduced" by restricting to the visible frame elements, yielding a problem of significantly lower effective dimension, improved conditioning, and faster solvers (the so-called A-CSR formulation) (Frikel, 2011).
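The dimension reduction and the $\ell_1$-penalized solve can be sketched with plain ISTA (iterative soft thresholding). The operator, the visibility index set, and all sizes below are hypothetical stand-ins; in the actual A-CSR formulation the visible index set comes from the wedge condition above, not from an arbitrary cutoff.

```python
import numpy as np

def soft_threshold(z, w):
    """Proximal operator of w * |.| (componentwise soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - w, 0.0)

def ista_l1(K, y, w, n_iter=300):
    """ISTA for min_c 1/2 ||K c - y||^2 + w * sum_n |c_n|."""
    step = 1.0 / np.linalg.norm(K, 2) ** 2   # 1 / Lipschitz constant
    c = np.zeros(K.shape[1])
    for _ in range(n_iter):
        c = soft_threshold(c - step * (K.T @ (K @ c - y)), step * w)
    return c

rng = np.random.default_rng(3)
K_full = rng.standard_normal((60, 100))
# hypothetical index set of "invisible" frame elements: drop their
# columns up front (A-CSR-style dimension reduction)
visible = np.arange(100) < 70
K_vis = K_full[:, visible]

c_true = np.zeros(100); c_true[:5] = 2.0     # sparse, visible support
y = K_full @ c_true + 0.01 * rng.standard_normal(60)

c_vis = ista_l1(K_vis, y, w=0.05)
c = np.zeros(100); c[visible] = c_vis        # invisible coefficients stay 0
```

Restricting the solve to the visible columns both shrinks the problem and improves its conditioning, since the near-null-space directions are excluded by construction.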

4. Regularization and Stabilization: Noise, Artifacts, and CLARK

LARK implementations are acutely sensitive to the exponential decay of the singular values of $R_\Phi$, the root of severe instability and of the amplification of noise and missing-frequency artifacts. The broad strategy combines spectral filtering (attenuation rather than truncation) in the SVD/singular-function expansion with explicit regularization in both the data and object domains:

  • Spectral filter $F_\tau(\sigma) = \frac{\sigma^2}{\sigma^2 + \tau}$ applied to the singular-value expansion
  • Smoothing mollifier $e_x^\gamma$ to restrict high-frequency amplification
  • Constraints or penalties like total variation on the reconstructed object or functional outputs
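A few lines suffice to illustrate why attenuation is preferred over truncation: the effective inverse gain $F_\tau(\sigma)/\sigma = \sigma/(\sigma^2 + \tau)$ is bounded by $1/(2\sqrt{\tau})$ for every $\sigma$, whereas the unfiltered gain $1/\sigma$ blows up as the singular values decay. (The sample values of $\sigma$ and $\tau$ below are illustrative.)

```python
import numpy as np

def F_tau(sigma, tau):
    """Tikhonov-type spectral filter: attenuates, rather than truncates,
    contributions from small singular values."""
    return sigma**2 / (sigma**2 + tau)

tau = 1e-4
sigmas = np.array([1.0, 1e-2, 1e-4])
gains = F_tau(sigmas, tau) / sigmas      # effective inverse gain F/sigma
# the unfiltered gains 1/sigma would be [1, 100, 10000]; the filtered
# gains never exceed 1/(2*sqrt(tau)) = 50
```

This bounded gain is what keeps the kernel bank numerically stable even when the invisible part of the spectrum forces many singular values toward zero.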

The CLARK scheme augments LARK by introducing an edge-preserving denoiser or a penalty functional in the data space,

$$D_\lambda(g^\delta) = \arg\min_{g} \frac{1}{2}\|g - g^\delta\|^2 + \lambda P(S^\gamma(g)),$$

so that the final reconstruction $f^{\gamma, \lambda} = S^\gamma(D_\lambda(g^\delta))$ combines kernel inversion with variational denoising (Hahn et al., 5 Oct 2025). This dual regularization helps suppress both the classical streak artifacts and the "wave-type" artifacts induced by the smallest singular functions in the severely limited-angle case.
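As a simplified sketch of this two-step scheme, one can replace the edge-preserving functional $P$ with a quadratic smoothness penalty, an assumption made here so that $D_\lambda$ has a closed form; the actual CLARK penalty is edge-preserving and nonlinear. All matrices below are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 50, 40
S = rng.standard_normal((n, m)) * 0.1     # stand-in for kernel inversion S^gamma
D = (np.eye(n) - np.eye(n, k=1))[:-1]     # forward differences on the object
g_delta = rng.standard_normal(m)          # noisy limited-angle data

# with the quadratic stand-in P(f) = ||D f||^2, the denoising step
#   D_lambda(g^delta) = argmin_g 1/2 ||g - g_delta||^2 + lam ||D S g||^2
# has first-order condition (I + 2 lam (D S)^T (D S)) g = g_delta
lam = 0.5
DS = D @ S
g = np.linalg.solve(np.eye(m) + 2 * lam * DS.T @ DS, g_delta)

f_rec = S @ g                             # f^{gamma,lambda} = S^gamma(D_lambda(g^delta))
```

The key structural point survives the simplification: the denoising acts in the data space, but its penalty is measured on the reconstruction $S^\gamma(g)$, coupling the two domains.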

5. Computational Implementation and Error Analysis

Practical LARK deployment involves:

  • Discrete representation of the forward problem, possibly using interpolating kernels or radial basis expansions for the image domain
  • Evaluation of discrete Radon transforms and their adjoints
  • Computation of the SVD (or truncated SVD) of the discrete limited-angle forward operator; construction of discrete reconstruction kernel matrices
  • Application of the SVD-based spectral filter and assembly of the full kernel bank for all reconstruction positions

Empirical error estimates, supported analytically, quantify reconstruction accuracy as a function of the grid density $h$, the norm of the kernel operator, and the regularity of the target (see formula (6) in Hahn et al., 5 Oct 2025). As the grid is refined and with appropriately controlled operator norm, the error converges to zero for ideal, noiseless, full-angle data, and remains bounded (though dependent on the stability of the limited-angle kernel) in the practical setting.

Quantitative validation on synthetic (e.g., Shepp–Logan) and real data sets (e.g., Helsinki Tomography Challenge) has demonstrated that—with proper regularization—LARK and especially CLARK can successfully suppress standard limited-angle artifacts, fill in partially missing features, and recover geometric detail notably better than FBP or conventional TV-based approaches as the angular range narrows.

6. Comparison with Classical and Data-Driven Approaches

LARK stands apart from:

  • FBP and TV methods, which, when naively applied in the limited-angle case, are prone to streak artifacts, edge blurring, and loss of directional detail (Frikel, 2011, Hahn et al., 5 Oct 2025).
  • Iterative algebraic schemes, which often require heuristic regularization and cannot exploit explicit characterizations of the invisible spectrum.
  • Learning-based approaches, which typically rely on large datasets and do not provide analytical control of the forward/inverse operator structure (though LARK can act as a principled initial guess or architectural component for such schemes).

By contrast, LARK provides an explicit analytical underpinning (singular characterization, edge directionality, kernel selectivity), direct dimension reduction, and a modular pathway for hybrid schemes—e.g., initialization or combination with learned denoisers in modern networks.

7. Future Directions and Implications

LARK and its regularized extensions (e.g., CLARK) offer future potential as the kernel backbone for hybrid model-driven and data-driven tomographic reconstruction pipelines, especially in cases where ground-truth data are scarce and interpretability is critical. Precomputed, well-characterized kernels tailored to acquisition geometry can serve as robust priors or initialization for further refinement steps (variational or learned), or as a means to impose physics-aware constraints in modern optimization or deep learning frameworks. The explicit error analysis and flexibility in regularization choices enhance their adaptability to a wide array of inverse problems in both clinical and industrial tomography (Frikel, 2011, Hahn et al., 5 Oct 2025).

In sum, LARK encapsulates a rigorous, kernel-based formalism for the analytic and computational stabilization of limited-angle CT reconstruction, bridging the gap between classical inversion theory and the algorithmic advances necessitated by severe angular data incompleteness.
