Truncated Least Squares (TLS)
- Truncated Least Squares (TLS) is a robust estimation framework that selectively ignores large residuals to mitigate errors and improve accuracy.
- Variants like TTLS and per-residual truncation reduce sensitivity to outliers and ill-conditioning in regression and point cloud registration.
- Advanced algorithms—including randomized and quantum-inspired methods—enable efficient truncated SVD and low-rank solutions for large-scale problems.
Truncated Least Squares (TLS) is an umbrella term encompassing several robust and regularized variants of the classical least squares and total least squares frameworks, whose principal aim is to improve estimation accuracy or robustness by selectively ignoring, truncating, or thresholding the influence of large residuals or small singular value modes. These approaches address high sensitivity to ill-conditioning, outliers, or errors-in-variables, and arise in numerous data-fitting, signal processing, and geometric registration contexts. The two most prominent lines within the "truncated least squares" category are truncated total least squares (TTLS)—which applies hard singular value truncation to the augmented TLS system—and robust TLS estimators that employ per-residual truncation (e.g., via the min function), often leading to combinatorial or non-smooth optimization problems.
1. Classical and Truncated Total Least Squares Formulation
Consider a linear regression system $Ax \approx b$, where both $A \in \mathbb{R}^{m\times n}$ and $b \in \mathbb{R}^{m}$ are subject to errors. The total least squares (TLS) approach jointly estimates corrections $\Delta A$ and $\Delta b$ to $A$ and $b$ to minimize the Frobenius norm of the combined perturbation $[\Delta A \;\; \Delta b]$, subject to $(A + \Delta A)x = b + \Delta b$. This is equivalent to seeking the minimal perturbation such that the augmented matrix $[A \;\; b]$ drops rank. The standard TLS solution (under a genericity condition on singular values) is given by
$$x_{\mathrm{TLS}} = -\frac{1}{v_{n+1}}\,(v_1, \dots, v_n)^{T},$$
where $v = (v_1, \dots, v_{n+1})^{T}$ is the right singular vector of $[A \;\; b]$ corresponding to its smallest singular value (Xie et al., 2014, Zuo et al., 2022, Meng et al., 2020).
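As a concrete illustration, the closed-form SVD solution can be sketched in a few lines of NumPy (`tls_solve` and the synthetic data below are illustrative, not taken from the cited papers):

```python
import numpy as np

def tls_solve(A, b):
    """Classical total least squares via the SVD of the augmented matrix [A, b].

    Returns x minimizing ||[dA, db]||_F subject to (A + dA) x = b + db,
    assuming the genericity condition holds (last component of the trailing
    right singular vector is nonzero)."""
    C = np.column_stack([A, b])       # augmented matrix [A, b]
    _, _, Vt = np.linalg.svd(C)       # rows of Vt are right singular vectors
    v = Vt[-1]                        # vector for the smallest singular value
    if abs(v[-1]) < 1e-12:
        raise ValueError("TLS solution does not exist (genericity fails)")
    return -v[:-1] / v[-1]

# Example: noisy system where both A and b carry errors
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
A_noisy = A + 0.01 * rng.normal(size=A.shape)
b_noisy = A @ x_true + 0.01 * rng.normal(size=100)
x_tls = tls_solve(A_noisy, b_noisy)
```

For small, zero-mean errors in both $A$ and $b$, the estimate recovers the true parameters closely, whereas ordinary least squares would be biased by the errors in $A$.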
Classical TLS is highly sensitive in discrete ill-posed problems, where the smallest singular values of $[A \;\; b]$ accumulate near zero. To regularize, Truncated TLS (TTLS) restricts the solution to the leading $k$ singular directions. The TTLS estimate is obtained from the SVD
$$[A \;\; b] = U \Sigma V^{T}, \qquad V = \begin{bmatrix} V_{11} & V_{12} \\ V_{21} & V_{22} \end{bmatrix},$$
where the trailing block $[V_{12}^{T} \;\; V_{22}^{T}]^{T}$ collects the last $n+1-k$ right singular vectors, with $V_{12} \in \mathbb{R}^{n \times (n+1-k)}$ holding their first $n$ components and $V_{22} \in \mathbb{R}^{1 \times (n+1-k)}$ their final components. The TTLS solution is then
$$x_{\mathrm{TTLS}} = -\,V_{12} V_{22}^{T} / \|V_{22}\|_2^2$$
(Xie et al., 2014, Zuo et al., 2022).
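A minimal NumPy sketch of the standard TTLS formula (the helper name `ttls_solve` and the toy data are illustrative assumptions, not the cited papers' implementations):

```python
import numpy as np

def ttls_solve(A, b, k):
    """Truncated TLS: keep the leading k singular directions of [A, b]
    and build the solution from the trailing n+1-k right singular vectors."""
    m, n = A.shape
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C)
    V = Vt.T
    V12 = V[:n, k:]                 # first n rows of the trailing block
    V22 = V[n, k:]                  # final components of the trailing block
    denom = np.dot(V22, V22)
    if denom < 1e-14:
        raise ValueError("TTLS solution undefined at this truncation level")
    return -V12 @ V22 / denom

# With k = n the formula reduces to the classical TLS solution
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 4))
b = A @ np.ones(4) + 0.01 * rng.normal(size=50)
x_ttls = ttls_solve(A, b, k=4)
```

Choosing $k < n$ discards the directions associated with small singular values, trading a small bias for a large reduction in variance on ill-posed problems.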
2. Robust Truncated Least Squares in Point Cloud Registration
A distinct usage of truncated least squares arises in geometric matching and robust estimation, notably in point cloud registration under high outlier rates (Ivanov et al., 21 Aug 2025). Here, the per-point squared residual is truncated: the cost contribution of correspondence $i$ is $\rho(r_i) = \min(r_i^2, \bar{c}^2)$, where $r_i = \|R p_i + t - q_i\|$ for rotation $R$ and translation $t$, and $\bar{c}$ is a user-defined inlier threshold. This function penalizes only inlier residuals quadratically—outlier residuals saturate at $\bar{c}^2$ and do not further influence the cost—yielding extreme outlier robustness. The resulting optimization is combinatorial, as the inlier set is data-dependent and unknown.
Efficient globally optimal solvers employ a combination of convex relaxation (e.g., weighted least squares underestimation), branch-and-bound with pruning (interval contraction based on inlier likelihood), and streamlined active-set routines for nonlinear parameterizations such as rotations. These methods can certify global optimality and solve large-scale registration tasks with high percentages of outliers (Ivanov et al., 21 Aug 2025).
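To make the truncated cost concrete, the following sketch evaluates it for a planar rotation and recovers the angle by brute-force grid search—a deliberately simple stand-in for the branch-and-bound machinery of the cited work, with all names and the synthetic data being illustrative assumptions:

```python
import numpy as np

def truncated_cost(theta, P, Q, c2):
    """Sum of truncated squared residuals min(r_i^2, c2) for a planar
    rotation by theta applied to source points P against targets Q."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    r2 = np.sum((P @ R.T - Q) ** 2, axis=1)
    return np.sum(np.minimum(r2, c2))

def grid_search_rotation(P, Q, c2, n_grid=3600):
    """Brute-force stand-in for a certified solver: scan candidate angles
    and keep the one with the lowest truncated cost."""
    thetas = np.linspace(-np.pi, np.pi, n_grid, endpoint=False)
    costs = [truncated_cost(t, P, Q, c2) for t in thetas]
    return thetas[int(np.argmin(costs))]

# Synthetic example: 70% outliers, true rotation of 0.5 rad
rng = np.random.default_rng(0)
P = rng.normal(size=(200, 2))
theta_true = 0.5
R = np.array([[np.cos(theta_true), -np.sin(theta_true)],
              [np.sin(theta_true),  np.cos(theta_true)]])
Q = P @ R.T + 0.01 * rng.normal(size=P.shape)
Q[:140] = rng.uniform(-3, 3, size=(140, 2))   # replace 70% with outliers
theta_hat = grid_search_rotation(P, Q, c2=0.05 ** 2)
```

Because outlier residuals saturate, the cost landscape has a pronounced minimum at the true rotation even with 70% corrupted correspondences; the certified solvers achieve the same robustness without exhaustive search.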
3. Truncated SVD and Low-Rank TTLS Algorithms
In large-scale and ill-posed TLS settings, the truncated SVD approach is foundational to TTLS regularization. Instead of directly computing the full SVD of $[A \;\; b]$, randomized algorithms generate low-dimensional subspaces that approximate the dominant singular vectors with high probability and low computational cost. For example, given $[A \;\; b] \in \mathbb{R}^{m \times (n+1)}$ and target rank $k$, a typical algorithm (Xie et al., 2014, Zuo et al., 2022) proceeds as follows:
- Generate a Gaussian random test matrix $\Omega$ (or use $\ell_2$-norm sampling) to rapidly construct a sketch $Y = [A \;\; b]\,\Omega$, whose range approximates the dominant left singular subspace.
- Orthonormalize $Y$ to obtain $Q$, then form the small matrix $B = Q^{T} [A \;\; b]$ and compute its truncated SVD.
- Lift the singular vectors back to $\mathbb{R}^{n+1}$, partition as before, and compute the TTLS solution in the lower-dimensional subspace.
These techniques reduce the computation from $O(mn^2)$ for a full SVD to roughly $O(mnk)$, with accuracy controlled by the singular value gap and the oversampling parameter (Xie et al., 2014, Zuo et al., 2022). A quantum-inspired variant (QiTTLS) achieves similar accuracy but leverages advanced sampling data structures for even faster sketch construction (Zuo et al., 2022).
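The sketch-then-solve recipe can be condensed into a short NumPy routine. This is a minimal illustration under the assumptions stated in the comments—the cited algorithms add refinements (power iterations, $\ell_2$-norm sampling structures) omitted here:

```python
import numpy as np

def randomized_ttls(A, b, k, oversample=10, seed=0):
    """Randomized TTLS sketch: approximate the dominant subspace of
    C = [A, b] with a Gaussian test matrix, then apply the TTLS formula
    to the right singular vectors of the small projected matrix."""
    m, n = A.shape
    C = np.column_stack([A, b])
    rng = np.random.default_rng(seed)
    Omega = rng.normal(size=(n + 1, k + oversample))  # Gaussian test matrix
    Y = C @ Omega                     # sketch; range(Y) ~ top left subspace
    Q, _ = np.linalg.qr(Y)            # orthonormal basis for the sketch
    B = Q.T @ C                       # small (k+p) x (n+1) projected matrix
    _, _, Vt = np.linalg.svd(B, full_matrices=True)
    V = Vt.T                          # approximate right singular vectors of C
    V12, V22 = V[:n, k:], V[n, k:]    # trailing-block partition, as in TTLS
    return -V12 @ V22 / np.dot(V22, V22)

# Small sanity-check problem (the sketch captures the range exactly here)
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 4))
b = A @ np.arange(1.0, 5.0) + 0.01 * rng.normal(size=100)
x_hat = randomized_ttls(A, b, k=4)
```

Note that the TTLS formula depends only on the subspace spanned by the trailing right singular vectors, which is why working with the sketched matrix $B$ suffices when the sketch captures the leading subspace accurately.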
4. Perturbation Analysis and Conditioning of TTLS
The sensitivity of TTLS solutions to perturbations in $A$ and $b$ is captured by condition numbers. Explicit normwise, mixed, and componentwise condition number formulas have been derived. The mapping $(A, b) \mapsto x_{\mathrm{TTLS}}$ is linearized, and its leading-order derivative is represented by a Kronecker-product based operator $M$. The absolute normwise condition number is then $\|M\|_2$, whereas mixed and componentwise measures provide much sharper, scale-aware bounds, especially significant when data or solution variables are poorly scaled or sparse (Meng et al., 2020).
For structured problems (e.g., perturbations restricted to a subspace), structure-preserving condition numbers can be computed; these are always less than or equal to their unstructured counterparts. Small-sample statistical condition estimation (SCE) algorithms provide practical and efficient empirical estimates of all such condition numbers by probing the gradient in random directions, using only the precomputed SVD of $[A \;\; b]$ (Meng et al., 2020).
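The idea behind small-sample probing can be illustrated with finite differences: perturb the data in a few random unit directions and record the largest observed sensitivity of the solution map. This is a simplified sketch (the cited SCE algorithms use exact gradients via the precomputed SVD rather than finite differences; all names here are illustrative):

```python
import numpy as np

def tls_solve(A, b):
    """Plain TLS solver used as the map to probe."""
    V = np.linalg.svd(np.column_stack([A, b]))[2].T
    v = V[:, -1]
    return -v[:-1] / v[-1]

def sce_estimate(solve, A, b, n_samples=5, h=1e-6, seed=0):
    """Small-sample condition estimation (finite-difference sketch): probe
    the solution map in random unit directions and return the largest
    observed directional sensitivity ||x(data + h*d) - x(data)|| / h."""
    x0 = solve(A, b)
    rng = np.random.default_rng(seed)
    est = 0.0
    for _ in range(n_samples):
        dA = rng.normal(size=A.shape)
        db = rng.normal(size=b.shape)
        scale = np.sqrt(np.sum(dA ** 2) + np.sum(db ** 2))
        dA, db = dA / scale, db / scale      # unit-norm joint direction
        x1 = solve(A + h * dA, b + h * db)
        est = max(est, np.linalg.norm(x1 - x0) / h)
    return est

rng = np.random.default_rng(2)
A = rng.normal(size=(60, 3))
b = A @ np.array([1.0, 2.0, 3.0]) + 0.01 * rng.normal(size=60)
cond_est = sce_estimate(tls_solve, A, b)
```

A handful of probes typically suffices for an order-of-magnitude estimate, which is the practical appeal of SCE over computing exact condition number formulas.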
5. Reduced-Rank Analysis, Limitations, and Special Cases
In ordinary least squares (LS), truncating the SVD reduces solution variance at the cost of a controlled increase in bias, leading to a favorable distortion-variance tradeoff that can be exploited using data-dependent ordering of singular directions. In the total least squares case, a naïve application of such truncated projections fails: the optimal truncation level depends on the unknown parameter vector's norm, preventing any adaptive, data-driven scheme from guaranteeing mean square error (MSE) reduction over the full-rank estimator for arbitrary data (Nagananda et al., 2019).
Reduced-rank TLS estimators only arise in special scenarios—for example, when the parameter is known a priori to be norm-bounded, or if side information such as sparsity or prior distributions is available. In these specialized settings, truncation can be meaningfully adapted, but otherwise, the TLS structure precludes effective purely data-driven rank selection (Nagananda et al., 2019).
6. Structured and Convex Relaxation Approaches
Structured TLS (STLS) generalizes the standard model to include heterogeneous error variances; block, Toeplitz, or Hankel error constraints; and robust error norms. These structured formulations are nonconvex and do not admit closed-form solutions. Convex relaxation methods, central among them nuclear norm minimization and its reweighted (log-det) variants, provide tractable surrogates for the rank constraint. In standard (unstructured) TLS, nuclear norm relaxation incurs a dimension-dependent factor increase in approximation error, but reweighted relaxations can nearly match the exact TLS SVD solution (Malioutov et al., 2014).
The algorithms can be efficiently implemented using augmented Lagrangian or ADMM schemes. Experimentally, reweighted relaxations outperform both nonconvex optimizers and unweighted convex surrogates, and are highly effective in challenging, application-specific structured TLS problems, robust PCA, and low-rank matrix completion (Malioutov et al., 2014).
7. Practical and Algorithmic Considerations
- TTLS effectiveness depends crucially on the truncation index $k$, typically chosen based on the singular value gap or cross-validation/regularization heuristics (Xie et al., 2014, Zuo et al., 2022).
- For large-scale systems, randomized and quantum-inspired algorithms provide substantial speedups and accurate solutions, as confirmed by benchmark tests on ill-posed problems and applications such as frequency estimation and tomography (Xie et al., 2014, Zuo et al., 2022, Meng et al., 2020).
- In robust point cloud registration, branch-and-bound solvers exploiting truncation achieve state-of-the-art global optimality and scalability, tolerating outlier rates exceeding 90% (Ivanov et al., 21 Aug 2025).
- Perturbation and condition number analyses, especially those leveraging componentwise or structured condition numbers, offer reliable a posteriori error bounds for practical implementations (Meng et al., 2020).
References
- (Xie et al., 2014) Perturbation Analysis and Randomized Algorithms for Large-Scale Total Least Squares Problems
- (Zuo et al., 2022) Quantum-inspired algorithm for truncated total least squares solution
- (Meng et al., 2020) Condition numbers for the truncated total least squares problem and their estimations
- (Malioutov et al., 2014) Convex Total Least Squares
- (Ivanov et al., 21 Aug 2025) Fast globally optimal Truncated Least Squares point cloud registration with fixed rotation axis
- (Nagananda et al., 2019) Reduced-rank Analysis of the Total Least Squares