Instrumental Variable Regression: STIV Insights
- Instrumental variable regression is a method that uses external instruments to address endogeneity in causal inference.
- The STIV estimator employs convex optimization with l1 penalties to enforce sparsity and manage weak instruments in high-dimensional settings.
- Robust confidence sets and variable selection techniques ensure reliable inference even in complex models, as demonstrated in applications like the EASI demand system.
Instrumental variable regression is a central econometric and statistical method for inference on causal effects when explanatory variables are endogenous—i.e., correlated with the error term due to omitted variables, simultaneity, or measurement error. Modern variants address high-dimensional designs and weak identification by combining convex optimization, adaptive regularization, and identification-robust inference.
1. High-Dimensional Linear IV Modeling and Endogeneity
In high-dimensional IV regression, the number of regressors (potentially endogenous) and instruments can be comparable to or exceed the sample size $n$. Writing $Y$ for the outcome, $X \in \mathbb{R}^K$ for the regressors, and $U(\beta) = Y - X^\top \beta$ for the structural error, the model is
$Y = X^\top \beta + U(\beta),$
with the key IV moment restriction
$\mathbb{E}[Z\, U(\beta)] = 0,$
where $Z \in \mathbb{R}^L$ is the vector of instruments (possibly partially endogenous). The challenge is that only a sparse (or approximately sparse) subset of the entries of the high-dimensional parameter vector $\beta$ is nonzero, and proper identification must be maintained under many weak instruments and endogeneity.
2. The Self-Tuning Instrumental Variables (STIV) Estimator
To address high-dimensionality and endogeneity, the STIV estimator $(\widehat{\beta}, \widehat{\sigma})$ solves the convex program
$\min_{(\beta,\sigma)}\ \big\|\widehat{D}_X\, \beta_{S}\big\|_1 + c\,\sigma \quad \text{subject to} \quad \big\|\widehat{D}_Z\, \mathbb{E}_n[Z U(\beta)]\big\|_\infty \le r\,\sigma, \qquad \widehat{\sigma}(\beta) \le \sigma,$
where:
- $\widehat{D}_X$ and $\widehat{D}_Z$ are diagonal scaling matrices (inverse sample standard deviations of the regressors and instruments),
- $S$ is the set of regressors to be penalized for sparsity ($\beta_S$ keeps only those coordinates),
- $c > 0$ is a penalization constant,
- $r$ is a data-driven tuning parameter bounding the empirical moment violation,
- $\widehat{\sigma}(\beta)$ is an estimate of the error standard deviation at $\beta$.
A key feature is the $\ell_1$ penalty on a subset of the coefficients, enforcing sparsity, together with a constraint on the maximal empirical moment violation, rescaled by the estimated error standard deviation and the tuning parameter. The entire program is a linear program (or, with an alternative penalty, a conic program), solvable in polynomial time even with thousands of variables.
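The program is amenable to off-the-shelf solvers. Below is a heavily simplified, hedged sketch, not the authors' exact conic formulation: $\sigma$ is frozen at the previous iterate's residual scale, turning each step into a plain linear program in $(\beta, t)$, where the auxiliary variables $t$ bound $|\beta|$. All names and tuning values are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def stiv_lp(Y, X, Z, r=0.2, n_iter=5):
    """Sketch of an iterated-LP variant of the STIV program (illustrative):
    minimize ||D_X beta||_1 subject to ||D_Z E_n[Z U(beta)]||_inf <= r*sigma,
    with sigma frozen at the previous iterate's residual scale."""
    n, K = X.shape
    dX = 1.0 / X.std(axis=0)                      # diag of D_X (inverse sample std)
    DZ = np.diag(1.0 / Z.std(axis=0))             # D_Z scaling
    Psi = DZ @ (Z.T @ X) / n                      # scaled instrument-regressor moments
    b0 = DZ @ (Z.T @ Y) / n
    I = np.eye(K)
    cost = np.concatenate([np.zeros(K), dX])      # objective acts on slacks t ~ |beta|
    sigma, beta = Y.std(), np.zeros(K)
    for _ in range(n_iter):
        # Stack: beta <= t, -beta <= t, and both sides of the moment band.
        A_ub = np.block([[I, -I], [-I, -I],
                         [Psi, np.zeros_like(Psi)], [-Psi, np.zeros_like(Psi)]])
        b_ub = np.concatenate([np.zeros(2 * K), b0 + r * sigma, r * sigma - b0])
        res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * K + [(0, None)] * K)
        if res.status != 0:                        # infeasible: keep last iterate
            break
        beta = res.x[:K]
        sigma = np.sqrt(np.mean((Y - X @ beta) ** 2))   # update error scale
    return beta, sigma

# Toy data: sparse truth, endogeneity through shared first-stage noise.
rng = np.random.default_rng(0)
n, K, L = 1000, 10, 15
beta_true = np.zeros(K)
beta_true[:2] = [1.0, -2.0]
Z = rng.normal(size=(n, L))
V = rng.normal(size=(n, K))
X = Z[:, :K] + V
Y = X @ beta_true + 0.8 * V[:, 0] + rng.normal(size=n)
beta_hat, sigma_hat = stiv_lp(Y, X, Z)
```

As in Dantzig-selector-type estimators, the fitted coefficients are shrunk toward zero by roughly the width $r\,\sigma$ of the moment band, which is the price paid for not solving a first stage explicitly.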
3. Identification-Robust Inference and Sensitivity Analysis via Linear Programming
Classical confidence intervals for IV estimates suffer in high dimensions and under weak instruments. STIV constructs identification-robust confidence sets by test inversion, relying on the pivotal statistic
$\widehat{t}(b) = \frac{\big\|\widehat{D}_Z\, \mathbb{E}_n [Z U(b)]\big\|_\infty}{\widehat{\sigma}(b)},$
with the robust confidence set given by the sublevel set $\{b : \widehat{t}(b) \le r\}$.
Exact computation is infeasible in high dimension, so the approach convexifies the problem: it defines sensitivity constants of the form
$\widehat{\kappa}_{\ell} = \min_{\Delta \in \widehat{C}_S,\ \ell(\Delta) = 1} \big\| \widehat{\Psi}\, \Delta \big\|_\infty, \qquad \widehat{\Psi} = \widehat{D}_Z\, \mathbb{E}_n[Z X^\top]\, \widehat{D}_X,$
where $\widehat{C}_S$ is a cone encoding approximate sparsity. For any loss $\ell$, the estimation error is bounded as
$\ell\big(\widehat{D}_X^{-1}(\widehat{\beta} - \beta)\big) \le \frac{2 r\, \widehat{\sigma}}{\widehat{\kappa}_\ell},$
up to the relevant constants and an "inflation factor". The sensitivity parameters are themselves computed via linear programs, making the robust confidence sets computable in polynomial time, irrespective of the number of regressors $K$ or instruments $L$.
The overall robust confidence set is
$\big\{ b : \big\|\widehat{D}_Z\, \mathbb{E}_n[Z U(b)]\big\|_\infty \le r\, \widehat{\sigma}(b) \big\},$
ensuring validity even when instruments are many and weak.
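Membership in the sublevel-set region is cheap to check pointwise. A minimal sketch (helper names are illustrative; the scaling uses inverse sample standard deviations as in the text):

```python
import numpy as np

def pivotal_stat(b, Y, X, Z):
    """t_hat(b) = || D_Z E_n[Z U(b)] ||_inf / sigma_hat(b), as in the text."""
    n = len(Y)
    U = Y - X @ b                                # residual at candidate b
    DZ = 1.0 / Z.std(axis=0)                     # diagonal of D_Z
    return np.abs(DZ * (Z.T @ U / n)).max() / U.std()

def in_confidence_set(b, Y, X, Z, r):
    # b lies in the robust region iff its rescaled moment violation is <= r.
    return pivotal_stat(b, Y, X, Z) <= r

# Toy check: the true coefficient vector should pass; a distant one should not.
rng = np.random.default_rng(1)
n, K, L = 1000, 5, 8
beta_true = np.array([1.0, -2.0, 0.0, 0.0, 0.0])
Z = rng.normal(size=(n, L))
V = rng.normal(size=(n, K))
X = Z[:, :K] + V
Y = X @ beta_true + 0.8 * V[:, 0] + rng.normal(size=n)
```

Test inversion then amounts to collecting all candidates `b` that pass this check; the convexification via sensitivity constants is what makes that collection tractable without gridding over $\mathbb{R}^K$.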
4. Convergence Rates, Variable Selection, and Adaptation to Sparsity
The estimator's finite-sample error is nonasymptotically controlled: with high probability,
$\big\| \widehat{D}_X^{-1}(\widehat{\beta} - \beta) \big\|_1 \lesssim \frac{r\, \widehat{\sigma}\, s}{\widehat{\kappa}},$
so the error is determined by the chosen slack $r$, the noise level $\widehat{\sigma}$, the model sparsity $s$, and the instrument/design complexity via the sensitivity $\widehat{\kappa}$.
Support recovery ("exact variable selection") is obtained by thresholding: for each component $k$, the coefficient is retained only if
$|\widehat{\beta}_k| > \omega_k,$
where $\omega_k$ is a data-driven threshold proportional to $r\,\widehat{\sigma}/\widehat{\kappa}_k$. If the nonzero coefficients exceed these thresholds (a "beta-min" condition), their support is consistently identified.
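The rule itself is a one-liner once the thresholds are in hand; the sketch below assumes the $\omega_k$ have already been computed (e.g. from the sensitivity LPs) and uses hypothetical numbers:

```python
import numpy as np

def select_support(beta_hat, omega):
    """Keep coordinate k only if |beta_hat_k| exceeds its threshold omega_k."""
    beta_hat, omega = np.asarray(beta_hat), np.asarray(omega)
    return set(np.flatnonzero(np.abs(beta_hat) > omega))

# Illustration: coefficients above their (hypothetical) thresholds survive;
# the third coefficient (0.05) is indistinguishable from noise and is dropped.
support = select_support([0.9, -1.8, 0.05, 0.0], [0.25, 0.25, 0.25, 0.25])
print(support)
```

The beta-min condition is exactly the requirement that every truly nonzero coefficient clears its own $\omega_k$, so no signal coordinate is dropped.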
5. Empirical Application: The EASI Demand System
The EASI (Exact Affine Stone Index) demand system is a flexible, high-dimensional model for household expenditure shares using series expansions or polynomials (often with thousands of regressors). Challenges include endogeneity of high-order terms, economic restrictions (homogeneity, symmetry), and approximate sparsity.
STIV is particularly suited because:
- The LP (or conic) structure allows direct solution with thousands of variables.
- Tuning (choice of $r$, $c$, and penalties) is self-calibrating, depending only on the data.
- Confidence sets remain accurate under weak identification.
- Variable selection procedures efficiently identify relevant basis components.
- In application, second-order (quadratic) EASI approximation, estimated with STIV, significantly reduces demand estimation error compared to first-order or conventional two-stage methods. The resulting confidence bands for Engel curves are robust and informative, indicating, for instance, valid identification of peer effects and price elasticities.
6. Implementation and Computational Aspects
STIV’s convex formulation allows direct, efficient computation via off-the-shelf LP or conic programming solvers, even in very high-dimensional problems. Sensitivity analysis and construction of robust confidence sets reduce to a small collection of LPs per confidence region or variable selection problem.
Typical workflow:
- Standardize $X$ and $Z$ via the diagonal scaling matrices $\widehat{D}_X$, $\widehat{D}_Z$.
- Formulate and solve the STIV program for $\widehat{\beta}$ and $\widehat{\sigma}$.
- Solve linear programs for sensitivity constants as needed for confidence sets or error bounds.
- Apply thresholding rules for variable selection and support recovery.
- Construct robust, identification-valid confidence bands via the convexified region.
The approach is self-tuning: critical parameters like $r$ are set using moderate deviation bounds, making the procedure largely automatic.
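One illustrative moderate-deviation style rule for the slack (a Gaussian quantile with a union bound over the $L$ moment conditions; the papers' exact constants may differ) is:

```python
from math import sqrt
from scipy.stats import norm

def tuning_r(n, L, alpha=0.05):
    """Illustrative choice of r: Gaussian quantile with a union bound over
    the L instrument moments, shrinking at the 1/sqrt(n) rate."""
    return norm.ppf(1 - alpha / (2 * L)) / sqrt(n)

# More instruments (larger L) or a tighter level alpha force a larger r;
# more data (larger n) allows a smaller one.
print(tuning_r(n=500, L=60))
```

Because the rule depends only on $(n, L, \alpha)$ and not on unknown population quantities, it requires no cross-validation, which is what makes the procedure "self-tuning" in practice.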
7. Summary of Theoretical and Practical Contributions
The STIV framework enables inference in linear IV models with many endogenous regressors, controlling for weak identification, high-dimensionality, and approximate model sparsity. Key advances are:
- Convex (LP/conic) penalized estimation with self-normalizing moment constraints.
- Robust confidence sets via LP-based computation, valid uniformly over sizable identification regions.
- Nonasymptotic convergence rates scaling with sparsity and instrument strength.
- Data-adaptive variable selection with explicit beta-min thresholds.
- Empirical validation on complex structural systems where both model approximation and endogeneity are severe.
This methodology offers a statistically principled and computationally tractable approach for modern econometric inference in settings with large-scale, endogenous, and potentially weakly-identified systems (Gautier et al., 2011).